I did an interesting experiment.
I created a python program that generates a random tree.
I search this tree using PVS but at predicted CUT nodes I do a "fake LMR search" (no LMR at predicted ALL nodes). The fake LMR "search" just generates a random number.
The result of the thus perturbed PVS search is the same as the unperturbed one!
I think this is additional evidence for the fact that you can do very aggressive LMR at predicted CUT nodes but not at predicted ALL nodes. I think this has been more or less proved here but some people do not believe it.
Here is the python program in case you want to repeat the experiment
http://hardy.uhasselt.be/Toga/random_LM ... T_nodes.py
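For readers who do not want to fetch the script, here is a minimal reconstruction of the experiment (my own sketch, not the linked program; class names and tree parameters are illustrative): build a random uniform tree, search it with plain PVS and with PVS where the reduced search at predicted CUT nodes is replaced by a random number, and compare the root values.

```python
import random

INF = 10**9
PV, CUT, ALL = 0, 1, 2

class Node:
    """A random uniform game tree: interior nodes have `width` children,
    leaves carry random integer scores."""
    def __init__(self, depth, width, rng):
        self.depth = depth
        if depth == 0:
            self.value = rng.randint(-100, 100)
        else:
            self.children = [Node(depth - 1, width, rng) for _ in range(width)]

def negamax(node):
    """Ground truth: full-width negamax without any pruning."""
    if node.depth == 0:
        return node.value
    return max(-negamax(c) for c in node.children)

def pvs(node, alpha, beta, nodetype, fake_lmr, rng):
    """Fail-soft PVS with node-type prediction. With fake_lmr set, the
    reduced search at predicted CUT nodes is replaced by a random number;
    an apparent fail high is then verified by a real full-depth search."""
    if node.depth == 0:
        return node.value
    best = -INF
    for i, child in enumerate(node.children):
        if nodetype == PV:
            child_type = PV if i == 0 else CUT
        else:
            child_type = ALL if nodetype == CUT else CUT
        if nodetype == PV and i > 0:
            # null-window scout; its raw result is never used to raise alpha
            s = -pvs(child, -alpha - 1, -alpha, CUT, fake_lmr, rng)
            if s > alpha:
                # scout failed high (possibly on a junk score):
                # the full-window re-search repairs any bad CUT-node result
                s = -pvs(child, -beta, -alpha, PV, fake_lmr, rng)
        elif fake_lmr and nodetype == CUT:
            s = rng.randint(-100, 100)  # fake LMR "search": just a random number
            if s > alpha:
                # apparent fail high: verify with a real search
                s = -pvs(child, -beta, -alpha, child_type, fake_lmr, rng)
        else:
            s = -pvs(child, -beta, -alpha, child_type, fake_lmr, rng)
        best = max(best, s)
        if best >= beta:
            return best
        alpha = max(alpha, best)
    return best

results = []
for seed in range(30):
    root = Node(5, 3, random.Random(seed))
    exact = negamax(root)
    plain = pvs(root, -INF, INF, PV, False, random.Random(0))
    perturbed = pvs(root, -INF, INF, PV, True, random.Random(seed + 1))
    results.append(exact == plain == perturbed)
print(all(results))
```

The two points that make this work: an apparent fail high at a CUT node is always verified by a real search, and the scout result is never used to raise alpha at a PV node. A wrong fail low at a CUT node then surfaces as a scout fail high at the nearest PV ancestor and is repaired by the full-window re-search.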
LMR at CUT nodes can be arbitrarily bad!
-
- Posts: 6401
- Joined: Thu Mar 09, 2006 8:30 pm
- Location: Chicago, Illinois, USA
Re: LMR at CUT nodes can be arbitrarily bad!
Michel wrote:
I did an interesting experiment. I created a python program that generates a random tree. I search this tree using PVS but at predicted CUT nodes I do a "fake LMR search" (no LMR at predicted ALL nodes). The fake LMR "search" just generates a random number. The result of the thus perturbed PVS search is the same as the unperturbed one! I think this is additional evidence for the fact that you can do very aggressive LMR at predicted CUT nodes but not at predicted ALL nodes. I think this has been more or less proved here but some people do not believe it. Here is the python program in case you want to repeat the experiment:
http://hardy.uhasselt.be/Toga/random_LM ... T_nodes.py

Just to clarify, you mean "LMR at CUT nodes can be arbitrarily bad without hurting". Right?
Another way to look at your experiment is that you randomly determined whether a given move in a CUT node will be re-searched without reduction or completely skipped (random number >= beta or < beta). At the end, if all of them were skipped, does the previous node re-search the whole thing or not? It seems it won't, since you did not reduce at ALL nodes.
Miguel
-
- Posts: 2272
- Joined: Mon Sep 29, 2008 1:50 am
Re: LMR at CUT nodes can be arbitrarily bad!
Miguel wrote:
Just to clarify, you mean "LMR at CUT nodes can be arbitrarily bad without hurting". Right?

Yes. I mean without changing the score computed for the tree. Of course it will hurt time-wise through the re-searches that are triggered.

Miguel wrote:
Another way to look at your experiment is that you randomly determined whether a given move in a CUT node will be re-searched without reduction or completely skipped. At the end, if all of them were skipped, does the previous node re-search the whole thing or not?

If the fake LMR score causes a fail high then it is immediately re-searched, as in standard LMR.
If the node fails low and the score influences the value of the tree then the node will be ultimately researched by the PVS mechanism (which changes the CUT node into an ALL node).
-
- Posts: 4185
- Joined: Tue Mar 14, 2006 11:34 am
- Location: Ethiopia
Re: LMR at CUT nodes can be arbitrarily bad!
Michel, like Miguel I am not sure about the object or conclusion of the test, so a detailed explanation would be appreciated. I just looked at your code, but I will give my opinions once I understand the code and your explanations.
One thing that came to my mind immediately is that minimaxing on a random tree converges to a certain value P* as the tree goes deeper. Check Pearl, 1981. So I wonder if that is playing a role here too.
The other thing is: have you done the same at ALL nodes only? The only way to prove it would be to show the same thing doesn't work when applied at ALL nodes only.
Like I said, I just saw your post, but I am sure you would not have gone to this trouble for nothing.
Daniel
-
- Posts: 2272
- Joined: Mon Sep 29, 2008 1:50 am
Re: LMR at CUT nodes can be arbitrarily bad!
Daniel wrote:
The other thing is: have you done the same at ALL nodes only? The only way to prove it would be to show the same thing doesn't work when applied at ALL nodes only.

Yes, I tried it. It does not work with ALL nodes.

Daniel wrote:
One thing that came to my mind immediately is that minimaxing on a random tree converges to a certain value P* as the tree goes deeper. Check Pearl, 1981. So I wonder if that is playing a role here too.

I did not know this result, but I observed it while testing. However, I don't think it plays a role here, since I am not discussing an asymptotic result.

Let me point out one important thing though. To make PVS immune to bad LMR at CUT nodes, one should NOT use the result of the scout search to raise alpha at PV nodes. But it is well known that doing so increases search instability, so it is usually not done anyway.
This is commented on in the python script.
-
- Posts: 2272
- Joined: Mon Sep 29, 2008 1:50 am
Re: LMR at CUT nodes can be arbitrarily bad!
Silent update of script at same URL.
The LMR "research" was done with the wrong bounds.
The result seems to remain the same.
-
- Posts: 4185
- Joined: Tue Mar 14, 2006 11:34 am
- Location: Ethiopia
Re: LMR at CUT nodes can be arbitrarily bad!
Michel wrote:
Yes, I tried it. It does not work with ALL nodes. I did not know this result, but I observed it while testing. However, I don't think it plays a role here, since I am not discussing an asymptotic result. Let me point out one important thing though. To make PVS immune to bad LMR at CUT nodes, one should NOT use the result of the scout search to raise alpha at PV nodes. But it is well known that doing so increases search instability, so it is usually not done anyway.

Seems legit, except for that last point of not setting alpha=s: with alpha=s it breaks down at CUT nodes too, like it does for LMR at ALL nodes. If you do LMR at both node types at the same time, the score will be different anyway, so there would be nothing to prove.
I just realized the LMR is not recursive. Would that have any effect, since right now re-searches of reduced depth are done only once?
-
- Posts: 4185
- Joined: Tue Mar 14, 2006 11:34 am
- Location: Ethiopia
Re: LMR at CUT nodes can be arbitrarily bad!
I believe this is a peculiarity of PVS, because with the modified alpha-beta it fails at CUT nodes too. This is pretty much like the scenario I encountered some time ago, where using pv[N] is enough to store the PV instead of pv[N][N]. The reason we needed the latter was search instability. So I guess this will be affected the same way when you add a transposition table, null move, etc. I am not so sure anymore that it is the node types.
Modified alpha-beta:
Code:
def alphabeta(self, alpha, beta, nodetype):
    global count
    count += 1
    if self.d == 0:
        return self.value
    for i in range(self.w):
        # predict the child's node type from our own
        if nodetype == PV and i == 0:
            nodetype_child = PV
        elif nodetype == PV or nodetype == ALL:
            nodetype_child = CUT
        else:
            nodetype_child = ALL
        if nodetype == CUT:
            s = rand()  # fake result of the LMR search
            if s > alpha:
                # apparent fail high: verify with a real search
                s = -self.tree[i].alphabeta(-beta, -alpha, nodetype_child)
        else:
            s = -self.tree[i].alphabeta(-beta, -alpha, nodetype_child)
        if s >= beta:
            return s
        elif s > alpha:
            alpha = s
    return alpha
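To see why plain alpha-beta is not immune, one can run the modified alpha-beta with and without the fake CUT-node scores on many random trees and count disagreements. This is a hypothetical harness with my own names, not the thread's script:

```python
import random

INF = 10**9
PV, CUT, ALL = 0, 1, 2

def make_tree(depth, width, rng):
    """Random uniform tree: a leaf is an int, an interior node a list."""
    if depth == 0:
        return rng.randint(-100, 100)
    return [make_tree(depth - 1, width, rng) for _ in range(width)]

def alphabeta(tree, alpha, beta, nodetype, fake, rng):
    """The modified alpha-beta, with a flag to switch the fake
    CUT-node scores on and off."""
    if not isinstance(tree, list):
        return tree
    for i, sub in enumerate(tree):
        if nodetype == PV and i == 0:
            child = PV
        elif nodetype == PV or nodetype == ALL:
            child = CUT
        else:
            child = ALL
        if fake and nodetype == CUT:
            s = rng.randint(-100, 100)  # fake result of the LMR search
            if s > alpha:
                s = -alphabeta(sub, -beta, -alpha, child, fake, rng)
        else:
            s = -alphabeta(sub, -beta, -alpha, child, fake, rng)
        if s >= beta:
            return s
        if s > alpha:
            alpha = s
    return alpha

mismatches = 0
for seed in range(50):
    tree = make_tree(5, 3, random.Random(seed))
    exact = alphabeta(tree, -INF, INF, PV, False, random.Random(0))
    noisy = alphabeta(tree, -INF, INF, PV, True, random.Random(seed + 1))
    mismatches += (exact != noisy)
print(mismatches)  # expected to be nonzero: plain alpha-beta is not immune
```

Without the PVS scout/re-search mechanism, a junk fail low at a CUT node simply skips a move and is never repaired, so the root value drifts from the exact one.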
-
- Posts: 2272
- Joined: Mon Sep 29, 2008 1:50 am
Re: LMR at CUT nodes can be arbitrarily bad!
Daniel wrote:
I believe this is a peculiarity of PVS because with the modified alpha-beta it fails at CUT nodes too.

Yes, definitely. It is a property of PVS. It does not work for alpha-beta: alpha-beta does not have the required re-searches.

Daniel wrote:
So I guess this will be affected the same way when you add transposition table, null move etc.

For the transposition table you have to add a flag which gives the node type corresponding to a score. You cannot use an upper bound from a CUT node at an ALL node, and similarly you cannot use a lower bound from an ALL node at a CUT node (this idea comes from IPPOLIT).

Null move is just a special type of move, so I do not think it is relevant.
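The TT rule described above can be sketched as a small predicate (my own illustrative names, not IPPOLIT's actual code; the PV case is omitted for brevity): tag each stored bound with the node type that produced it, and reject the two untrustworthy combinations.

```python
# node types and bound types; names are illustrative
CUT, ALL = 0, 1
LOWER, UPPER = 0, 1

def tt_usable(bound_type, entry_nodetype, probe_nodetype):
    """May a stored bound be trusted at a node with this predicted type?"""
    if bound_type == UPPER and entry_nodetype == CUT and probe_nodetype == ALL:
        return False  # a CUT-node upper bound may rest on skipped (junk) moves
    if bound_type == LOWER and entry_nodetype == ALL and probe_nodetype == CUT:
        return False  # an ALL-node lower bound may be an unverified fail high
    return True

print(tt_usable(UPPER, CUT, ALL), tt_usable(LOWER, CUT, ALL))  # False True
```

The intuition matches the experiment: CUT-node fail highs and ALL-node fail lows are verified, so only their opposites need to be quarantined.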
-
- Posts: 4185
- Joined: Tue Mar 14, 2006 11:34 am
- Location: Ethiopia
Re: LMR at CUT nodes can be arbitrarily bad!
Michel wrote:
Yes, definitely. It is a property of PVS. It does not work for alpha-beta: alpha-beta does not have the required re-searches.

I mean any search instability which gives different results when the position is searched again. So those hacks in the TT are not enough, and null move did show differences between the use of pv[N] and pv[N][N], so it does bring instabilities. Also, to make things work at an ALL node, we can re-search when it fails low instead of when it fails high, like below. In fact, now that I think about it, we should do re-searches for LMR when we get a result that matches the expected node type.
Code:
expected CUT node -> research when it fails HIGH
expected ALL node -> research when it fails LOW
Code:

if nodetype == ALL:
    s = rand()  # fake result of the LMR search
    if s <= alpha:  # it was "s > alpha" before
        s = -self.tree[i].pvs_LMR(-beta, -alpha, nodetype_child)