Extended Null-Move Reductions

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

jwes
Posts: 778
Joined: Sat Jul 01, 2006 7:11 am

Re: Many thanks for the testing (NT) another result.

Post by jwes »

bob wrote:
Don wrote: When you say not allowing consecutive null moves do you mean first for white, then for black?
I mean two consecutive plies.
This gives me an idea. If you did verified null move only when this is the second consecutive null move for this side, it would greatly reduce the cost of verification and should still catch most zugzwangs.
Don
Posts: 5106
Joined: Tue Apr 29, 2008 4:27 pm

Re: Many thanks for the testing (NT) another result.

Post by Don »

jwes wrote:
bob wrote:
Don wrote: When you say not allowing consecutive null moves do you mean first for white, then for black?
I mean two consecutive plies.
This gives me an idea. If you did verified null move only when this is the second consecutive null move for this side, it would greatly reduce the cost of verification and should still catch most zugzwangs.
Such a thing would not work in Komodo but might in other programs. Komodo does not do null move if the score is below beta, so it can never do consecutive null moves.
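As an illustration of why such a gate rules out back-to-back nulls, here is a minimal sketch, assuming the common "static eval at least beta" form of the condition rather than Komodo's exact rule; Position, evaluate(), R and so on are placeholder names, not real code from any engine:

// Hedged sketch, not Komodo's actual code.  Assumptions: the gate is
// "static eval at least beta", the null-move search uses a zero window
// around beta, and the evaluation is perfectly symmetric, i.e. the child's
// evaluate() equals minus the parent's evaluate().
int search(Position &pos, int alpha, int beta, int depth) {
    int staticEval = evaluate(pos);   // from the side to move's point of view

    if (!pos.inCheck() && depth >= 2 && staticEval >= beta) {
        pos.makeNullMove();
        // Zero-window null search: the child node's own beta is -beta + 1.
        int score = -search(pos, -beta, -beta + 1, depth - 1 - R);
        pos.unmakeNullMove();
        if (score >= beta)
            return score;             // null-move cutoff
    }
    // Why the gate cannot pass at two consecutive plies (symmetric eval):
    //   parent:  staticEval >= beta
    //   child:  -staticEval >= -beta + 1   i.e.   staticEval <= beta - 1
    // The two conditions contradict each other, so the child never tries a
    // null move directly after the parent's null move.
    // ... normal move loop follows ...
}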

My experience has been that in most cases, if something doesn't work, limited versions of it will not work either. For example, an extension often makes the program stronger at fixed depth, but it slows the search down and in the end makes the program slightly weaker. Doing fewer of them will probably just split the difference, still giving you a program that is slightly weaker. You have to find a good tradeoff: if you can do fewer of these extensions by being a bit smarter about which ones you do, then you might realize a gain.

But I really like the idea of "extended null-move reductions" as presented in the paper. If it cannot be demonstrated to be an actual improvement, though ...
michiguel
Posts: 6401
Joined: Thu Mar 09, 2006 8:30 pm
Location: Chicago, Illinois, USA

Re: Many thanks for the testing (NT) another result.

Post by michiguel »

Don wrote:
jwes wrote:
bob wrote:
Don wrote: When you say not allowing consecutive null moves do you mean first for white, then for black?
I mean two consecutive plies.
This gives me an idea. If you did verified null move only when this is the second consecutive null move for this side, it would greatly reduce the cost of verification and should still catch most zugzwangs.
Such a thing would not work in Komodo but might in other programs. Komodo does not do null move if the score is below beta, so it can never do consecutive null moves.
But you have a side-to-move bonus, IIRC, right? With that, it is possible to do two consecutive null moves unless you take that bonus into account when you compare the score against beta.
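A quick worked example of this, with invented numbers and assuming the gate is "static eval at least beta" plus a tempo bonus for the side to move:

// Toy numbers only, to illustrate the point: with a tempo bonus the eval is
// no longer an exact negation across a null move, so an "eval >= beta" gate
// (assumed form) can pass at two consecutive plies.
#include <cassert>

int main() {
    const int tempo = 10;   // side-to-move bonus in centipawns (invented value)
    const int raw   = 5;    // symmetric part of the eval, parent's point of view
    const int beta  = 0;    // the null search uses the zero window (beta-1, beta)

    int parentEval = raw + tempo;   // +15 : parent's gate "eval >= beta" passes
    int childBeta  = -beta + 1;     // +1  : child's beta after the null move
    int childEval  = -raw + tempo;  // +5  : not simply -parentEval

    assert(parentEval >= beta);        // parent tries a null move...
    assert(childEval  >= childBeta);   // ...and so does the child, immediately after
    return 0;                          // with tempo = 0, the second assert would fail
}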

Miguel

Don
Posts: 5106
Joined: Tue Apr 29, 2008 4:27 pm

Re: Many thanks for the testing (NT) another result.

Post by Don »

michiguel wrote:
Don wrote:
jwes wrote:
bob wrote:
Don wrote: When you say not allowing consecutive null moves do you mean first for white, then for black?
I mean two consecutive plies.
This gives me an idea. If you did verified null move only when this is the second consecutive null move for this side, it would greatly reduce the cost of verification and should still catch most zugzwangs.
Such a thing would not work in Komodo but might in other programs. Komodo does not do null move if the score is below beta, so it can never do consecutive null moves.
But you have a side-to-move bonus, IIRC, right? With that, it is possible to do two consecutive null moves unless you take that bonus into account when you compare the score against beta.

Miguel
Yes, you are right, I do have a side-to-move bonus, so I guess in some cases I am doing multiple null moves.

bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: Many thanks for the testing (NT) another result.

Post by bob »

jwes wrote:
bob wrote:
Don wrote: When you say not allowing consecutive null moves do you mean first for white, then for black?
I mean two consecutive plies.
This gives me an idea. If you did verified null move only when this is the second consecutive null move for this side, it would greatly reduce the cost of verification and should still catch most zugzwangs.
I don't see how. The entire idea behind doing the second consecutive null-move search is to catch the cases where zugzwang is an issue. The second null-move is either pure overhead, because it fails low when the original null-move really should fail high, or it fails high when the first should not have, which is what would certainly happen in many zugzwang-type positions. Doing a verification search on top of all that would seem to do nothing more than add more search overhead.
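For readers following along, the double-null scheme being described looks roughly like this; it is only a sketch, not any particular engine's code, and Position, qsearch(), R and so on are placeholder names:

// A null move is allowed even directly after the opponent's null; only a
// third consecutive null is refused.  When both sides pass, the same position
// is searched again at reduced depth with the obligation to move restored, so
// in a zugzwang the second null fails high for the opponent and the first
// null's cutoff evaporates.
int search(Position &pos, int alpha, int beta, int depth, int nullsInARow) {
    if (depth <= 0)
        return qsearch(pos, alpha, beta);

    if (!pos.inCheck() && depth >= 2 && nullsInARow < 2) {
        pos.makeNullMove();
        int score = -search(pos, -beta, -beta + 1, depth - 1 - R, nullsInARow + 1);
        pos.unmakeNullMove();
        if (score >= beta)
            return score;   // trusted without a separate verification search
    }

    // ... normal move loop; real moves are searched with nullsInARow reset to 0 ...
}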
jwes
Posts: 778
Joined: Sat Jul 01, 2006 7:11 am

Re: Many thanks for the testing (NT) another result.

Post by jwes »

bob wrote:
jwes wrote:
bob wrote:
Don wrote: When you say not allowing consecutive null moves do you mean first for white, then for black?
I mean two consecutive plies.
This gives me an idea. If you did verified null move only when this is the second consecutive null move for this side, it would greatly reduce the cost of verification and should still catch most zugzwangs.
I don't see how. The entire idea behind doing the second consecutive null-move search is to catch the cases where zugzwang is an issue. The second null-move is either pure overhead, because it fails low when the original null-move really should fail high, or it fails high when the first should not have, which is what would certainly happen in many zugzwang-type positions. Doing a verification search on top of all that would seem to do nothing more than add more search overhead.
I guess I am not explaining myself very well. I mean positions where one side moves three times in a row, that is, two null moves by the other side, e.g. from the opening position:
e4, null, d4, null, verify the null (if somehow white failed low on the second null move).
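A sketch of that scheme as described (lastMoveBy(), okToNull(), NULL_MOVE and the rest are invented plumbing, not taken from any real engine): the verification search is run only when the null move that failed high is the side's second null in a row.

int search(Position &pos, int alpha, int beta, int depth) {
    if (!pos.inCheck() && depth >= 2 && okToNull(pos)) {
        // Was the side to move's previous move in this line also a null move?
        bool secondNullForUs = (pos.lastMoveBy(pos.sideToMove()) == NULL_MOVE);

        pos.makeNullMove();
        int score = -search(pos, -beta, -beta + 1, depth - 1 - R);
        pos.unmakeNullMove();

        if (score >= beta) {
            if (!secondNullForUs)
                return score;            // first, isolated null: trust the cutoff
            // Second consecutive null for this side: confirm the cutoff with a
            // reduced-depth real search before returning it, since this is the
            // pattern where zugzwang trouble is expected to show up.
            int verified = search(pos, beta - 1, beta, depth - 1 - R);
            if (verified >= beta)
                return verified;
        }
    }
    // ... normal move loop ...
}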
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: Many thanks for the testing (NT) another result.

Post by bob »

jwes wrote:
bob wrote:
jwes wrote:
bob wrote:
Don wrote: When you say not allowing consecutive null moves do you mean first for white, then for black?
I mean two consecutive plies.
This gives me an idea. If you did verified null move only when this is the second consecutive null move for this side, it would greatly reduce the cost of verification and should still catch most zugzwangs.
I don't see how. The entire idea behind doing the second consecutive null-move search is to catch the cases where zugzwang is an issue. The second null-move is either pure overhead, because it fails low when the original null-move really should fail high, or it fails high when the first should not have, which is what would certainly happen in many zugzwang-type positions. Doing a verification search on top of all that would seem to do nothing more than add more search overhead.
I guess I am not explaining myself very well. I mean positions where one side moves three times in a row, that is, two null moves by the other side, e.g. from the opening position:
e4, null, d4, null, verify the null (if somehow white failed low on the second null move).
That is not quite what I thought. I thought you meant <null> <null>, and if the second <null> fails high, do a verification. Or you could have meant that if the first <null> fails high, do a verification, but only after the second null.

I would not do both. The second null is supposed to be a solution to zugzwang for the first null-move, as is the verification search whenever the first fails high by itself. Doing both seems to be wasting search effort, even though it is not a huge amount of effort.

The case originally mentioned is the double-null case of <null> <null> at two consecutive plies (Vincent's idea). I have tried it, and it tested a couple of Elo weaker than not allowing two consecutive nulls. I also tested various incantations of the verification search (only near the root, everywhere, etc.) with no good effect. I tried allowing nulls in KP endings (no pieces) per your suggestion, but again, it was slightly worse, though not by much. Then again, allowing nulls everywhere is not much worse than disallowing consecutive nulls, and your approach simply restricts that a bit, which means less loss. But not zero loss, either.
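For reference, the KP-ending restriction being relaxed there usually amounts to a simple material test on the side to move, something like this sketch (helper names invented, not Crafty's actual code):

// Refuse the null move whenever the side to move has only king and pawns,
// since those are the endings where zugzwang most often makes the null-move
// assumption unsound.
bool nullMoveAllowed(const Position &pos) {
    return !pos.inCheck()
        && pos.nonPawnMaterial(pos.sideToMove()) > 0;  // at least one piece left
}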