
Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 10:21 am
by micron
I just implemented null-move pruning in Spandrel. Still glowing with newbie delight at the speedup (after tuning, a nominal 10-ply search from the start went from 7.5 s to 1.0 s), I have two questions:

[1] Apparently a TT probe can return advice that a null-move is not worth attempting. How does that work?

[2] When the null move produces a cutoff, it is tempting to hash the node/score in the TT. But doing this gives me a slowdown of 2-3x, with many more TT misses. Is it a Bad Idea to hash these nodes, or am I doing something wrong? The score is saved as a LOWER_BOUND.
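In outline, the scheme in question looks roughly like this (a generic sketch rather than Spandrel's actual code; all identifiers are illustrative):

#define R 2                        /* null-move depth reduction */
#define LOWER_BOUND 1              /* TT bound type             */

extern int  in_check(void);
extern void make_null(void);
extern void unmake_null(void);
extern int  search(int depth, int alpha, int beta);
extern void tt_store(int depth, int score, int bound);

/* Returns 1 if the null move fails high (the caller then returns beta). */
int try_null_move(int depth, int beta)
{
    if (depth <= R || in_check())
        return 0;                  /* usual conditions for skipping the null move */

    make_null();                   /* pass the move to the opponent */
    int score = -search(depth - R - 1, -beta, -beta + 1);   /* reduced null-window search */
    unmake_null();

    if (score >= beta) {
        /* The step in question: store the fail-high as a lower bound.
           Which draft to store (depth here, or depth - R - 1) is one of
           the details that differ between implementations.              */
        tt_store(depth, score, LOWER_BOUND);
        return 1;
    }
    return 0;
}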

Robert P.

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 10:52 am
by Mincho Georgiev
Hi Robert!

[1] "advice that null move is not worth attempting" is not the correct interpretation, it's more likely the stored indicator to point that null move is absolutely forbidden instead, since the previous move was null move, and you are about to make 2 nulls in a row, which is undesirable at least.

[2] There are a lot of different implementations on that subject. I don't store it, since there is no gain in my implementation; others do. Regardless, a 2-3x slowdown is absolutely huge, and probably something is wrong indeed.

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 11:59 am
by Teemu Pudas
[1] if (tt_depth >= depth - R - 1 && tt_score < beta && tt_type & UPPER) skip_null(); // If a plain reduced search without nullmove won't produce a cutoff...

[2] You have a bug.

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 12:35 pm
by micron
Mincho Georgiev wrote: [1] "advice that a null move is not worth attempting" is not the correct interpretation. More likely it is a stored indicator that a null move is forbidden here, because the previous move was a null move and you would be making 2 nulls in a row, which is undesirable, to say the least.

[2] There are a lot of different implementations on that subject. I don't store it, since there is no gain in my implementation; others do. Regardless, a 2-3x slowdown is absolutely huge, and probably something is wrong indeed.
[1] I specifically allow 2 (but of course not 3) nulls in a row (e.g. white at ply n, then black at ply n+1), because it proved faster. How is this "undesirable, to say the least"?
[2] I find it all too easy to believe that something is "wrong indeed". Thanks, Mincho.
Robert P.

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 1:00 pm
by micron
Teemu Pudas wrote:[1] if (tt_depth >= depth - R - 1 && tt_score < beta && tt_type & UPPER) skip_null(); // If a plain reduced search without nullmove won't produce a cutoff...

[2] You have a bug.
[1] Ah, thank you, that seems to predict reliably that the null move won't cut off and therefore should not be attempted. In my program, though, it turns out to save a negligible number of attempts.
Without skip_null: 45366 null moves, 36240 cutoffs (79%)
With skip_null: 45304 null moves, 36240 cutoffs (79%)

[2] Good. If I can fix the bug in the TT save, null-move pruning should produce even better results...

Robert P.

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 1:05 pm
by michiguel
micron wrote:I just implemented null-move pruning in Spandrel. Still glowing with newbie delight at the speedup (after tuning, a nominal 10-ply search from the start went from 7.5 s to 1.0 s), I have two questions:

[1] Apparently a TT probe can return advice that a null-move is not worth attempting. How does that work?

[2] When the null move produces a cutoff, it is tempting to hash the node/score in the TT. But doing this gives me a slowdown of 2-3x, with many more TT misses. Is it a Bad Idea to hash these nodes, or am I doing something wrong? The score is saved as a LOWER_BOUND.

Robert P.
Spandrel is a gould name!

Miguel
PS: architect or evolutionary biologist?

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 1:11 pm
by Mincho Georgiev
micron wrote:
Mincho Georgiev wrote: [1] "advice that a null move is not worth attempting" is not the correct interpretation. More likely it is a stored indicator that a null move is forbidden here, because the previous move was a null move and you would be making 2 nulls in a row, which is undesirable, to say the least.

[2] There are a lot of different implementations on that subject. I don't store it, since there is no gain in my implementation; others do. Regardless, a 2-3x slowdown is absolutely huge, and probably something is wrong indeed.
[1] I specifically allow 2 (but of course not 3) nulls in a row (e.g. white at ply n, then black at ply n+1), because it proved faster. How is this "undesirable, to say the least"?
[2] I find it all too easy to believe that something is "wrong indeed". Thanks, Mincho.
Robert P.

Recursive null-move pruning is a widespread scheme. It means that more than one null move is allowed within a single branch (line) of the search. However, that is quite different from making 2 or 3 null moves in a row. The latter brings a huge risk of tactical blunders: it may cause the search to assume that a position is good and cut off, when in fact it could be very bad.
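The usual restriction looks something like this (a generic sketch, not Pawny's actual code; identifiers are illustrative):

/* Null moves may recur deeper in the branch, but never on two consecutive
   plies: the child of a null move is searched with last_was_null = 1, so
   it cannot try a null move itself.                                       */
int null_move_allowed(int depth, int side_in_check, int last_was_null)
{
    if (side_in_check)  return 0;   /* never null-move while in check    */
    if (last_was_null)  return 0;   /* forbid two null moves in a row    */
    return depth > 2;               /* typical minimum-depth requirement */
}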

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 1:43 pm
by hgm
Mincho Georgiev wrote:Recursive null-move pruning is a widespread scheme. It means that more than one null move is allowed within a single branch (line) of the search. However, that is quite different from making 2 or 3 null moves in a row. The latter brings a huge risk of tactical blunders: it may cause the search to assume that a position is good and cut off, when in fact it could be very bad.
Why is that? I don't pay any attention to the number of null moves in a row in micro-Max, and I have never found any adverse effects from it. I would say that at the very worst it could be a (small) waste of time. Two null moves in a row cannot both fail high. For the first to fail high, it is necessary that the second fails low, and then it is as if you had not searched it at all, since a low-failing null move does not affect the search or the score. You only prune when it fails high. And if the second null move fails high, making the first fail low where it otherwise would have failed high, it merely means that you are in zugzwang, and that the search after the two null moves acted as a verification search that detected this (despite its heavy reduction), so that you don't trust the null move and do a normal search instead.

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 5:20 pm
by bob
micron wrote:I just implemented null-move pruning in Spandrel. Still glowing with newbie delight at the speedup (after tuning, a nominal 10-ply search from the start went from 7.5 s to 1.0 s), I have two questions:

[1] Apparently a TT probe can return advice that a null-move is not worth attempting. How does that work?

[2] When the null move produces a cutoff, it is tempting to hash the node/score in the TT. But doing this gives me a slowdown of 2-3x, with many more TT misses. Is it a Bad Idea to hash these nodes, or am I doing something wrong? The score is saved as a LOWER_BOUND.

Robert P.
The idea is that the hash table can show that a null-move won't fail high. Works like this:

Your current remaining depth is N; let's choose 8 here. Your current alpha = 0.20 and beta = 0.21 (you choose the values; the exact numbers don't matter here).

You do a hash probe to see if you can terminate the search. You get a hit, but the draft (remaining depth of the hash entry) is only 7, so you can't use that result to end the search. But if you look at the table result and discover that it suggests a normal search here will not fail high (the entry is of type UPPER, or EXACT, with a table value < beta), you can make a pretty good guess that a null-move search will also not fail high, since doing nothing is worse than playing a real move, and the best real move you could find did not fail high when you searched it to store this hash entry. All you need to test is that (a) the table draft is at least depth - R - 1 (where R is the normal null-move depth reduction) and (b) the table value indicates that the score is < beta.

All you do is avoid trying null moves when they will not help; it is a very slight performance improvement.
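In code, the check looks something like this (a sketch with illustrative names and a simple TT-entry layout, not taken from any particular engine):

#define R      2                  /* null-move depth reduction                   */
#define UPPER  1                  /* bound flags for a TT entry                  */
#define EXACT  2

typedef struct {
    int depth;                    /* draft: remaining depth of the stored search */
    int score;
    int flags;                    /* UPPER, EXACT, ... (bit flags)               */
} hash_entry;

extern hash_entry *tt_probe(void);    /* NULL if no hit for this position        */

/* Returns 1 if the stored entry already shows the score will stay below
   beta at the reduced depth, so the null move is not worth trying.        */
int tt_says_skip_null(int depth, int beta)
{
    hash_entry *e = tt_probe();
    if (e == NULL)
        return 0;
    if (e->depth < depth - R - 1)     /* draft must cover the null-move search */
        return 0;
    return (e->flags & (UPPER | EXACT)) && e->score < beta;
}

This is essentially the condition in Teemu's one-liner above, with the EXACT case included as well.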

Re: Null-move pruning and the hash table

Posted: Sun Mar 28, 2010 5:59 pm
by Mincho Georgiev
I guess I've misunderstood the subject. We are talking about two different things here; sorry if I misled you, Robert P.
What I pointed out was the null-move indicator that I use, not the method Bob Hyatt just described.
H.G., what you are saying sounds reasonable, but I still don't understand how that could act as verification. So far I have only tried verified null-move pruning, where after a cutoff the search continues with reduced depth (sketched below).
In the meantime, I've decided to run a test with Pawny, with and without null moves in a row. I will post the results later.
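For reference, verified null-move pruning in simplified form looks roughly like this (a generic sketch, not Pawny's actual code; the identifiers and the exact verification reduction are illustrative):

#define R 3                        /* null-move depth reduction */

extern void make_null(void);
extern void unmake_null(void);
extern int  search(int depth, int alpha, int beta);

/* Returns 1 only if the null-move fail-high survives a reduced-depth
   verification search of the real moves.                               */
int verified_null(int depth, int beta)
{
    make_null();
    int score = -search(depth - R - 1, -beta, -beta + 1);
    unmake_null();

    if (score < beta)
        return 0;                  /* null move failed low: no cutoff here */

    /* Verification: instead of cutting off at once, continue with a real
       search at reduced depth and keep the cutoff only if it is confirmed.
       The amount of reduction varies between implementations.             */
    return search(depth - R - 1, beta - 1, beta) >= beta;
}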