Stockfish null move pre-condition

Discussion of chess software programming and technical issues.

Moderators: hgm, Rebel, chrisw

Rein Halbersma
Posts: 741
Joined: Tue May 22, 2007 11:13 am

Stockfish null move pre-condition

Post by Rein Halbersma »

I wonder whether the pre-conditions in the current Stockfish (1.8) code for a null move search cannot be sharpened a bit.

If we get to the point of trying a null move search, the TT pruning code above it didn't produce a cutoff. In particular, it didn't fail low. One reason could be that the stored score was an upper bound below beta, but the entry's stored depth wasn't sufficient for the current search depth.

However, it might still be true that the entry's stored depth is at least depth - 5. In that case, after a null move search that fails high, the subsequent verification search at that same depth - 5 will produce a fail low as soon as it reaches the TT pruning code.

In other words, before trying the null move search, we could check whether the TT entry has a depth at least equal to depth - 5 and an upper bound below the current beta. In code:

Code:

    if (   !PvNode
        && !ss->skipNullMove
        &&  depth > OnePly
        &&  refinedValue >= beta - (depth >= 4 * OnePly ? NullMoveMargin : 0)
        && !isCheck
        && !value_is_mate(beta)
        &&  pos.non_pawn_material(pos.side_to_move())
        // check whether an eventual verification search would not fail low
        && !(depth >= 6 * OnePly && tte && fail_low_from_TT(tte, depth - 5 * OnePly, beta, ply)))
    {
        // do usual null move stuff here
    }
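
Here fail_low_from_TT is not an existing Stockfish function, only shorthand for the check I have in mind. A minimal sketch, built from the existing ok_to_use_TT() and value_from_tt() helpers, could look like this:

Code:

    // Sketch only: would a verification search of depth d fail low
    // immediately from this TT entry? True when the entry can be used
    // for a cutoff at depth d and its stored score is below beta.
    bool fail_low_from_TT(const TTEntry* tte, Depth d, Value beta, int ply)
    {
        return   ok_to_use_TT(tte, d, beta, ply)
              && value_from_tt(tte->value(), ply) < beta;
    }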
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: Stockfish null move pre-condition

Post by bob »

Rein Halbersma wrote:I wonder whether the pre-conditions in the current Stockfish (1.8) code for a null move search cannot be sharpened a bit.

If we get to the point of trying a null move search, the TT pruning code above it didn't produce a cutoff. In particular, it didn't fail low. One reason could be that the stored score was an upper bound below beta, but the entry's stored depth wasn't sufficient for the current search depth.

However, it might still be true that the entry's stored depth is at least depth - 5. In that case, after a null move search that fails high, the subsequent verification search at that same depth - 5 will produce a fail low as soon as it reaches the TT pruning code.

In other words, before trying the null move search, we could check whether the TT entry has a depth at least equal to depth - 5 and an upper bound below the current beta. In code:

Code:

    if (   !PvNode
        && !ss->skipNullMove
        &&  depth > OnePly
        &&  refinedValue >= beta - (depth >= 4 * OnePly ? NullMoveMargin : 0)
        && !isCheck
        && !value_is_mate(beta)
        &&  pos.non_pawn_material(pos.side_to_move())
        // check whether an eventual verification search would not fail low
        && !(depth >= 6 * OnePly && tte && fail_low_from_TT(tte, depth - 5 * OnePly, beta, ply)))
    {
        // do usual null move stuff here
    }
That's a common trick I thought everyone used. That was explained by Hsu/Campbell in some paper they wrote a long time ago...
Ralph Stoesser
Posts: 408
Joined: Sat Mar 06, 2010 9:28 am

Re: Stockfish null move pre-condition

Post by Ralph Stoesser »

Good idea, but in SF this trick has a potentially unwanted side effect.

If you do not enter the null move search code because you know from the TT entry that a verification search would fail low, you also skip the code that normally runs when the null move search itself fails low (see the sketch after this list):

- correction of a too aggressive LMR reduction from the previous ply
- storing the threat move for later use in futility pruning
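
For reference, the fail-low branch that would be skipped looks roughly like this. This is paraphrased from memory, not the literal 1.8 source; names such as ThreatDepth, connected_moves and the reduction/threatMove fields are approximate:

Code:

    else
    {
        // The null move search failed low: the side to move is probably
        // facing a threat. Remember the move that refuted the null move so
        // that later futility / move-count pruning can avoid discarding
        // defences against it.
        ss->threatMove = (ss+1)->currentMove;

        // If the previous ply was reduced by LMR and that reduced move is
        // connected to the threat, fail low here so that the parent node
        // re-searches the reduced move at full depth.
        if (   depth < ThreatDepth
            && (ss-1)->reduction
            && connected_moves(pos, (ss-1)->currentMove, ss->threatMove))
            return beta - 1;
    }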
mcostalba
Posts: 2684
Joined: Sat Jun 14, 2008 9:17 pm

Re: Stockfish null move pre-condition

Post by mcostalba »

With the following probing code:

Code:

    if (   !PvNode
        && !ss->skipNullMove
        &&  depth > OnePly
        &&  refinedValue >= beta - (depth >= 4 * OnePly ? NullMoveMargin : 0)
        && !isCheck
        && !value_is_mate(beta)
        &&  pos.non_pawn_material(pos.side_to_move()))
    {
        ss->currentMove = MOVE_NULL;

        bool skip =   depth >= 6 * OnePly
                   && tte
                   && ok_to_use_TT(tte, depth - 5 * OnePly, beta, ply)
                   && value_from_tt(tte->value(), ply) < beta;

        dbg_hit_on(skip);

        // Null move dynamic reduction based on depth
        int R = 3 + (depth >= 5 * OnePly ? depth / 8 : 0);
I get 11658 hits on 2329648 probes, so about 0.5% of cases. The fraction of such nodes that then also fail high after a null move (so that we actually perform a verification search on them) is even smaller: of 116227 fail-highs of the null search, only 132 belong to cases where the 'skip' variable was true (0.1%).

So I think this trick really doesn't change anything, and because it is a complication of the code it won't go in ;-)
Last edited by mcostalba on Fri Jul 23, 2010 3:08 am, edited 1 time in total.
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: Stockfish null move pre-condition

Post by bob »

Ralph Stoesser wrote:Good idea, but in SF this trick has a potentially unwanted side effect.

If you do not enter the null move search code because you know from the TT entry that a verification search would fail low, you also skip the code that normally runs when the null move search itself fails low:

- correction of a too aggressive LMR reduction from the previous ply
- storing the threat move for later use in futility pruning
I have not looked at their code, but the idea itself is sound. If the TT shows that the null-move search is guaranteed to fail low, then there is no point in doing it. And whatever you normally do when the null-move search fails low obviously ought to be done as well when the TT says it will fail low...
Rein Halbersma
Posts: 741
Joined: Tue May 22, 2007 11:13 am

Re: Stockfish null move pre-condition

Post by Rein Halbersma »

mcostalba wrote: I get 11658 hits on 2329648 probes, so about 0.5% of cases. The fraction of such nodes that then also fail high after a null move (so that we actually perform a verification search on them) is even smaller: of 116227 fail-highs of the null search, only 132 belong to cases where the 'skip' variable was true (0.1%).

So I think this trick really doesn't change anything, and because it is a complication of the code it won't go in ;-)
I guess your numbers reflect the fact that the TT value is already included in refinedValue, which is therefore not too far below beta. So refinedValue must lie in the interval [beta - NullMoveMargin, infinity], and your numbers say that with 99.5% probability it falls in the [beta, infinity] part of that range. It makes sense not to include the extra test for such a small exception. Thanks anyway for checking it.
bob
Posts: 20943
Joined: Mon Feb 27, 2006 7:30 pm
Location: Birmingham, AL

Re: Stockfish null move pre-condition

Post by bob »

mcostalba wrote: With the following probing code:

Code:

    if (   !PvNode
        && !ss->skipNullMove
        &&  depth > OnePly
        &&  refinedValue >= beta - (depth >= 4 * OnePly ? NullMoveMargin : 0)
        && !isCheck
        && !value_is_mate(beta)
        &&  pos.non_pawn_material(pos.side_to_move()))
    {
        ss->currentMove = MOVE_NULL;

        bool skip =   depth >= 6 * OnePly
                   && tte
                   && ok_to_use_TT(tte, depth - 5 * OnePly, beta, ply)
                   && value_from_tt(tte->value(), ply) < beta;

        dbg_hit_on(skip);

        // Null move dynamic reduction based on depth
        int R = 3 + (depth >= 5 * OnePly ? depth / 8 : 0);
I get 11658 hits on 2329648 probes, so about 0.5% of cases. The fraction of such nodes that then also fail high after a null move (so that we actually perform a verification search on them) is even smaller: of 116227 fail-highs of the null search, only 132 belong to cases where the 'skip' variable was true (0.1%).

So I think this trick really doesn't change anything, and because it is a complication of the code it won't go in ;-)
It will make measurably smaller trees at no cost. How much smaller depends on lots of things, but avoiding a wasted null-move search can't possibly be bad, and it doesn't add more than a line of code, which is hardly a huge increase in complexity.
Ralph Stoesser
Posts: 408
Joined: Sat Mar 06, 2010 9:28 am

Re: Stockfish null move pre-condition

Post by Ralph Stoesser »

bob wrote: It will make measurably smaller trees at no cost. How much smaller depends on lots of things, but avoiding a wasted null-move search can't possibly be bad, and it doesn't add more than a line of code, which is hardly a huge increase in complexity.
Not so clear for SF 1.8. At some depths it makes the trees smaller, at other depths it makes the trees bigger. Measured with SF bench for depths 12, 14, 16, 18 and 20.
mcostalba
Posts: 2684
Joined: Sat Jun 14, 2008 9:17 pm

Re: Stockfish null move pre-condition

Post by mcostalba »

Ralph Stoesser wrote:
bob wrote: It will make measurably smaller trees at no cost. How much smaller depends on lots of things, but avoiding a wasted null-move search can't possibly be bad, and it doesn't add more than a line of code, which is hardly a huge increase in complexity.
Not so clear for SF 1.8. At some depths it makes the trees smaller, at other depths it makes the trees bigger. Measured with SF bench for depths 12, 14, 16, 18 and 20.
Of course. The change is so small and so stupid that what you measure is almost pure random noise: if you take two engine versions A and B that differ only by a small thing that gives no Elo at all, you will still measure that on some positions A produces a smaller tree than B (and sometimes the opposite), and on some positions even a much smaller tree.

I once experienced this myself: due to a mistake, instead of

Code:

const Score TempoValue = make_score(48, 22);
what went in was

Code:

const Score TempoValue = make_score(48, 21);
The difference (at that point of SF development; I don't know about now) was huge on some positions, even though the change was clearly harmless.

So what you actually measured is _not_ the effect of the change, but the random noise that is typical when using test positions for testing.

The only lesson from this exercise is that you need real games to reliably validate a change... or, failing that, your judgment. For instance, I judge the change absolutely worthless and it won't go in :-) Its only effect would be that people reading the code ask themselves: "Why did they add such a rule? Perhaps it's a good idea!" But because it is not, I don't want people to be misled when reading SF.

BTW, because that line of code is always executed and is needed only once in a million cases, there is the possibility that the 0.3 Elo it _might_ gain is completely offset by the slowdown introduced at every node.

Of course I won't spend even 30 seconds to actually measure this trade-off.
Tord Romstad
Posts: 1808
Joined: Wed Mar 08, 2006 9:19 pm
Location: Oslo, Norway

Re: Stockfish null move pre-condition

Post by Tord Romstad »

bob wrote:It will make measurably smaller trees at no cost.
We've had exactly this discussion several times before.

What you're missing is that we don't use just the score returned by the null move search, but also the move that refutes the null move. We use this move for pruning decisions. Without it, our forward pruning is less accurate.
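
Schematically, the way the refutation move gets used later in the search looks like this. This is a sketch from memory, not the literal source; identifiers such as threatMove, connected_threat and futility_move_count are approximate:

Code:

    // Inside the main moves loop, when deciding whether to prune a quiet
    // move late in the move list (sketch only):
    if (   !captureOrPromotion
        && !isCheck
        && moveCount >= futility_move_count(depth)
        // but never prune a move that is connected to the threat, i.e. to
        // the move that refuted the null move earlier at this node
        && !(threatMove != MOVE_NONE && connected_threat(pos, move, threatMove)))
        continue;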