Onno Garms wrote:
Code:
// Even with a move opponent is not so well off.
// So null move pruning is unlikely to be successful.
if (SearchInnerRc::null_skip_by_trans && trans.max_value <= p_alpha && trans.max_depth >= p_depth - SearchInnerRc::null_reduction)
    *v_no_nmp = true;
SmarThink declines null move more aggressively. You probably don't need the "trans.max_depth >= p_depth - SearchInnerRc::null_reduction" condition: if the table shows a fail-low at any depth, the position already looks too dangerous to try a null move.
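If that depth condition is dropped, the check collapses to a pure bound test. A minimal sketch of the simplified condition (the names mirror the quoted snippet but this is not SmarThink's actual code):

```cpp
#include <cassert>

// Hypothetical simplified test: skip null move whenever the transposition
// table upper bound already shows we can't reach alpha, at ANY stored depth.
bool skip_null_by_trans(bool null_skip_by_trans_enabled,
                        int trans_max_value, int alpha) {
    // The depth condition from the quoted snippet is intentionally gone:
    // a fail-low recorded at any depth is taken as "too dangerous".
    return null_skip_by_trans_enabled && trans_max_value <= alpha;
}
```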
Moreover, I'm using a kind of IID with null move — searching it with depth = 0, 1, 2, ... up to (depth - null_reduction). Only if every search (including the simple static eval at depth 0) returns a value >= beta do I make the cutoff.
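That verification loop could be sketched like this (`search_null` is a stand-in for the engine's null-move search at a given depth, with depth 0 meaning static eval; all names here are my assumptions, not SmarThink's code):

```cpp
#include <cassert>
#include <functional>

// Hypothetical sketch: confirm a null-move cutoff by re-searching the
// null move at every depth from 0 up to depth - null_reduction.
bool null_cutoff_verified(int depth, int null_reduction, int beta,
                          const std::function<int(int)>& search_null) {
    // d = 0 corresponds to the simple static evaluation.
    for (int d = 0; d <= depth - null_reduction; ++d) {
        if (search_null(d) < beta)
            return false;  // one failure is enough to reject the cutoff
    }
    return true;  // every search failed high: make the cutoff
}
```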
This approach is combined with a relatively larger null-move reduction than other programs use.
I think the null-move subtree is very specific, and it's probably a good idea to reduce more there. Wrongly declining a null-move cutoff is relatively safe — you just search more — and if that cost is compensated by enough pruned nodes in the subtree, it's a clear advantage.
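For scale, many programs already grow the reduction with depth. One common shape of an adaptive reduction looks like the function below — the constants are purely illustrative, not SmarThink's actual numbers:

```cpp
#include <cassert>

// Typical adaptive null-move reduction: a base of about 3 plies plus a
// depth-dependent term. Constants here are illustrative only.
int null_reduction(int depth) {
    return 3 + depth / 6;   // e.g. depth 12 gives R = 5
}
```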
I think it's worth trying flags in the search that indicate whether a null move has already been made for each side. In a subtree where a white null move was made, we can prune white moves more aggressively even though these cutoffs will be formally incorrect — for example, exclude moves with SEE < 0, search no more than 8 quiet white moves, and drop moves that can't return a value >= beta at a heavily reduced depth. Moreover, the resulting score can be more valuable, because it excludes some non-obvious refutations of the answers to the null move; if the answers to the null move are hard to refute, it's probably better to make a full search to explore this position.
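A per-move filter for the white-null-move subtree might look like the sketch below. The `Move` struct, its fields, and the 8-quiet-move cap are illustrative assumptions taken from the examples above, not an actual engine's data structures:

```cpp
#include <cassert>

// Minimal illustrative move descriptor.
struct Move {
    int  see;       // static exchange evaluation score
    bool is_quiet;  // neither capture nor promotion
};

// Inside a subtree where WHITE already made a null move, prune white
// moves more aggressively, accepting formally incorrect cutoffs.
bool prune_white_move(const Move& m, int quiet_moves_searched) {
    if (m.see < 0)
        return true;                      // losing exchanges: skip
    if (m.is_quiet && quiet_moves_searched >= 8)
        return true;                      // cap on quiet white moves
    return false;                         // otherwise search the move
}
```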
I think it's possible to tune this framework very well.