Gerd Isenberg wrote:
> bob wrote:
> > Yes, but read my explanation. You have one more move before you drop to the q-search. Rather than making it and then calling the q-search, you just ask "will this produce a stand-pat fail-high if I try it and advance to the next ply?" and if the answer is "yes" you just save the effort of making/unmaking the move. It really isn't pruning as I would define it. The only "risk" is that the move you don't make will also have an influence on the stand-pat score, so you toss in an error_margin type value to hopefully account for that.
>
> Of course futility pruning (depth == 1) only saves make/unmake (and (lazy) eval); nevertheless it is forward pruning per definition, and even more so extended futility pruning and limited razoring. This is Heinz's code:

I understand his code; it is in Crafty as well. My point was that this is less a pruning idea, since we are not really pruning anything; we are just short-cutting the search by avoiding the make/unmake/recursive-search call. Very much like lazy eval avoids executing code that we notice is not going to help us at all.
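The frontier-node test described above can be sketched in a few lines of C. Everything here is an illustrative stand-in, not Crafty's or Heinz's actual code: the margin value is invented, and mat_balance/mat_gain are hypothetical inputs standing in for an engine's real material terms.

```c
/* Hypothetical margin, roughly a pawn plus change; real engines tune this. */
#define FUTILITY_MARGIN 125

/* Frontier-node (depth == 1) futility test: before making the last move
 * prior to the quiescence search, ask whether even an optimistic outcome
 * could raise the score above alpha.  "mat_balance" is the material
 * balance from the side to move's view and "mat_gain" is the value of
 * whatever the move captures; both are illustrative stand-ins. */
static int is_futile(int mat_balance, int mat_gain, int alpha)
{
    /* If material + captured material + error margin still cannot reach
     * alpha, the child node would stand pat below alpha anyway, so we can
     * skip the make/unmake/recursive-search round trip entirely. */
    return mat_balance + mat_gain + FUTILITY_MARGIN <= alpha;
}
```

The error margin is exactly the hedge described above: it covers the chance that the unmade move itself would have nudged the stand-pat score.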
And he is not "skipping the moves": if you look at his code for the other versions, he just reduces the depth by 1 ply; he never avoids searching them, at least not in the copy of his book that I used when implementing this stuff in Crafty when I tested it last year:
int search(int alpha, int beta, int depth, int move, node parent)
{
    node current;
    int extend, fmax, fprune, fscore, score, selective;

    /* execute the opponent's move and initialize some local variables */
    make_move(parent, move, &current);
    fprune = selective = 0;
    score = -INFINITY;

    /* determine if and how to extend the search at the current node */
    extend = extensions(depth, move, current);
    depth += extend;

    /* decide about limited razoring at pre-pre-frontier nodes */
    fscore = (mat_balance(current) + razor_margin);
    if (!extend && (depth == PRE_PRE_FRONTIER) && (fscore <= alpha)
        && (opposing_pieces(current) > 3))
        depth = PRE_FRONTIER;

    /* decide about extended futility pruning at pre-frontier nodes */
    fscore = (mat_balance(current) + extd_futil_margin);
    if (!extend && (depth == PRE_FRONTIER) && (fscore <= alpha)) {
        fprune = selective = 1;
        score = fmax = fscore;
    }

    /* decide about selective futility pruning at frontier nodes */
    fscore = (mat_balance(current) + futil_margin);
    if (!check(move) && (depth == FRONTIER) && (fscore <= alpha)) {
        fprune = selective = 1;
        score = fmax = fscore;
    }

    /* "selective == 1" and "-INFINITY < score <= alpha" hold if
     * the search will be selective at the current node.
     *
     * continue like normal but prune all futile moves if "fprune == 1"
     */
    ...
    for (move = first(current); (move != 0); move = next(current, move))
        if (!fprune || check(move)          /* recursive search call */
            || (fmax + mat_gain(move) > alpha)) {
            ...
        }
    ...
}

Figure 1: Search with Extended Futility Pruning and Limited Razoring.
If you consider reductions to be pruning, then we are still in agreement, although I believe this is more correctly called a reduction approach, since that is the effective result. Typically, in AI, "pruning" means to take a branch and declare "we will not search this branch at all," which is not exactly what a reduction does, although in effect it does, since the reduced depth eliminates some branches automatically.
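The distinction above can be made concrete with a toy node-count experiment. This is purely illustrative (a uniform tree with an invented branching factor, not a real search): a reduction never skips a branch outright, yet searching some children one ply shallower still eliminates sub-branches automatically, so fewer nodes get visited.

```c
/* Toy uniform tree: every node has BRANCH children down to depth 0.
 * Counting visited nodes shows how a reduction shrinks the tree even
 * though every branch is still entered.  All names and numbers here are
 * illustrative, not taken from any engine. */
#define BRANCH 3

static long nodes;

static void walk(int depth, int reduce_odd_children)
{
    nodes++;
    if (depth <= 0)
        return;
    for (int i = 0; i < BRANCH; i++) {
        /* A reduction searches the "uninteresting" children (here,
         * arbitrarily, the odd-indexed ones) one ply shallower instead
         * of skipping them entirely. */
        int r = (reduce_odd_children && (i & 1)) ? 1 : 0;
        walk(depth - 1 - r, reduce_odd_children);
    }
}
```

Walking this tree to depth 3 visits 40 nodes without the reduction and 25 with it: no branch was declared "we will not search this at all," yet a chunk of the tree disappeared anyway, which is exactly the "in effect it does" point.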