I tried to follow Knuth's analysis to do the same for an LMR-reduced tree, but it is getting complicated already even without applying alpha-beta, i.e. just counting the number of leaf nodes of an LMR-reduced minimax tree. I will apply alpha-beta on top of the different pruning methods (LMR & null move) later, to see by how much LMR and null move can improve upon the lower/upper bounds for alpha-beta given in Knuth. It is going to be really difficult (if at all possible), since even Knuth originally made a mistake analysing alpha-beta. Anyway, given a minimax tree of width b and depth d, and assuming d > b for the sake of continuity:
-------------
case 1: Reduce k moves out of b, all by the same amount of 1. The number of leaf nodes at depth d-i follows a binomial distribution!
Code: Select all
Nodes at depth d-i = C(d,i) (k)^(i) (b-k)^(d-i)
We can rewrite it in terms of the probability of reducing a node, p = k / b
Code: Select all
Nodes at depth d-i = [ C(d,i) (p)^(i) (1-p)^(d-i) ] * (b^d)
The binomial term is the only difference from the plain minimax tree, which has all of its b^d leaf nodes at depth d
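Case 1 is easy to sanity-check by brute force in a toy model where every path makes exactly d moves and each reduced move lowers the final leaf depth by 1 (the model and function names are my own; I haven't run this against a real engine):

```python
from itertools import product
from math import comb

def leaf_depths_brute_force(b, k, d):
    """Count leaves by depth; moves 0..k-1 are the reduced ones."""
    counts = [0] * (d + 1)
    for path in product(range(b), repeat=d):
        i = sum(1 for move in path if move < k)  # reductions along this path
        counts[d - i] += 1                       # leaf lands at depth d - i
    return counts

def leaf_depths_formula(b, k, d):
    """Case 1 formula: nodes at depth d-i = C(d,i) * k^i * (b-k)^(d-i)."""
    counts = [0] * (d + 1)
    for i in range(d + 1):
        counts[d - i] = comb(d, i) * k**i * (b - k)**(d - i)
    return counts

# Small example: width 3, one reduced move per node, depth 4 (81 paths)
assert leaf_depths_brute_force(3, 1, 4) == leaf_depths_formula(3, 1, 4)
```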
------------------
case 2: Reduction by different amounts, e.g. reduce by 1 more every 8th move, as is common now in many engines.
Assume
Code: Select all
x_i = the number of moves (out of b) reduced by i
y_i = the number of times a reduction by i happened along the path to the leaf
Then this is a case of the multinomial distribution! That makes sense, because it has to converge to case 1 in the reduce-by-1-or-not case.
Code: Select all
Nodes at depth d - sum(y_i * i) = (d! / (y0! y1! ... yk!)) x0^y0 x1^y1 ... xk^yk, with y0 + y1 + ... + yk = d.
Note that we have to sum node-count contributions from the different reduction paths to get the total at a ply. A total reduction of 3, i.e. leaves at d-3, can be obtained by any of the reduction combinations (1,1,1), (2,1,0) or (3,0,0). It is difficult to write that out in closed form (or at least I lack the mathematical savviness). In any case this is too complicated for further analysis with alpha-beta, so I will use case (1) henceforth.
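The multinomial count can also be checked by brute force on the same toy model. Note that the factorials in the denominator must run over the y_i (the per-path counts, which sum to d), not the x_i; that is exactly what makes it collapse to case 1. The reduction schedule below is my own illustrative choice:

```python
from itertools import product
from math import factorial, prod

def leaves_by_reduction_brute(reds, d):
    """reds[m] = reduction applied to move m; count leaves by total reduction."""
    counts = {}
    for path in product(range(len(reds)), repeat=d):
        r = sum(reds[m] for m in path)  # this leaf lands at depth d - r
        counts[r] = counts.get(r, 0) + 1
    return counts

def leaves_by_reduction_formula(x, d):
    """x[i] = number of moves reduced by i; multinomial coefficient over the y_i."""
    counts = {}
    for y in product(range(d + 1), repeat=len(x)):
        if sum(y) != d:
            continue
        coef = factorial(d)
        for yi in y:
            coef //= factorial(yi)
        term = coef * prod(xi**yi for xi, yi in zip(x, y))
        r = sum(i * yi for i, yi in enumerate(y))
        counts[r] = counts.get(r, 0) + term
    return counts

# b = 4 moves: one unreduced, two reduced by 1, one reduced by 2
assert leaves_by_reduction_brute([0, 1, 1, 2], 3) == leaves_by_reduction_formula([1, 2, 1], 3)
```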
-------
case 3: Random reduction, i.e. a node has a 50-50 chance of being reduced or not, i.e. p = 1/2.
Code: Select all
Nodes at depth d-i = C(d,i) (b/2)^d
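This is just case 1 with k = b/2, and a quick check confirms that the total leaf count stays b^d; the reduction only shifts leaves to shallower depths (example values are my own):

```python
from math import comb

b, d = 8, 6  # example width and depth; b even so that k = b/2 is an integer
dist = [comb(d, i) * (b // 2) ** d for i in range(d + 1)]  # leaves at depth d - i
assert sum(dist) == b ** d  # total leaf count unchanged; only the depths shift
```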
----------
case 4: The simplified case that I will be using for analyzing LMR with alpha-beta: all moves except the first one are reduced by 1. That is pretty much what many real engines do anyway.
Code: Select all
Nodes at depth d-i = C(d,i) (b-1)^(i)
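From case 1's formula, i is Binomial(d, k/b), so the expected leaf depth in this model is d*(b-k)/b; for the reduce-all-but-the-first-move scheme (k = b-1) that drops all the way to d/b. A quick numeric check (my own observation and function name, not something verified in an engine):

```python
from math import comb

def expected_leaf_depth(b, k, d):
    """Mean leaf depth under case 1: i ~ Binomial(d, k/b), so E[d - i] = d*(b-k)/b."""
    total = b ** d
    return sum((d - i) * comb(d, i) * k**i * (b - k)**(d - i)
               for i in range(d + 1)) / total

# Reduce-all-but-one scheme, k = b - 1: mean leaf depth collapses to d / b
assert abs(expected_leaf_depth(4, 3, 6) - 6 / 4) < 1e-9
```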
----------
Analysing null move is hard and I haven't started on it in earnest, but here are my thoughts.
a) The probability of a cutoff by null move has to be assumed, say q. If q = 0, i.e. no cutoffs, null move is a burden, as it just increases the branching factor a little.
b) Consecutive null moves are not allowed, i.e. a path looks like move, NM, move, NM, move. So at odd plies the branching factor increases by 1 to b+1, and at even plies it stays b. It may be better to disregard the order and assume we have ceil(n/2) real moves with floor(n/2) null moves.
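Setting the reduction R aside and just counting the extra branching from (b), the alternating pattern gives (b+1)^ceil(d/2) * b^floor(d/2) full-length paths instead of b^d, which puts a number on the "null move is a burden when q = 0" remark (a sketch under that simplification; the function name is mine):

```python
def alternating_leaf_count(b, d):
    """Null move adds one extra child at every other ply (no two NMs in a row)."""
    leaves = 1
    for ply in range(d):
        leaves *= (b + 1) if ply % 2 == 0 else b
    return leaves

# Matches the closed form (b+1)^ceil(d/2) * b^floor(d/2)
b, d = 3, 5
assert alternating_leaf_count(b, d) == (b + 1) ** ((d + 1) // 2) * b ** (d // 2)
```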
c) The reduction amount R has to be assumed and the tree analysed like LMR. If null move is added on top of LMR, the distribution will always be multinomial.
It looks like finishing the null-move analysis is a daunting task with all these additions.
-----------------
To classify nodes into PV/CUT/ALL we need to apply alpha-beta, and I haven't started doing that. I guess that this analysis may help to identify by how much LMR and null move can reduce an alpha-beta tree. Both are unsound prunings, since they can prune beyond the minimal tree, whereas with alpha-beta alone we would be stuck with sqrt(b^d). Determining upper bounds with random pruning and reductions is far more complicated than this, and I doubt I can work it out by myself, or that it is even possible. Knuth assumed no deep cutoffs in his alpha-beta analysis, even though he added them in later sections. It is just too much right now.
Perft buddies can join in, I guess; only this time the problem is to work out the number of cut nodes and leaf nodes for different pruned trees. I haven't simulated or checked with a program whether what I am doing is correct, but I will do that later with TSCP. My laptop is toasted after I spilled liquid on it, so I won't be writing programs quickly.
Daniel