Ken Thompson's 1996 paper

Discussion of chess software programming and technical issues.

Moderator: Ras

syzygy
Posts: 5904
Joined: Tue Feb 28, 2012 11:56 pm

Ken Thompson's 1996 paper

Post by syzygy »

I am re-reading Ken Thompson's 1996 paper on generating 6-piece endgame databases:
https://ia800405.us.archive.org/21/item ... ndgame.pdf

It seems to me the algorithm he presents contains an unnecessary (sub)step.

In Figure 3 he presents an improved version of his 1986 algorithm:
Figure 3 wrote:
1. initial mates over all positions

B = btmmate(~0)
BTM = 0
WTM = 0
n = 0

2. record last pass

BTM |= B
RES |= res(B, n)
n += 1

3. White to move and win in n

W = wunmove(B)
WTM |= W

4. Black to move and might lose in n

X = bunmove(W & ~WTM)

5. verify Black to move and lose in n

B = bmove(X & ~BTM) => WTM
goto 2
(The improvement consists of moving the "& ~WTM" and "& ~BTM" operations from steps 3 and 4 to steps 4 and 5. As far as I understand, this was important for his implementation because his PC did not have enough RAM to store a complete 6-piece bitmap: the positions produced by an "unmove" pass come out in essentially random order, so masking them immediately would require random probes into the WTM/BTM files, whereas applying the mask to the input of the next step just ANDs two complete bitmaps that can both be streamed in index order. Some random I/O was thus replaced with sequential I/O.)

In his algorithm, B, W, BTM and WTM are all bitmaps:
W = white-to-move wins in n
B = black-to-move losses in n
WTM and BTM are "summation" bitmaps, i.e. WTM = white-to-move wins in <= n, BTM = black-to-move losses in <= n.
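To have something concrete to refer to, here is a minimal in-RAM sketch in C of how I read the Figure 3 loop. This is a paraphrase, not Thompson's code: the chess-specific primitives (btmmate, wunmove, bunmove, bmove, named as in the figure) are left as empty stubs, the bitmap helpers are mine, and I apply the "& ~WTM" mask to W before the "WTM |= W" update, since the mask in step 4 can only refer to the WTM of the previous pass.

Code: Select all

   #include <stdint.h>
   #include <stdio.h>
   #include <stdlib.h>

   /* One bit per position; NPOS here is the 5-piece size discussed
      below (462 king placements x 64^3 for the other three pieces). */
   #define NPOS   (462ULL * 64 * 64 * 64)
   #define NWORDS ((size_t)((NPOS + 63) / 64))

   typedef uint64_t *bitmap;

   static bitmap bm_new(void) { return calloc(NWORDS, sizeof(uint64_t)); }

   static void bm_or(bitmap d, bitmap a, bitmap b) {     /* d = a | b  */
       for (size_t i = 0; i < NWORDS; i++) d[i] = a[i] | b[i];
   }
   static void bm_andnot(bitmap d, bitmap a, bitmap b) { /* d = a & ~b */
       for (size_t i = 0; i < NWORDS; i++) d[i] = a[i] & ~b[i];
   }
   static int bm_empty(bitmap a) {
       for (size_t i = 0; i < NWORDS; i++) if (a[i]) return 0;
       return 1;
   }

   /* Chess-specific primitives, stubbed out here; a real generator
      implements them from the position encoding. */
   static void btmmate(bitmap out) { (void)out; }         /* btm mates */
   static void wunmove(bitmap out, bitmap b) { (void)out; (void)b; }
   static void bunmove(bitmap out, bitmap w) { (void)out; (void)w; }
   /* "bmove(X) => WTM": keep only the positions of x all of whose
      moves lead into wtm, i.e. the verified btm losses. */
   static void bmove(bitmap out, bitmap x, bitmap wtm) {
       (void)out; (void)x; (void)wtm;
   }

   int main(void) {
       bitmap B = bm_new(), W = bm_new(), X = bm_new();
       bitmap BTM = bm_new(), WTM = bm_new();
       int n = 0;

       btmmate(B);                /* 1. initial mates over all positions */
       for (;;) {
           bm_or(BTM, BTM, B);    /* 2. BTM |= B; RES |= res(B, n)       */
           n++;

           wunmove(W, B);         /* 3. wtm predecessors of B            */
           bm_andnot(W, W, WTM);  /*    keep only the new wins in n      */
           bm_or(WTM, WTM, W);    /*    WTM |= W                         */

           bunmove(X, W);         /* 4. btm positions that might lose    */
           bm_andnot(X, X, BTM);  /* 5. the "& ~BTM" mask at issue       */
           bmove(B, X, WTM);      /*    verify: B = btm losses in n      */

           if (bm_empty(B))       /* Figure 3 just says "goto 2"; stop   */
               break;             /* once a pass finds nothing new       */
       }
       printf("last n = %d\n", n);
       return 0;
   }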

In step 3/4, the wtm wins in n are found by "unmoving" the btm losses in (n-1) and (in step 4) removing the wins in <= n-1 found earlier ("& ~WTM").

In step 4/5, the bitmap X of potential losses in n is calculated by "unmoving" the wtm wins-in-n and (in step 5) removing the losses in <= n-1 found earlier ("& ~BTM").

However, a btm position found by unmoving from a win-in-n wtm position has at least one move to a win-in-n position, and therefore cannot lose faster than in n: Black, trying to hold out as long as possible, can always play that move. It is therefore impossible for the "& ~BTM" operation, which removes losses in <= n-1, to remove any positions from bunmove(W). In other words, step 5 can be simplified to:
B = bmove(X) => WTM
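If one wanted to test this claim mechanically in an in-RAM generator like the sketch above, the argument says that X and BTM are disjoint at this point, so an assert in place of the mask should never fire. A hypothetical drop-in for steps 4/5:

Code: Select all

   #include <assert.h>

   /* Steps 4/5 without the "& ~BTM" mask, plus a run-time check of
      the claimed invariant (reuses the bitmap helpers from above). */
   static void step5_unmasked(bitmap B, bitmap X, bitmap W,
                              bitmap BTM, bitmap WTM)
   {
       bunmove(X, W);                       /* 4. candidate losses in n */
       for (size_t i = 0; i < NWORDS; i++)  /* X should be disjoint     */
           assert((X[i] & BTM[i]) == 0);    /* from all earlier losses  */
       bmove(B, X, WTM);                    /* 5. verify, mask dropped  */
   }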

Note that Figure 12 of the paper, which shows code for step 5, indeed contains two lines corresponding to "& ~BTM":

Code: Select all

   BTM = ioinit("BTM, RAND|IN);
and

Code: Select all

   if(bittst(BTM, &g)) continue;
The first line should clearly have read "BTM = ioinit("BTM", SEQ|IN);".
See also the text describing Fig. 12, which mentions "and test it for a new Black-to-move position".
I suppose the actual code of his generator is not available. Otherwise we could check whether he could have saved a few days of generation time. (Generating KRBvKBN apparently took three months.)
syzygy
Posts: 5904
Joined: Tue Feb 28, 2012 11:56 pm

Re: Ken Thompson's 1996 paper

Post by syzygy »

Regarding his lack of RAM:

In 1986 he used a machine with 16 MB of RAM. This was indeed just about sufficient for a full bitmap of 462x64x64x64 positions = 121110528 bits = 14.4375 MB.

In 1996 he used a machine with 250 MB of RAM. But now his scheme would have required 924 MB of RAM to store a full bitmap (the paper mentions 970 MB; his MBs are apparently 1000x1000 bytes). So the random-access passes of his algorithm must have run quite slowly. But with the kings as the leading pieces in his indexing scheme, access locality should not have been too bad.

With my current K-slicing scheme, 40 MB of RAM (20x64x64x64x64/8 bytes) would have been enough, so a 48 MB PC could have run his algorithm essentially without random I/O. (With more compact indexing it would have been 31.92 MB.)
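For reference, the arithmetic behind these numbers (a throwaway snippet; I write MiB for 2^20 bytes, which is what my "MB" above means, as opposed to Thompson's decimal MB):

Code: Select all

   #include <stdint.h>
   #include <stdio.h>

   int main(void) {
       /* 1986, 5 pieces: one bit per position */
       uint64_t b5 = 462ULL * 64 * 64 * 64;
       printf("5-piece: %.4f MiB\n", b5 / 8.0 / (1 << 20));

       /* 1996, 6 pieces: one more factor of 64 */
       uint64_t b6 = 462ULL * 64 * 64 * 64 * 64;
       printf("6-piece: %.1f decimal MB = %.1f MiB\n",
              b6 / 8.0 / 1e6, b6 / 8.0 / (1 << 20));

       /* one slice of the K-slicing scheme: 20 x 64^4 positions */
       uint64_t sl = 20ULL * 64 * 64 * 64 * 64;
       printf("K-slice: %.1f MiB\n", sl / 8.0 / (1 << 20));
       return 0;
   }

This prints 14.4375, 968.9 / 924.0 and 40.0, in line with the figures above.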
towforce
Posts: 12837
Joined: Thu Mar 09, 2006 12:57 am
Location: Birmingham UK
Full name: Graham Laight

Re: Ken Thompson's 1996 paper

Post by towforce »

Worth asking him: he might want to be left alone in retirement, or he might be happy to answer!

https://en.wikipedia.org/wiki/Ken_Thompson
Human chess is partly about tactics and strategy, but mostly about memory