Fast SEE (Ed's lookup revisited)

Fast SEE (Ed's lookup revisited)

Post by hgm »

As I was in need of a fast and simple SEE for directing an MC search, I was thinking in the direction of table lookups. Ed's lookup method is pretty nice, but I would prefer slightly higher accuracy and smaller tables (to reduce cache pressure).

To make accumulating the attackers and protectors of a square simple, we can just add codes for the individual pieces, according to the scheme P=1, R=3, K=9, minor=18, Q=72. I don't require the system to work with extra pieces from promotion, so this makes the maximum attack/protect code 143. (Note that each square can be attacked by at most one Bishop.)
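As a minimal sketch, using constant names of my own choosing, accumulating such a code could look like:

// piece codes for accumulating the attackers (or protectors) of a square;
// without promoted extra pieces the sum can never exceed 143
enum { CODE_P = 1, CODE_R = 3, CODE_K = 9, CODE_MINOR = 18, CODE_Q = 72 };

// example: a square attacked by two Pawns, a Knight and the Queen
int attackers = 2*CODE_P + CODE_MINOR + CODE_Q;  // = 92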

I also want to handle X-ray attacks, though. This poses a problem when a lower piece X-rays a higher one, which can happen when a Bishop or Rook(s) X-ray a Queen. (Note that X-rays through a King are for practical purposes just a King, as the King will never be captured and thus permanently blocks any X-raying piece.) By counting a Bishop X-raying a Queen for 72, rather than the normal 18, codes 144-197 would be used to describe the situations where this occurs. (Note there can be at most two direct minor attacks in this case.) By letting a Rook that X-rays a Queen count as 7*18 = 126 instead of 3, and a Rook X-raying both Q and R as 6, the codes 198-269 would be used to indicate the cases involving those.
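Purely as an illustration (the constant names are mine, and detecting the X-ray configuration is assumed to be done while scanning the ray towards the square), those special contributions would be:

// special contributions for the X-ray cases; the normal codes are
// P=1, R=3, K=9, minor=18, Q=72
enum {
    XRAY_B_BEHIND_Q  = 72,     // Bishop X-raying a Queen, instead of 18
    XRAY_R_BEHIND_Q  = 7*18,   // = 126, Rook X-raying a Queen, instead of 3
    XRAY_R_BEHIND_QR = 6       // Rook X-raying both Q and R
};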

The resulting code would be a 'perfect hash' for the attacker (or protector) set of the square, uniquely specifying the latter without any data loss. Unfortunately 270 > 256, so if we want to squeeze this info into a byte, the 14 codes 256-269 would overflow, which would cause the attacker set to be misidentified in a most horrible way. But these codes all correspond to situations with at least a QB battery, two N and one R attacking, plus one more piece (P, R or K), which is a rather unlikely situation, to put it mildly.
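To make the 'perfect hash' property concrete: for the plain codes 0-143 (the X-ray codes 144-269 would need their own treatment) the individual piece counts can be recovered again by simple division. A hypothetical decoding helper, not part of the proposal itself:

// recover the piece counts from a plain (non-X-ray) attack code 0..143;
// possible only because the code is a perfect hash of the attacker set
void decode(int code, int count[5])  // count = {P, R, K, minor, Q}
{
    count[4] = code / 72;       // Queens: 0..1
    count[3] = code % 72 / 18;  // minors: 0..3
    count[2] = code % 18 / 9;   // Kings:  0..1
    count[1] = code % 9  / 3;   // Rooks:  0..2
    count[0] = code % 3;        // Pawns:  0..2
}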

With Ed's lookup the table was 14 x 256 x 256, because it was not indexed only by encodings of the attackers and protectors sets, but also by the occupant of the target square. But there is no need to tabulate that, as the result of the LVA capture to that square will simply be the value of the victim there, minus the SEE the move would have had if the square had been empty. So we would only have to tabulate the latter in a 256 x 256 (or 270 x 270) table, and can then calculate value[victim] - SEE[attackers][protectors]. It doesn't make much difference whether we add something to the index before the lookup or something to the result after it.
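As a sketch of that two-step computation (seeEmpty, value and the argument names are illustrative, not taken from any existing engine):

// the big table only holds the tabulated result for a capture to an empty
// square; the occupant of the target square is accounted for afterwards
extern unsigned char seeEmpty[270][270];  // indexed by attacker and protector codes
extern int value[];                       // piece values in Pawn units

int lvaSEE(int victim, int attackers, int protectors)
{
    return value[victim] - seeEmpty[attackers][protectors];
}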

A 256 x 256 table is still a bit larger than I would like it to be, however. (But it could be that this is just paranoia on my part; it might be very sparsely accessed in a real search, so that it hardly causes any cache pressure.) To shrink it further, I would run the individual attackers and protectors sets through a one-dimensional lookup table to reduce the information they contain to the total number of attackers, and the types of the least-valuable two. There are 5 cases of a single attacker (P, minor, R, Q or K), and of course the case of no attackers at all. Then there are 15 possible combinations of two attackers (m = B or N):

PP, Pm, mm, PR, mR, RR, PQ, mQ, RQ, QB (battery), QR (battery), PK, mK, RK, QK.

So in total there are 21 combinations, which in the 1-d lookup tables could be encoded as 0-20. By taking the protectors from a different table, which contains the same values but multiplied by 21, and then adding the two, the encodings that specify both the two lowest attackers and the two lowest protectors would run from 0-440, which fits in 9 bits. The bits just above that could be used to store the total protector count, or the negated attacker count, so that in the sum they would contain nr_of_protectors - nr_of_attackers. The sign of this would presumably have a big influence on the estimated outcome of a SEE that gets deeper than 4 ply, as it determines who would have the last capture if the exchange were continued to the very end. (Which often does not happen; e.g. two Pawn protectors would beat any non-Pawn attackers, no matter how many more attackers there are.) We would then need two 441-entry lookup tables, indexed by the low-order 9 bits of the sum, to get the SEE results: one for the case attackers > protectors, the other for attackers <= protectors:

int aLookup[270], pLookup[270]; // reduce attacker/protector codes to (count, two lowest pieces)
unsigned char SEE[441][2];      // result for a capture to an empty square; [.][1] used when attackers <= protectors
int index = aLookup[attackers] + pLookup[protectors];
int see = value[victim] - SEE[index & 511][index >= 0];
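As a sketch of how the two 1-d reduction tables above might be filled (comboIndex and pieceCount are hypothetical helpers that map an attack code to its 0-20 class and to its total piece count):

extern int comboIndex(int code);  // 0..20: empty, 5 single attackers, 15 pairs of the two lowest
extern int pieceCount(int code);  // total number of pieces in the code

void initLookups(void)
{
    for (int code = 0; code < 270; code++) {
        int cls = comboIndex(code), cnt = pieceCount(code);
        aLookup[code] = cls    - (cnt << 9);  // negated attacker count above the low 9 bits
        pLookup[code] = 21*cls + (cnt << 9);  // protector count above the low 9 bits
    }
}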

Re: Fast SEE (Ed's lookup revisited)

Post by Rebel »

Some remarks: the two 256-byte tables were the best I could imagine on an 8-bit computer with only 8192 bytes of RAM. As a SEE it was more than good enough for move ordering, but the main use was for the evaluation, which got quite some information about each square, including X-ray.

Good luck with your idea.
90% of coding is debugging, the other 10% is writing bugs.

Re: Fast SEE (Ed's lookup revisited)

Post by hgm »

Thank you.

I am not sure we are talking about the same thing, as the Ed's Lookup described on the Chess Programming Wiki uses a 14 x 256 x 256 table, which is 896KB, about 100 times more than would fit in the 8KB you mention now. The 2-stage lookup I propose here would fit in that 8KB, though.

This morning I realized the table could be shrunk even further by packing the results in nibbles, as a SEE value can never be larger than 9. So we could do something like

see = (index >= 0 ? SEE[index & 511] & 15 : SEE[index & 511] >> 4); // low nibble when attackers <= protectors, high nibble otherwise
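One way to build that packed table (a sketch; seeGT and seeLE are hypothetical 441-entry tables holding the results for attackers > protectors and attackers <= protectors, each value 0-9):

unsigned char SEE[441];
extern unsigned char seeGT[441], seeLE[441];

void packSEE(void)
{
    for (int i = 0; i < 441; i++)
        SEE[i] = (seeGT[i] << 4) | seeLE[i];  // high nibble: attackers > protectors
}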

Re: Fast SEE (Ed's lookup revisited)

Post by Rebel »

To explain: on the 8-bit chip the tables were coded. Later, on the PC, the [14][256][256] table was created from that code. Unfortunately the square information was too slim for an incremental update; maybe you could try that with your approach.
90% of coding is debugging, the other 10% is writing bugs.