Best practices for transposition tables

lojic
Posts: 71
Joined: Thu Jan 28, 2021 9:50 pm
Full name: Brian Adkins

Best practices for transposition tables

Post by lojic »

I just implemented a simple piece square table, and that seems to have improved the play. I think I'd like to implement a transposition table next. Is there a good guide for implementing a TT? I did search the forum and found some info, but nothing I would consider complete or authoritative.

I'm thinking I should probably change how I handle castling rights in preparation for the TT. As I mentioned earlier, I don't keep castling rights in the game state. Instead, I keep the king positions in the game state and a "has moved" bit in the pieces; to determine castling rights, I check that the king has not moved and that the rooks are on their original squares and have not moved. However, it seems that handling castling rights the traditional way may make the TT easier to implement.

I don't think I need the "has moved" bit at all. Checking the rank is sufficient for the double pawn push, and if I keep castling rights in the game state, there's no reason to keep and update the "has moved" bit.
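For concreteness, the "traditional" representation I'm picturing would be something like this (just a sketch, all the names are made up):

/* Sketch of keeping castling rights in the game state instead of
   per-piece "has moved" bits; all names here are made up.            */
enum {
    WHITE_KINGSIDE  = 1,
    WHITE_QUEENSIDE = 2,
    BLACK_KINGSIDE  = 4,
    BLACK_QUEENSIDE = 8
};

typedef struct {
    unsigned char castling;  /* bitwise OR of the four rights above   */
    int           ep_square; /* en-passant target square, or -1       */
    /* ... pieces, side to move, etc. ...                             */
} GameState;

/* When a king or rook moves, or a rook is captured, the affected
   rights are cleared; this replaces the per-piece "has moved" bit.   */
void clear_rights(GameState *gs, unsigned char mask)
{
    gs->castling &= (unsigned char)~mask;
}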

CPW mentions both Zobrist hashing and BCH hashing. Is one clearly better?

I would expect to be able to find more articles and/or blog posts for the key topics. I've found CPW helpful, but it often doesn't go deep enough on a topic for me. If there is a good tutorial for transposition tables, I'd love to read/see it.
hgm
Posts: 27796
Joined: Fri Mar 10, 2006 10:06 am
Location: Amsterdam
Full name: H G Muller

Re: Best practices for transposition tables

Post by hgm »

I think Zobrist hashing is used almost universally; I had not even heard of the other one. Putting a 'has-moved' bit in the pieces basically makes the virgin and non-virgin pieces different piece types, and if your Zobrist hashing treats them as such, it would also automatically account for the castling rights. If the virgin and non-virgin pieces use the same key, however, you would be 'castling blind' unless you explicitly deduce the castling rights from the virgin bits and use those to alter the key somehow. This is cumbersome when the rights are distributed over several pieces. In micro-Max I can get away with being castling blind, because castlings can never become hash moves for other reasons anyway. (Which then causes a performance hit that was affordable considering the code reduction.)

So it is better to keep track of all the rights in a single variable, and make the key dependent on that. (Even if this would be as simple as just XORing that variable with the key.)
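Schematically, e.g. with a small table of keys indexed by the rights (a sketch only; the names are arbitrary, and the zob_* tables are assumed to be filled with random numbers at startup):

/* Sketch of a Zobrist hash that folds the castling rights into the
   key; all names are illustrative.                                   */
#include <stdint.h>

#define EMPTY 12              /* piece codes 0..11, 12 = empty square */

uint64_t zob_piece[12][64];   /* one key per piece type per square    */
uint64_t zob_castle[16];      /* one key per castling-rights state    */
uint64_t zob_ep[8];           /* one key per en-passant file          */
uint64_t zob_black;           /* black to move                        */

uint64_t hash_position(const int board[64], int castling,
                       int ep_square, int black_to_move)
{
    uint64_t key = 0;
    for (int sq = 0; sq < 64; sq++)
        if (board[sq] != EMPTY)
            key ^= zob_piece[board[sq]][sq];
    key ^= zob_castle[castling & 15];  /* rights are simply part of the key */
    if (ep_square >= 0)
        key ^= zob_ep[ep_square & 7];
    if (black_to_move)
        key ^= zob_black;
    return key;
}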

You should realize that using a very simple TT (e.g. the always-replace TT that micro-Max uses) and using a very complex one will make only very little difference on the strength of the engine. The next step up from 'always-replace' would be 'shallowest-of-2' replacement, where you group the entries in pairs and always overwrite the less deep entry of the pair (unless the position was already in there, in which case you just update that entry irrespective of its depth). But you would have to combine that with aging somehow, otherwise the part of the TT holding the deep entries will hang on to entries that have become useless. In KingSlayer I combine 'shallowest-of-2' with 'undercut' replacement to prevent the accumulation of ever-higher depths (replacing the deepest entry of the pair when the new one has a depth of exactly one less). This already works very well, and is quite simple.

Storing a key signature of 4 bytes in the entry is more than enough, provided these are the key bits that were not used to derive the index.
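A sketch of the whole scheme (entry layout, sizes and names are illustrative, not literal KingSlayer code):

/* 2-entry bucket with 'shallowest-of-2' plus 'undercut' replacement;
   the 32-bit signature holds key bits not used for the bucket index. */
#include <stdint.h>

typedef struct {
    uint32_t signature;  /* high key bits, for recognizing the position */
    int16_t  score;
    uint8_t  depth;
    uint8_t  flags;      /* bound type; best move omitted for brevity   */
} TTEntry;

#define TT_BUCKETS (1u << 20)           /* number of 2-entry buckets    */
static TTEntry tt[TT_BUCKETS][2];

void tt_store(uint64_t key, int depth, int score, uint8_t flags)
{
    TTEntry *pair = tt[key & (TT_BUCKETS - 1)];   /* low bits -> index  */
    uint32_t sig  = (uint32_t)(key >> 32);        /* high bits -> sig   */
    TTEntry *slot;

    if (pair[0].signature == sig)      slot = &pair[0];  /* position    */
    else if (pair[1].signature == sig) slot = &pair[1];  /* already in  */
    else {
        TTEntry *shallow = pair[0].depth <= pair[1].depth ? &pair[0] : &pair[1];
        TTEntry *deep    = pair[0].depth <= pair[1].depth ? &pair[1] : &pair[0];
        /* Normally overwrite the shallower entry, but 'undercut' the
           deeper one when the new depth is exactly one less, so that
           high-depth entries cannot accumulate forever.                */
        slot = (depth == deep->depth - 1) ? deep : shallow;
    }
    slot->signature = sig;
    slot->depth     = (uint8_t)depth;
    slot->score     = (int16_t)score;
    slot->flags     = flags;
}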
Nomis
Posts: 18
Joined: Sun Jan 03, 2021 1:19 pm
Full name: Simon Reed

Re: Best practices for transposition tables

Post by Nomis »

Hello,

I remember this from the Mediocre Chess blog: http://mediocrechess.blogspot.com/2007/ ... ables.html
lojic
Posts: 71
Joined: Thu Jan 28, 2021 9:50 pm
Full name: Brian Adkins

Re: Best practices for transposition tables

Post by lojic »

hgm wrote: Sat Feb 06, 2021 7:22 pm I think Zobrist hashing is used almost universally; ...
Excellent, I'll go with Zobrist.
hgm wrote: Sat Feb 06, 2021 7:22 pm ...
You should realize that using a very simple TT (e.g. the always-replace TT that micro-Max uses) and using a very complex one will make only very little difference on the strength of the engine.
If I'm understanding you correctly, most of the value comes from simply having a TT, and getting "fancy" with the TT will only give marginal gains.
hgm
Posts: 27796
Joined: Fri Mar 10, 2006 10:06 am
Location: Amsterdam
Full name: H G Muller

Re: Best practices for transposition tables

Post by hgm »

Exactly. I suppose this is partly because memory is not a serious limitation nowadays, so that what we now consider a very small table can still hold the entire search tree with ease. Then it hardly matters what replacement scheme you use, as there is no need to replace anything. Matters might be different if you had to live with a table that can hold only, say, 0.001% of the tree.

An important minimum requirement is that new positions always go into the table, no matter how precious the entry that would have to be overwritten for this seems. 'Recent' is always a more important quality than 'deep'.

The TT causes some speedup during the middle-game, but that only gives a minor Elo increase. The most important effect on game results is in the late end-game, in particular in Pawn endings, where the tree consists almost entirely of transpositions. For a King on an empty board the tree would contain 8^N nodes after N moves (well, 6.6^N if you account for the effects of the board edge), but even in the long run there can never be more than 64 squares where the King can be. This basically means the engine can suddenly search arbitrarily deep in Pawn endings, where that is often exactly what is needed to see who promotes first.
Sven
Posts: 4052
Joined: Thu May 15, 2008 9:57 pm
Location: Berlin, Germany
Full name: Sven Schüle

Re: Best practices for transposition tables

Post by Sven »

I think in middlegame positions the biggest gain you get from a TT is the move ordering improvement. The saving through transpositions is not the most important part there.
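Schematically (a simplified sketch; all names are only for illustration):

/* The main middlegame benefit of the TT is trying the stored best
   move first. Move the hash move (if present in this node's list) to
   the front; captures, killers, history etc. would order the rest.   */
typedef struct { int from, to; } Move;

int put_hash_move_first(Move *moves, int count, Move hash_move)
{
    for (int i = 0; i < count; i++) {
        if (moves[i].from == hash_move.from && moves[i].to == hash_move.to) {
            Move tmp = moves[0];
            moves[0] = moves[i];
            moves[i] = tmp;
            return 1;    /* hash move found and will be searched first */
        }
    }
    return 0;            /* no hash move available in this node        */
}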
Sven Schüle (engine author: Jumbo, KnockOut, Surprise)
hgm
Posts: 27796
Joined: Fri Mar 10, 2006 10:06 am
Location: Amsterdam
Full name: H G Muller

Re: Best practices for transposition tables

Post by hgm »

Sure, I never said anything about what would cause the speedup. But not having a TT move means you would have to get it through IDD. That obviously takes some extra time, but not a huge amount, as the pilot search takes only a small fraction of the time needed for the full-depth search.
Sven
Posts: 4052
Joined: Thu May 15, 2008 9:57 pm
Location: Berlin, Germany
Full name: Sven Schüle

Re: Best practices for transposition tables

Post by Sven »

hgm wrote: Sun Feb 07, 2021 7:45 pm IDD
I guess you mean IID 😉
Sven Schüle (engine author: Jumbo, KnockOut, Surprise)
hgm
Posts: 27796
Joined: Fri Mar 10, 2006 10:06 am
Location: Amsterdam
Full name: H G Muller

Re: Best practices for transposition tables

Post by hgm »

Ah yes, of course :oops: For the uninitiated: Internal Iterative Deepening.
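The idea, as a sketch (names are illustrative, not literal code from any engine):

/* Internal iterative deepening: when the TT provides no move for this
   node, run a shallow 'pilot' search first so that it leaves one
   behind. The helpers below are assumed to exist elsewhere.           */
typedef struct Position Position;          /* the engine's position    */
int search(Position *pos, int alpha, int beta, int depth);
int tt_probe_move(const Position *pos);    /* 0 = no move in the TT    */

int move_to_try_first(Position *pos, int alpha, int beta, int depth)
{
    int hash_move = tt_probe_move(pos);

    if (hash_move == 0 && depth >= 4) {
        /* The pilot search is a few plies shallower, so it costs only
           a small fraction of the full-depth search that follows, and
           it fills the TT with a best move to try first.              */
        search(pos, alpha, beta, depth - 2);
        hash_move = tt_probe_move(pos);
    }
    return hash_move;
}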
mvanthoor
Posts: 1784
Joined: Wed Jul 03, 2019 4:42 pm
Location: Netherlands
Full name: Marcel Vanthoor

Re: Best practices for transposition tables

Post by mvanthoor »

Sven wrote: Sun Feb 07, 2021 5:17 pm I think in middlegame positions the biggest gain you get from a TT is the move ordering improvement. The saving through transpositions is not the most important part there.
+1

I've just re-instated my old transposition table implementation from a year ago, which was written specifically for perft. It only stores leaf node counts (plus the Zobrist key and depth), so I'll have to change the data structures to hold the information needed to use the table for playing games, but it's a (working) start.
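Roughly, the difference in what has to be stored looks like this (the real code is in Rust; this is just an illustrative C-style sketch with made-up field names):

/* A perft-only entry versus what a search entry needs to store.      */
#include <stdint.h>

typedef struct {             /* perft: only the verification data     */
    uint64_t zobrist;
    uint64_t leaf_nodes;     /* node count below this position        */
    uint8_t  depth;
} PerftEntry;

typedef struct {             /* search: enough to reuse the result    */
    uint64_t zobrist;
    int16_t  score;
    uint8_t  depth;
    uint8_t  bound;          /* exact, lower bound, or upper bound    */
    uint16_t best_move;      /* for move ordering                     */
} SearchEntry;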

Even though perft speeds up by 70-80%, even with a hash table of only 32 MB, you only get 1-2 moves deeper in most positions. Node counts grow so fast that using 25% instead of 100% of a massive amount of time is still a massive amount of time.

The better move ordering and the resulting reduction of the tree size is indeed what is going to gain the most speed; but on top of that, the TT will probably still add 1-2 plies. I'm curious to see how much deeper the program can search when adding a TT and using it to sort the best move first, on top of an otherwise very simple search function.
Author of Rustic, an engine written in Rust.
Releases | Code | Docs | Progress | CCRL