Stockfish version with hash saving capability

Discussion of anything and everything relating to chess playing software and machines.

Moderators: hgm, Rebel, chrisw

Rodolfo Leoni
Posts: 545
Joined: Tue Jun 06, 2017 4:49 pm
Location: Italy

Re: Stockfish version with hash saving capability

Post by Rodolfo Leoni »

peter wrote:The time for saving and loading is not the problem.
The problem is that you always have either the new OR the old hash; in principle it's the same issue as the overwriting by ongoing analysis, which you have to deal with anyhow.

Only being able to merge new and old hash would help with that.
When it takes 10 minutes to load one hash and to save it, and you have 4-5 positions (that means 4-5 hashes of 2 GB each), it becomes a problem... I usually load the last saved hash for an opponent, run an analysis (which will probably overwrite many things before that game ends), and when I decide on the move I save the hash.

I guess you mean hashes should be merged and pruned. If not, they'd become huge, at least for my measly 4 GB RAM PC. :)
F.S.I. Chess Teacher
peter
Posts: 3185
Joined: Sat Feb 16, 2008 7:38 am
Full name: Peter Martan

Re: Stockfish version with hash saving capability

Post by peter »

SF PA-GTB had the UCI command Merge Hash, but that was for its learning file.
At the moment, for all engines that have a Save Hash option, reloading means deleting the hash in use. Merging e.g. 8G of saved hash with 8G already in use shouldn't take longer than reloading 8G, or would it? Of course I don't really know, not being able to program that myself.
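
To make the idea concrete: one way to merge would be to stream the saved entries back in and, whenever a slot is already occupied, keep whichever entry was searched deeper. This is a rough sketch only, with a simplified entry layout and made-up names, not the actual Stockfish code:

Code: Select all

// Hypothetical sketch of merging a saved hash dump into the table in RAM.
// TTEntry and its fields are simplified stand-ins, not Stockfish's real layout.
#include <cstdint>
#include <cstdio>
#include <vector>

struct TTEntry {
    uint64_t key;    // full Zobrist key (a real table stores only part of it)
    int16_t  value;
    int16_t  depth;  // search depth at which this entry was stored
    uint16_t move;
};

// Keep whichever entry was searched deeper; ties favour the entry already in RAM.
void mergeSavedHash(std::vector<TTEntry>& table, const char* fileName)
{
    std::FILE* f = std::fopen(fileName, "rb");
    if (!f)
        return;

    TTEntry e;
    while (std::fread(&e, sizeof(e), 1, f) == 1)
    {
        TTEntry& slot = table[e.key % table.size()];
        if (e.depth > slot.depth)   // deeper search wins the slot
            slot = e;
    }
    std::fclose(f);
}

Since the file is read sequentially either way, such a merge should indeed cost about the same as a plain reload.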

Neither do I understand the point of loading .epd files into hash. If the positions haven't already been evaluated by the engine, what does the engine gain?

10 minutes to save 2G seems like a lot to me; I guess it's more like 10 seconds on my old 12-core Intel, and I don't think it's mainly a question of CPU.

The speed of the disk (SSD) and of the RAM might matter more.
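
For rough orientation, with assumed throughput figures rather than measurements:

Code: Select all

2 GB at ~200 MB/s (a modest SSD or a fast HDD)   ->  about 10 seconds
2 GB at ~3-4 MB/s (a slow or fragmented drive)   ->  about 10 minutes
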
Peter.
Rodolfo Leoni
Posts: 545
Joined: Tue Jun 06, 2017 4:49 pm
Location: Italy

Re: Stockfish version with hash saving capability

Post by Rodolfo Leoni »

peter wrote:SF PA-GTB had the UCI command Merge Hash, but that was for its learning file.
At the moment, for all engines that have a Save Hash option, reloading means deleting the hash in use. Merging e.g. 8G of saved hash with 8G already in use shouldn't take longer than reloading 8G, or would it? Of course I don't really know, not being able to program that myself.

Neither do I understand the point of loading .epd files into hash. If the positions haven't already been evaluated by the engine, what does the engine gain?

10 minutes to save 2G seems like a lot to me; I guess it's more like 10 seconds on my old 12-core Intel, and I don't think it's mainly a question of CPU.

The speed of the disk (SSD) and of the RAM might matter more.
SF PA-GTB had no problems with the PH file size. It is usually a few MB, and when merging with new positions its size increased. If you merge an 8 GB hash with an 8 GB file you get a 16 GB file, I guess. Unless you can somehow "prune" it. Then we have the same problem as losing some key variation scores during analysis: you have to cut something anyway. Shallower positions, I guess.

I think Dann asked for an import-EPD feature because he has a lot of positions already evaluated. Maybe he wants to run some minimaxing over them.

About the time it takes to load/save... I always worry the process might be incomplete, so I wait. I check whether there are disk accesses, and I'm never sure it's finished. Maybe a message like "hashes loaded/hashes saved" would be useful to be sure about it...
:)
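
On the engine side such a confirmation is cheap to emit through the UCI "info string" mechanism, which GUIs show as plain text. A hypothetical sketch, assuming a saveHashToFile() helper that is not part of the actual patch:

Code: Select all

// Hypothetical sketch: confirm via "info string" when the hash file has really
// been written, so the user no longer has to watch for disk activity.
#include <iostream>
#include <string>

// Stand-in for the engine's real save routine (assumed, not actual code).
bool saveHashToFile(const std::string& fileName)
{
    // ... write the transposition table to fileName ...
    return true;
}

void reportHashSave(const std::string& fileName)
{
    if (saveHashToFile(fileName))
        std::cout << "info string hashes saved to " << fileName << std::endl;
    else
        std::cout << "info string hash save failed" << std::endl;
}
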
F.S.I. Chess Teacher
retep1
Posts: 44
Joined: Sun Aug 07, 2016 5:24 pm

Re: Stockfish version with hash saving capability

Post by retep1 »

Maybe a message "hashes loaded/hashes saved" is useful to be sure about it.
That's exactly what I miss too
peter
Posts: 3185
Joined: Sat Feb 16, 2008 7:38 am
Full name: Peter Martan

Re: Stockfish version with hash saving capability

Post by peter »

Rodolfo Leoni wrote:If you merge an 8 GB hash with an 8 GB file you get a 16 GB file, I guess. Unless you can somehow "prune" it. Then we have the same problem as losing some key variation scores during analysis: you have to cut something anyway.
The engine will do that anyway as the search progresses, even with 32G, which is the amount of hash I use most of the time.

Of course, the bigger the hash, the more already-evaluated positions and moves can be taken into comparison.
And why not merge 1G of old with 1G of new hash, to keep saving times short?
Clear single best moves that are only found after some time of search would still be kept in the saved hash once found.

And merging still shouldn't take more time than reloading does; at least I'm still hoping so.
:)
Peter.
Rodolfo Leoni
Posts: 545
Joined: Tue Jun 06, 2017 4:49 pm
Location: Italy

Re: Stockfish version with hash saving capability

Post by Rodolfo Leoni »

peter wrote:
Rodolfo Leoni wrote:If you merge an 8 GB hash with an 8 GB file you get a 16 GB file, I guess. Unless you can somehow "prune" it. Then we have the same problem as losing some key variation scores during analysis: you have to cut something anyway.
The engine will do that anyway as the search progresses, even with 32G, which is the amount of hash I use most of the time.

Of course, the bigger the hash, the more already-evaluated positions and moves can be taken into comparison.
And why not merge 1G of old with 1G of new hash, to keep saving times short?
Clear single best moves that are only found after some time of search would still be kept in the saved hash once found.

And merging still shouldn't take more time than reloading does; at least I'm still hoping so.
:)
Yes, I guess it's an interesting option for those whose PC has a lot of RAM. I hope Daniel will consider it, and I hope all that stuff isn't taking up too much time.

Steps forward to a new SF PA format. :)
F.S.I. Chess Teacher
duncan
Posts: 12038
Joined: Mon Jul 07, 2008 10:50 pm

Re: Stockfish version with hash saving capability

Post by duncan »

zullil wrote:
duncan wrote:
zullil wrote: asmFish eventually found mate-in-11 from the original position. But it took a while.

Code: Select all

info depth 54 seldepth 24 multipv 1 time 5491130 nps 50956798 score mate 11 nodes 279810404462 hashfull 999 tbhits 0 pv a2a4 h7h5 d2d4 c7c6 e2e4 g7g6 d4d5 g6g5 d1h5 e8d8 h5f7 c6d5 f1b5 d5e4 f7d5 d8c8 d5d7 c8b8 b5a6 b8a8 d7b7

Is your nps really 50956798? That is very fast, quicker than a Threadripper. May I ask what system you are using?
Not sure what your first sentence is trying to ask. :wink: Yes, the nodes/sec is 50956798, which is indeed very fast. I don't know anything about Threadripper. I have 2 x Intel(R) Xeon(R) CPU E5-2687W v3 @ 3.10GHz and 64GB RAM. I tend to use asmFish because it is NUMA-aware, which seems to improve performance on my system.
The following position is a mate in 33 and is the longest mate in the king, bishop and knight versus king endgame.

It takes me a long time on my computer to find the mate in 33. How long do you estimate it should take on your computer? Just 10 minutes?

[d]K7/2kB4/8/8/8/8/8/5N2 w - - 0 1

http://www.gilith.com/chess/endgames/kbn_k.html
Rodolfo Leoni
Posts: 545
Joined: Tue Jun 06, 2017 4:49 pm
Location: Italy

Re: Stockfish version with hash saving capability

Post by Rodolfo Leoni »

cdani wrote:
Rodolfo Leoni wrote: Thanks! Read only entries?
I will try to add this when someone tells me that this version I have done is useful :-)
I'm trying to build some EPD positions, but I don't understand the format needed for importing them.

Could it be:

FEN ce <score> cd <depth> bm <best move> :?:
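
For reference, the EPD opcodes defined in the PGN/EPD standard are acd for the analysis depth, bm for the best move in SAN, and ce for the centipawn evaluation, each terminated by a semicolon, with the halfmove/fullmove counters dropped from the FEN part. Whether cdani's importer expects exactly this is not confirmed here, but a standard-looking record for the position tried below would be:

Code: Select all

r3kb1r/1bqn1p1p/p1Np1P2/3P4/1p6/5Q2/PPP4P/2KR1BR1 w kq - acd 40; bm Bh3; ce 200;
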
F.S.I. Chess Teacher
Rodolfo Leoni
Posts: 545
Joined: Tue Jun 06, 2017 4:49 pm
Location: Italy

Re: Stockfish version with hash saving capability

Post by Rodolfo Leoni »

I tried to import this:

r3kb1r/1bqn1p1p/p1Np1P2/3P4/1p6/5Q2/PPP4P/2KR1BR1 w kq - 3 20 ce 200 cd 40 bm f1h3

but the engine hangs.
F.S.I. Chess Teacher
cdani
Posts: 2204
Joined: Sat Jan 18, 2014 10:24 am
Location: Andorra

Re: Stockfish version with hash saving capability

Post by cdani »

Here you have an example of an epd file:
http://talkchess.com/forum/viewtopic.ph ... 76&t=64914

Merging hashes is an esoteric concept, as the hash has no full info about the positions, i.e. you don't know which positions are stored. You only know during search that the keys match.
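
To make that concrete: a table entry typically stores only a small slice of the position's 64-bit Zobrist key plus score, depth and move, so the position itself cannot be reconstructed from the entry. A simplified illustration, not the actual Stockfish entry layout:

Code: Select all

// Simplified illustration (not Stockfish's real layout): the entry keeps only a
// 16-bit slice of the 64-bit Zobrist key, so the stored position is unrecoverable;
// during search you can only check that the probing position's key slice matches.
#include <cstdint>

struct TTEntry {
    uint16_t key16;   // upper 16 bits of the Zobrist key -- not the position
    int16_t  value;
    int16_t  depth;
    uint16_t move;
};

bool probe(const TTEntry& e, uint64_t zobristKey)
{
    return e.key16 == uint16_t(zobristKey >> 48);   // a key match, nothing more
}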