Stockfish NN release (NNUE)

Steppenwolf
Posts: 75
Joined: Thu Jan 31, 2019 4:54 pm
Full name: Sven Steppenwolf

Re: Stockfish NN release (NNUE)

Post by Steppenwolf »

Is there any macOS binary version available?

Under Parallels/Win10 with the Fritz 17 GUI, the engine (stockfish.nnue.halfkp_256x2-32-32.exe) is not working.
Thanks!
cdani
Posts: 2204
Joined: Sat Jan 18, 2014 10:24 am
Location: Andorra

Re: Stockfish NN release (NNUE)

Post by cdani »

Dariusz Orzechowski wrote: Wed Jun 17, 2020 2:04 am Some more results with two nets from this thread against the latest sf dev. 1 thread, tc 1m+1s, the same 100 short normal openings for both matches. Bigger net is more solid.

Code: Select all

sf dev 150620 - sf nnue halfkp256 : 116,5/200 +57-24=119
sf dev 150620 - sf nnue halfkp384 : 109,5/200 +41-22=137
I also did a test between the two nets; the 256 net seems to be a bit stronger. Time control 1m + 0.2s, using a big database of varied openings of different lengths.

Code: Select all

 # PLAYER          : RATING  ERROR   POINTS  PLAYED    (%)
   1 stnnue256     : 2860.3    6.6    649.0    1267   51.2%
   2 stnnue384     : 2851.7    6.6    618.0    1267   48.8%
Jesse Gersenson
Posts: 593
Joined: Sat Aug 20, 2011 9:43 am

Re: Stockfish NN release (NNUE)

Post by Jesse Gersenson »

To run it on Linux, I made a Dockerfile:
https://github.com/jessegersensonchess/ ... nue-docker
Jesse Gersenson
Posts: 593
Joined: Sat Aug 20, 2011 9:43 am

Re: Stockfish NN release (NNUE)

Post by Jesse Gersenson »

How can I confirm it's using the neural net?

I tried these two sets of commands

Code: Select all

setoption name Threads value 80
setoption name Hash value 8192
go depth 28

info depth 28 seldepth 43 multipv 1 score cp 49 nodes 597386734 nps 49143364 hashfull 456 tbhits 0 time 12156 pv e2e4 e7e6 g1f3 d7d5 e4d5 e6d5 d2d4 f8d6 c2c4 g8f6 b1c3 e8g8 c4c5 d6e7 f1d3 b7b6 c5b6 a7b6 e1g1 c7c5 f3e5 c8b7 d3b5 c5d4 c3e2 e7d6 e5f3 d4d3 e2d4 b8a6 b5d3 a6b4 d3b5 a8a2 a1a2 b4a2 c1e3 a2b4
quit

Code: Select all

setoption name EvalDir value /data/stockfish_nn.bin
setoption name Threads value 80
setoption name Hash value 8192
go depth 28

info depth 28 seldepth 46 multipv 1 score cp 43 nodes 730138980 nps 52486448 hashfull 515 tbhits 0 time 13911 pv d2d4 g8f6 c2c4 e7e6 g1f3 d7d5 b1c3 c7c6 e2e3 b8d7 d1c2 f8d6 f1d3 e8g8 e1g1 f8e8 b2b3 h7h6 c1b2 d8e7 a1e1 e6e5 d4e5 d7e5 f3e5 d6e5 c4d5 c6d5 c3b5 c8g4 b2e5 e7e5 c2c5 e5e7 c5c7 a8c8 c7e7 e8e7 h2h3 a7a6 h3g4 a6b5 d3b5 f6g4
bestmove d2d4 ponder g8f6
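
For what it's worth, one way to script that comparison is a small Python wrapper that drives the binary over UCI and pulls the nps out of the last info line, once with EvalDir set and once without. A rough sketch only: the binary path, thread/hash settings and depth are placeholders, and it assumes your build exposes the EvalDir option used in the commands above.

Code: Select all

# Sketch: run the same fixed-depth search with and without EvalDir set and compare nps.
# Assumes ./stockfish is the NNUE build and that it exposes the EvalDir option
# used in the commands above; paths, depth and settings are placeholders.
import re
import subprocess

def search_nps(options, depth=20, engine="./stockfish"):
    proc = subprocess.Popen(engine, stdin=subprocess.PIPE,
                            stdout=subprocess.PIPE, text=True)
    cmds = ["uci"]
    cmds += [f"setoption name {name} value {value}" for name, value in options.items()]
    cmds += ["isready", f"go depth {depth}"]
    proc.stdin.write("\n".join(cmds) + "\n")
    proc.stdin.flush()
    nps = None
    for line in proc.stdout:            # read until the search finishes
        m = re.search(r"\bnps (\d+)", line)
        if m:
            nps = int(m.group(1))       # keep the nps from the latest info line
        if line.startswith("bestmove"):
            break
    proc.stdin.write("quit\n")
    proc.stdin.flush()
    proc.wait()
    return nps

plain = search_nps({"Threads": 1, "Hash": 1024})
nnue = search_nps({"Threads": 1, "Hash": 1024,
                   "EvalDir": "/data/stockfish_nn.bin"})
print(f"default eval: {plain} nps, with EvalDir set: {nnue} nps")
if plain and nnue:
    print(f"ratio: {nnue / plain:.2f}")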
Raphexon
Posts: 476
Joined: Sun Mar 17, 2019 12:00 pm
Full name: Henk Drost

Re: Stockfish NN release (NNUE)

Post by Raphexon »

Jesse Gersenson wrote: Sat Jun 20, 2020 10:11 am How can I confirm it's using the neural net?
"nps 49143364"
"nps 52486448"

What's your normal nps? The NN should be around 40-50% slower than regular Stockfish.
Also what binary are you using?
kranium
Posts: 2129
Joined: Thu May 29, 2008 10:43 am

Re: Stockfish NN release (NNUE)

Post by kranium »

Jesse Gersenson wrote: Sat Jun 20, 2020 10:11 am How can I confirm it's using the neural net?
Hi Jesse-

If you want, you can download the exe I created a couple days ago from here:
https://github.com/FireFather/Stockfish-nnue

It outputs a UCI info message indicating whether the net was loaded:
"info string NNUE nn.bin found & loaded..."

Regards-
Norm
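
If you are scripting it, that message is also easy to check for: start the engine, send a couple of UCI commands and look for the info string in the output. A minimal sketch, assuming the binary name and that the message appears during startup/initialisation (adjust both to your build):

Code: Select all

# Sketch: check the engine output for the NNUE "loaded" info string described above.
# The binary name and the exact message text are assumptions; adjust to your build.
import subprocess

result = subprocess.run("./stockfish-nnue",
                        input="uci\nisready\nquit\n",
                        capture_output=True, text=True)
hits = [line for line in result.stdout.splitlines() if "info string NNUE" in line]
print("\n".join(hits) if hits else "no NNUE info string seen")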
cdani
Posts: 2204
Joined: Sat Jan 18, 2014 10:24 am
Location: Andorra

Re: Stockfish NN release (NNUE)

Post by cdani »

Can I take a net and improve it with a new "training phase" using newly generated training data? I have the evalsave folder from the first training.
Thanks.
Jesse Gersenson
Posts: 593
Joined: Sat Aug 20, 2011 9:43 am

Re: Stockfish NN release (NNUE)

Post by Jesse Gersenson »

Raphexon wrote: Sat Jun 20, 2020 10:45 am
"nps 49143364"
"nps 52486448"

What's your normal nps? The NN should be around 40-50% slower than regular Stockfish.
Also what binary are you using?
I'm using this binary: "/Stockfish/src/stockfish"

I included two outputs: one with the NN loaded, the other without. The rates were 49143364 and 52486448 nps, which is not the 40-50% difference you describe.
cdani
Posts: 2204
Joined: Sat Jan 18, 2014 10:24 am
Location: Andorra

Re: Stockfish NN release (NNUE)

Post by cdani »

Jesse Gersenson wrote: Sat Jun 20, 2020 3:35 pm
I'm using this binary: "/Stockfish/src/stockfish"

I included two outputs: one with the NN loaded, the other without. The rates were 49143364 and 52486448 nps, which is not the 40-50% difference you describe.
The 40-50% figure is for one thread. I suggest you try it that way; it's possible that it behaves differently with that many threads.
kranium
Posts: 2129
Joined: Thu May 29, 2008 10:43 am

Re: Stockfish NN release (NNUE)

Post by kranium »

cdani wrote: Sat Jun 20, 2020 3:19 pm Can I take a net and improve it with a new "training phase" using newly generated training data? I have the evalsave folder from the first training.
Thanks.
Yes, according to nodchip's README.txt:

"We could repeat the "training data generation phase" and "training phase" again and again with the output NN evaluation functions in the previous iteration.
This is a kind of reinforcement learning.
After the first iteration, please use "stockfish.nnue-learn-use-blas.k-p_256x2-32-32.exe" to generate training data so that we use the output NN parameters in the previous iteration.
Also, please set "SkipLoadingEval" to false in the training phase so that the trainer loads the NN parameters in the previous iteration."

So make sure to load the previous nn.bin beforehand.
On my i9-9900K with 16 GB of hash and 8 threads, it (halfkp) took about 14 hours to generate the training data, and another hour or so to generate the validation data.
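
To sketch cdani's case concretely: the loop the README describes can be wrapped in a small script that, for each iteration, generates data with the previous net loaded and then trains with SkipLoadingEval set to false so the trainer starts from that net. This is only an outline under assumptions: the engine path, directory layout and the gensfen/learn parameters below are placeholders from memory of nodchip's README, so take the exact names and options from the README itself.

Code: Select all

# Sketch of the generate-then-retrain loop described in nodchip's README:
# each iteration generates training data with the previous net loaded, then
# trains with SkipLoadingEval false so the trainer starts from that net.
# Engine path, directory names and the gensfen/learn parameters are
# placeholders/assumptions -- take the real ones from the README.
import subprocess

ENGINE = "./stockfish-nnue-learn"   # placeholder for the learn-enabled build
THREADS = 8

def run(commands):
    # Feed a command sequence to the engine and wait for it to finish.
    subprocess.run(ENGINE, input="\n".join(commands) + "\nquit\n",
                   text=True, check=True)

prev_net_dir = "evalsave"           # folder holding the nn.bin from the first training

for it in range(1, 4):              # however many extra iterations you want
    # 1) training data generation, with the previous net loaded
    run([
        "uci",
        f"setoption name Threads value {THREADS}",
        f"setoption name EvalDir value {prev_net_dir}",
        "isready",
        f"gensfen depth 8 loop 10000000 output_file_name train_iter{it}.bin",  # placeholder parameters
    ])
    # 2) training phase, starting from the previous net (SkipLoadingEval false)
    run([
        "uci",
        f"setoption name Threads value {THREADS}",
        "setoption name SkipLoadingEval value false",
        f"setoption name EvalDir value {prev_net_dir}",
        "isready",
        f"learn train_iter{it}.bin",   # placeholder: add the learn options from the README
    ])
    # this sketch assumes the trainer writes the new net back under evalsave/;
    # point prev_net_dir at wherever your build actually saved it
    prev_net_dir = "evalsave"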
Last edited by kranium on Sat Jun 20, 2020 6:48 pm, edited 2 times in total.