I get a crash when I try to use 4096M of hash. I have sufficient RAM, and can use 4096M with other engines. In practice I would never use this much, but I want to report the issue nonetheless. There is no problem with 2048M.
Code:
LZsMacPro-OSX6: ~/Documents/Chess/Spark/spark-1.0] ./spark-1.0-osx-mp
spark-1.0
(c) 2009-2010 AJ Siemelink
inifile=spark.ini
book=spark.bbk, 4651 moves
book.file=spark.bbk
allocating hash...allocated 32Mb
starting threads...started 16 threads
initialized
% uci
id name spark-1.0
id author AJ Siemelink
option name UCI_EngineAbout type string default spark-1.0 by AJ Siemelink
option name UCI_Opponent type string
option name UCI_AnalyseMode type check default false
option name Threads type spin default 16 min 1 max 16
option name Hash type spin default 32 min 1 max 4096
option name Ponder type check default true
option name MultiPV type spin default 1 min 1 max 256
option name MultiPVMargin type spin default 100 min 0 max 2000
option name OwnBook type check default true
option name eval.mobility.opening type spin default 100 min 0 max 1000
option name eval.mobility.endgame type spin default 100 min 0 max 1000
option name eval.passedpawns.opening type spin default 100 min 0 max 1000
option name eval.passedpawns.endgame type spin default 100 min 0 max 1000
option name eval.pawnstructure.opening type spin default 100 min 0 max 1000
option name eval.pawnstructure.endgame type spin default 100 min 0 max 1000
option name eval.kingattack type spin default 100 min 0 max 1000
uciok
setoption name Hash value 4096
isready
readyok
position fen rn2r1k1/ppq1pp1p/2b2bp1/8/2BNPP1B/2P4P/P1Q3P1/1R3RK1 w - - 1 18
go
info depth 1 currmovenumber 1 currmove a2a4 time 0 nodes 0 hashfull 0
Segmentation fault