Spark-1.0-osx-mp Segmentation Fault

Discussion of anything and everything relating to chess playing software and machines.

Moderators: hgm, Rebel, chrisw

zullil
Posts: 6442
Joined: Tue Jan 09, 2007 12:31 am
Location: PA USA
Full name: Louis Zulli

Spark-1.0-osx-mp Segmentation Fault

Post by zullil »

First, thanks for the engine, and for distributing an OS X binary.

I get a crash when I try to use 4096M of hash. I have sufficient RAM, and can use 4096M with other engines. In practice I would never use this much, but I want to report the issue nonetheless. There is no problem with 2048M.

Code:

LZsMacPro-OSX6: ~/Documents/Chess/Spark/spark-1.0] ./spark-1.0-osx-mp 
spark-1.0
(c) 2009-2010 AJ Siemelink

inifile=spark.ini

book=spark.bbk, 4651 moves
book.file=spark.bbk
allocating hash...allocated 32Mb
starting threads...started 16 threads

initialized
% uci

id name spark-1.0
id author AJ Siemelink
option name UCI_EngineAbout type string default spark-1.0 by AJ Siemelink
option name UCI_Opponent    type string
option name UCI_AnalyseMode type check default false
option name Threads         type spin   default 16 min 1 max 16
option name Hash            type spin   default 32 min 1 max 4096
option name Ponder          type check  default true
option name MultiPV         type spin   default 1 min 1 max 256
option name MultiPVMargin   type spin   default 100 min 0 max 2000
option name OwnBook         type check  default true
option name eval.mobility.opening      type spin default 100 min 0 max 1000
option name eval.mobility.endgame      type spin default 100 min 0 max 1000
option name eval.passedpawns.opening   type spin default 100 min 0 max 1000
option name eval.passedpawns.endgame   type spin default 100 min 0 max 1000
option name eval.pawnstructure.opening type spin default 100 min 0 max 1000
option name eval.pawnstructure.endgame type spin default 100 min 0 max 1000
option name eval.kingattack            type spin default 100 min 0 max 1000
uciok
setoption name Hash value 4096
isready
readyok
position fen rn2r1k1/ppq1pp1p/2b2bp1/8/2BNPP1B/2P4P/P1Q3P1/1R3RK1 w - - 1 18
go
info depth 1 currmovenumber 1 currmove a2a4 time 0 nodes 0 hashfull 0
Segmentation fault
Allard Siemelink
Posts: 297
Joined: Fri Jun 30, 2006 9:30 pm
Location: Netherlands

Re: Spark-1.0-osx-mp Segmentation Fault when hash>=4096Mb

Post by Allard Siemelink »

Thanks for reporting; I'll get it fixed in the next version.

Re: Spark-1.0-osx-mp Segmentation Fault when hash>=4096Mb

Post by zullil »

The same thing happens with the OS X binary of Spark-0.4, by the way.

Re: Spark-1.0-osx-mp Segmentation Fault when hash>=4096Mb

Post by Allard Siemelink »

zullil wrote:The same thing happens with the OS X binary of Spark-0.4, by the way.
Yes, it is actually a known bug (see history.txt), apparently on all OSes.
Unfortunately, it slipped my mind to fix it before building the releases.