hammerklavier wrote: Thanks Mark! Murray Campbell reported to me that each SP had 1 GB of RAM (a lot for its day). There were 32 nodes, so a total of 32 GB.
Yes, it was pretty awesome for its day. I believe each SP node only had access to its own 1 GB; communication overhead between the "nodes" would probably have taken too much time.
32 GB is really fantastic! In 1998 the most common configuration was 64 MB of RAM on a Pentium MMX/Pentium II.
Hash Tables Deep Blue
Moderators: hgm, Rebel, chrisw
-
- Posts: 1494
- Joined: Thu Mar 30, 2006 2:08 pm
Re: Hash Tables Deep Blue
-
- Posts: 6052
- Joined: Tue Jun 12, 2012 12:41 pm
Re: Hash Tables Deep Blue
hammerklavier wrote: How much hash table memory did Deep Blue use?
mjlef wrote: I sent your question on to Murray Campbell. While we wait: I know that part of the search in Deep Blue was done in software, and the software had access to hash tables. At the end nodes of the software search, a hardware search would be handed off to the special chips. The hardware part of the search did not have hash tables:
http://ac.els-cdn.com/S0004370201001291 ... 47f33639d8
CheckersGuy wrote: I really wonder which techniques the top engines like Stockfish/Komodo use that weren't used by Deep Blue. Did DB use null move/LMR/ProbCut? Would be really interesting to know.
I do not know the particulars of the search, but while browsing the net I found a page with Deep Blue's evaluation, and next to SF's (and presumably Komodo's), Deep Blue's evaluation is fully basic.
Not much to gain from hardware these days.
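To make concrete what those software hash tables buy you, here is a hypothetical Python sketch of a transposition-table probe and store. The names and layout are illustrative, not DB's design: a stored result is trusted only if it was searched at least as deep as we now need, and a bounded score is used only if it still causes a cutoff.

```python
# Bound types: did the stored search return an exact score, a lower
# bound (fail high), or an upper bound (fail low)?
EXACT, LOWER, UPPER = 0, 1, 2

table = {}  # position key -> (depth, score, bound)

def tt_store(pos, depth, score, bound):
    table[pos] = (depth, score, bound)

def tt_probe(pos, depth, alpha, beta):
    """Return a usable score for this (pos, depth, window), else None."""
    entry = table.get(pos)
    if entry is None:
        return None
    stored_depth, score, bound = entry
    if stored_depth < depth:
        return None  # stored search was too shallow to trust here
    if bound == EXACT:
        return score
    if bound == LOWER and score >= beta:
        return score  # still fails high: cutoff
    if bound == UPPER and score <= alpha:
        return score  # still fails low
    return None
```

Real engines key entries by a 64-bit Zobrist hash and also store the best move, which then leads the move ordering; that "hash move" is exactly what the DB chips lacked.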
-
- Posts: 5557
- Joined: Tue Feb 28, 2012 11:56 pm
Re: Hash Tables Deep Blue
CheckersGuy wrote: I really wonder which techniques the top engines like Stockfish/Komodo use that weren't used by Deep Blue. Did DB use null move/LMR/ProbCut? Would be really interesting to know. Razz
Cardoso wrote: DB didn't use null move, and probably didn't use LMR or ProbCut.
DB's pruning techniques were less intensive than the ones we have today.
In a speech Murray said DB's branching factor was 3 or 4, so as you can see there was not much intense pruning going on in DB's search.
Of course DB would have been much stronger if they had only added null-move pruning.
DB did not prune beyond alpha-beta, and that was a conscious choice. They were afraid of losing a game to Fritz or the like due to a low-depth oversight.
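The null-move pruning mentioned above fits in a few lines. Below is a hypothetical Python sketch, not DB's or any engine's code: to keep it self-contained, the "game" is a toy take-away game (remove 1 or 2 stones; taking the last stone wins) standing in for chess, and the names (`negamax`, `evaluate`, `R`) are my own.

```python
import math

R = 2  # null-move depth reduction

def evaluate(pile):
    # Toy static eval from the side to move's view: in this take-away
    # game, facing a pile that is not a multiple of 3 is winning.
    return 100 if pile % 3 != 0 else -100

def negamax(pile, depth, alpha, beta, allow_null=True):
    if pile == 0:
        return -1000  # opponent took the last stone: side to move lost
    if depth == 0:
        return evaluate(pile)
    if allow_null and depth > R + 1:
        # Null move: pass the turn. If a search reduced by R plies says
        # the position is still >= beta, assume a real move is too, and cut.
        score = -negamax(pile, depth - 1 - R, -beta, -beta + 1, False)
        if score >= beta:
            return beta
    best = -math.inf
    for take in (1, 2):
        if take <= pile:
            score = -negamax(pile - take, depth - 1, -beta, -alpha, True)
            best = max(best, score)
            alpha = max(alpha, score)
            if alpha >= beta:
                break  # ordinary alpha-beta cutoff
    return best
```

The gamble: if even giving the opponent a free move cannot drop us below beta, no legal move is likely to either. Real engines disable it in zugzwang-prone positions (e.g. pawnless endgames), where passing is illegal but would actually be best; this toy ignores that caveat.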
-
- Posts: 2016
- Joined: Sun Feb 17, 2008 4:19 pm
Re: Hash Tables Deep Blue
Lyudmil Tsvetkov wrote: I do not know the particulars of the search, but while browsing the net I found a page with Deep Blue's evaluation, and next to SF's (and presumably Komodo's), Deep Blue's evaluation is fully basic.
Basic enough to make a world chess champion cry.
No chess program was born totally from one mind. All chess programs have many ideas from many minds.
-
- Posts: 12038
- Joined: Mon Jul 07, 2008 10:50 pm
Re: Hash Tables Deep Blue
hammerklavier wrote: How much hash table memory did Deep Blue use?
mjlef wrote: I sent your question on to Murray Campbell. While we wait: I know that part of the search in Deep Blue was done in software, and the software had access to hash tables. At the end nodes of the software search, a hardware search would be handed off to the special chips. The hardware part of the search did not have hash tables:
http://ac.els-cdn.com/S0004370201001291 ... 47f33639d8
Would you know how many plies a Deep Blue on today's hardware would get?
-
- Posts: 362
- Joined: Thu Mar 16, 2006 7:39 pm
- Location: Portugal
- Full name: Alvaro Cardoso
Re: Hash Tables Deep Blue
syzygy wrote: DB did not prune beyond alpha-beta, and that was a conscious choice. They were afraid of losing a game to Fritz or the like due to a low-depth oversight.
I understand. A single loss to Fritz would have made the DB team look bad, with all that mighty hardware.
I really wonder: if DB had had null move, would Kasparov have won that single game?
And just imagine if DB had had the aggressive pruning of Stockfish's null move/LMR/LMP.
It makes me wonder what the Kasparov match would have been like with those pruning mechanisms.
Nevertheless, it was a great chess machine for its time, and it's a pity so few games were played against it, whether against Kasparov, other grandmasters, or Fritz. It was a real magic trick IBM pulled: now you see it, now you don't.
-
- Posts: 2821
- Joined: Fri Sep 25, 2015 9:38 pm
- Location: Sortland, Norway
Re: Hash Tables Deep Blue
Was 13 the average depth in the opening / middlegame?
-
- Posts: 6052
- Joined: Tue Jun 12, 2012 12:41 pm
Re: Hash Tables Deep Blue
kgburcham wrote: Basic enough to make a world chess champion cry.
Yeah, at 200,000,000 nodes/sec.
I do not know if I am right to assume that hash tables paired with Deep Blue's eval would perform much worse than hash tables in a much more refined engine (retrieving a wrong eval would not be very helpful, you know).
-
- Posts: 362
- Joined: Thu Mar 16, 2006 7:39 pm
- Location: Portugal
- Full name: Alvaro Cardoso
Re: Hash Tables Deep Blue
Nordlandia wrote: Was 13 the average depth in the opening / middlegame?
Yes, the software part could do about 12 plies of nominal (non-extended) depth; however, in some forced-move positions, extensions could take it to around ply 40.
Their search was a solid one, without holes.
Today's searches on top programs have many holes in them, but the holes are more than compensated for by the increase in overall depth.
Null move, and especially LMR, make a lot of assumptions, and that can cause holes in the search, but in practice they allow more overall depth and, as a consequence, an Elo increase.
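The LMR gamble just described — after good move ordering, late moves rarely matter — can be sketched briefly. This is a hypothetical Python toy over an explicit game tree (nodes are `(score, children)` tuples, scores from the side to move's view), not any engine's real reduction scheme:

```python
def negamax_lmr(node, depth, alpha, beta):
    """Negamax with a crude late-move reduction: moves at index >= 2
    are searched one ply shallower first, trusting the move ordering."""
    score, children = node
    if depth == 0 or not children:
        return score
    best = -10**9
    for i, child in enumerate(children):
        if i >= 2 and depth >= 3:
            # Late move: reduced-depth search first.
            s = -negamax_lmr(child, depth - 2, -beta, -alpha)
            if s > alpha:
                # The reduction was wrong about this move:
                # re-search at full depth before trusting the score.
                s = -negamax_lmr(child, depth - 1, -beta, -alpha)
        else:
            s = -negamax_lmr(child, depth - 1, -beta, -alpha)
        best = max(best, s)
        alpha = max(alpha, s)
        if alpha >= beta:
            break
    return best
```

The "holes" come from the reduced search missing deep tactics behind a late move; the re-search repairs the cases it can detect, and the saved nodes buy extra overall depth.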
Move ordering on today's top engines is also very good; SF does some great move-ordering tricks, and this reduces node count immensely (in fixed-depth search comparisons).
The chess chips of DB didn't have killer moves or a hash table (so no hash move for move ordering).
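For illustration, the ordering the DB chips were missing might look like this. A hypothetical Python sketch with made-up move records, not any engine's API:

```python
def order_moves(moves, hash_move=None, killers=()):
    """Sort moves: hash move first, then captures, then killer moves,
    then the remaining quiet moves. Python's sort is stable, so moves
    within a tier keep their original order."""
    def rank(m):
        if m == hash_move:
            return 0  # best move stored in the transposition table
        if m.get("capture"):
            return 1  # captures next (real engines refine this with MVV/LVA)
        if m["name"] in killers:
            return 2  # quiet moves that caused cutoffs in sibling nodes
        return 3      # everything else
    return sorted(moves, key=rank)
```

Trying the likely-best move first is what makes alpha-beta cutoffs fire early; without a hash move or killers, the hardware search had to rely on much cruder static ordering.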
Alvaro