Silly discussion imo. You're arguing everything is software, assisted by some hardware improvements. I'm arguing there's another category: improvements brought about BECAUSE the hardware became available. The RAM-expensive lookup tables which are integral to SF search, for example. Or the history and statistics tables (again integral to SF search), built up in the very fast search space and viable ONLY in that very fast search space. Not to mention NNUE, possible only with huge RAM and made viable by 512-bit parallelisation.
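For concreteness, here is a minimal sketch of the kind of history table meant above. The layout and update rule are illustrative assumptions of mine, not Stockfish's actual code:

    #include <algorithm>
    #include <cstdint>

    // Butterfly-style history: one counter per (side, from-square, to-square),
    // bumped on beta cutoffs and consulted when ordering quiet moves.
    // 2 * 64 * 64 16-bit counters is only 16 KiB, but engines keep several
    // such tables (countermove/continuation history), and they add up.
    int16_t history[2][64][64];

    void update_history(int side, int from, int to, int depth) {
        int bonus = std::min(depth * depth, 400);  // deeper cutoffs weigh more
        int v = std::clamp(history[side][from][to] + bonus, -16384, 16383);
        history[side][from][to] = int16_t(v);
    }

The table itself is cheap; the point is that its counters only become statistically meaningful when the search visits millions of nodes per second.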
syzygy wrote: ↑Sun Oct 12, 2025 10:07 pm
Single-core speed went up but not massively. From 2005 to 2020 probably less than 10x. In 2005 I had 2GB of RAM. 512MB should be more than plenty for pre-NNUE SF, certainly in ultrabullet games (from the point of view of SF) against Fruit 2.1. I don't know if it is still escaping your attention that pre-NNUE SF crushes Fruit 2.1 at 1:1000 time odds on equal hardware.
(I think I wrote SF 15 earlier, but the 2020 test reported in the post was with SF-dev from April 29, 2020, i.e. a pre-NNUE development version between SF 11 and SF 12.)
syzygy wrote:
64KB pre-2005? This is not a serious discussion, to put it kindly.

Sure, I agree with you: the comparison to the 6502 is nonsensical. Those programs were unable to use multiply and were forced into 8-bit imprecision; they needed fitting to a very limited instruction set. Better to compare things written in relatively simple C, certainly pre-2005, against a non-NNUE SF forced into the constraint of 64K of usable RAM for tables. SF will probably win, since there has been some software progress, but the bulk of the progress is hardware-enabled: it wasn't done before because it could NOT physically be done before.
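To put a number on that 64K constraint, a hypothetical sketch (the entry layout is my assumption, not any engine's real format) of a transposition table squeezed into 64 KiB:

    #include <cstddef>
    #include <cstdint>

    // 64 KiB of hash = 8192 entries of 8 bytes each.
    struct TTEntry {
        uint32_t key32;  // upper 32 bits of the Zobrist hash, for verification
        int16_t  score;
        uint8_t  depth;
        uint8_t  flags;  // bound type: exact / lower / upper
    };
    static_assert(sizeof(TTEntry) == 8, "entry must stay at 8 bytes");

    constexpr std::size_t TT_ENTRIES = (64 * 1024) / sizeof(TTEntry);  // 8192

    TTEntry table[TT_ENTRIES];

    TTEntry* probe(uint64_t zobrist, bool& hit) {
        TTEntry& e = table[zobrist & (TT_ENTRIES - 1)];  // power-of-two indexing
        hit = (e.key32 == uint32_t(zobrist >> 32));
        return &e;
    }

For comparison, a 1GB hash holds around 134 million such 8-byte entries; the data structure is unchanged, only the hardware budget is.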
syzygy wrote:
This is not difficult at all. Progress in software has been absolutely massive. You can just run pre-NNUE SF on an old machine and see for yourself.

If this is too difficult for you, consider your own field. Imagine a future 10-man EGTB. Then add massively more RAM availability and build an 11-man table. Is that hardware progress, software progress, or hardware-availability progress?
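A rough feel for why each extra man is a hardware question, using my own naive arithmetic (ignoring piece-type choices, symmetry reductions, and legality): the number of ways to place n men on 64 squares grows by a factor of roughly 55-60 per added man:

    #include <cstdio>

    int main() {
        // Naive upper bound: 64 * 63 * ... * (65 - n) placements for n men.
        for (int n = 3; n <= 11; ++n) {
            double placements = 1.0;
            for (int i = 0; i < n; ++i)
                placements *= 64 - i;
            std::printf("%2d men: ~%.2e raw placements\n", n, placements);
        }
        return 0;
    }

Real tablebases shrink this enormously with indexing and compression, but that per-man multiplier is what decides when a table becomes physically buildable.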
Of course this has been "enabled" by hardware, in particular:
1. The internet.
2. The possibility to run many fast games to evaluate changes (see the sketch after this list).
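Those many fast games are typically fed into a sequential statistical test that stops as soon as a change is shown to be good or bad. A minimal sketch of the idea, simplified to decisive games only and with parameters chosen by me rather than taken from any real testing framework:

    #include <cmath>
    #include <cstdio>

    // Sequential probability ratio test (SPRT) on the win rate of decisive
    // games: H0 "patch wins 50% of them" vs H1 "patch wins 52%". Real
    // frameworks model draws and Elo directly; this keeps only the idea.
    int main() {
        const double p0 = 0.50, p1 = 0.52;
        const double alpha = 0.05, beta = 0.05;
        const double lower = std::log(beta / (1 - alpha));  // stop: reject patch
        const double upper = std::log((1 - beta) / alpha);  // stop: accept patch

        double llr = 0.0;
        const int results[] = {1, 1, 0, 1, 1, 1, 0, 1};     // toy stream: 1 = win
        for (int w : results) {
            llr += w ? std::log(p1 / p0) : std::log((1 - p1) / (1 - p0));
            if (llr <= lower) { std::puts("reject patch"); return 0; }
            if (llr >= upper) { std::puts("accept patch"); return 0; }
        }
        std::puts("keep testing");
        return 0;
    }

The hardware's contribution is throughput: resolving a 2 Elo change needs tens of thousands of games, which is only practical when games are very fast and machines are plentiful.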
Again, nobody here is arguing that programmers today are somehow more talented. Nobody is taking anything away from what the 1980s programmers achieved. Do not worry.
But that does not mean we should remain stuck in the 1980s, nor that the 1980s are the measure of everything. Why the 1980s and not the 1950s?
The point is simply that there has been undeniable, spectacular progress in software.
But of course on Talkchess you will find people denying it anyway.
And a third one: 512-bit parallelisation. So, if you want a fair old-vs-new test of software improvement, purely of algorithmic ideas, you must imo run the test at the old program's search speeds, with the large-RAM and wide-register-dependent elements of the new program disabled.
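To show what "wide-register-dependent" means in practice, a sketch of my own (not Stockfish source) of an NNUE-style accumulator update: with AVX-512 it moves 32 int16 weights per instruction; without those registers it collapses to a scalar loop doing the same work one element at a time:

    #include <cstddef>
    #include <cstdint>
    #if defined(__AVX512BW__)
    #include <immintrin.h>
    #endif

    // Add one row of int16 weights into an NNUE-style accumulator.
    // n is assumed to be a multiple of 32.
    void add_row(int16_t* acc, const int16_t* row, std::size_t n) {
    #if defined(__AVX512BW__)
        for (std::size_t i = 0; i < n; i += 32) {  // 32 x int16 = 512 bits per step
            __m512i a = _mm512_loadu_si512(acc + i);
            __m512i r = _mm512_loadu_si512(row + i);
            _mm512_storeu_si512(acc + i, _mm512_add_epi16(a, r));
        }
    #else
        for (std::size_t i = 0; i < n; ++i)        // scalar fallback
            acc[i] += row[i];
    #endif
    }

Disabling that path is exactly the kind of culling such a fair test would require.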
Which decade to go back to? Well, the OPs here were arguing the 1980s and the 6502; I'd be inclined to disagree, since the 6502 instruction set is crippled and unsuitable. Possibly the pre-internet 1990s, when those of us off mainframes were working in 68000 asm or early C, with the limited RAM typical of home computers. Try forcing SF into that environment, absent the tuning. You'll find SF better, but after the forced code culling you won't find much left that wasn't there in parallel in those earlier engines.
Most SF improvement is down to the creation of the structured development and testing environment. In other words, the progress was good management and an overall plan, not software. Any clown can do software; the real stars are those who built the circus tent.