Time to rethink what Albert Silver has done?

Discussion of anything and everything relating to chess playing software and machines.

Moderator: Ras

noobpwnftw
Posts: 694
Joined: Sun Nov 08, 2015 11:10 pm
Full name: Bojun Guo

Re: Time to rethink what Albert Silver has done?

Post by noobpwnftw »

it was a dead end
And it is dead.

Time to rethink? If it were just a translation into Dutch, it would've been stronger.
Raphexon
Posts: 476
Joined: Sun Mar 17, 2019 12:00 pm
Full name: Henk Drost

Re: Time to rethink what Albert Silver has done?

Post by Raphexon »

Albert Silver wrote: Sun Jul 04, 2021 12:21 am
Sopel wrote: Sat Jul 03, 2021 1:44 pm
smatovic wrote: Sat Jul 03, 2021 11:48 am Yo, into the wasps' nest: I am not into the details, but I read that Stockfish now uses Lc0 data for training and has doubled its net size. Time to rethink what Albert Silver has done? As far as I understood it, he used FatFritz 1 (Lc0 derivative) data for training and doubled the net size of the FatFritz 2 (Stockfish derivative) network.

--
Srdja

PS: not interested in discussing the marketing of ChessBase for FF2.
1. He used leela (FF1) data because he had
... the idea of converting it into NNUE-usable data to train a network. He then rented 10 2080 Ti GPUs on Vast.ai for several months (at a cost of thousands of dollars) to generate the data needed, since the FF1 data he had was completely insufficient: maybe only 300 thousand games were from the final net at full strength, and the rest was much weaker and therefore of no interest, except to test the concept.
It just so happens that we independently discovered that lc0 data works well in our case.
A miracle! :lol:
2. Trying to increase the net size is a no brainer once the training and testing procedure is established and
... demonstrated by someone else first.
This was pretty much known from the beginning, as jjosh was training larger nets well before most people even knew about NNUE.
Quite true, and his result was some 50 Elo weaker. As a result, the popular belief was that it was a dead end, as anyone asking about larger net sizes in the SF Discord was repeatedly told. Vondele's attempt to reproduce my result was also much weaker, but of course I was using higher quality data.

Still, as I mentioned elsewhere, I felt it would inspire others to try and of course improve my ideas and the proof is there. I'm genuinely glad.
How was Vondele's attempt "much weaker" when it produced nigh-identical results in H2H matches vs SFdev?

You also really can't stop lying, can you?
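The posts above argue over the core technical idea: converting Lc0-style evaluations (win-probability-like values) into centipawn-style scores usable as NNUE training targets. The following is a minimal, hypothetical sketch of such a conversion; the inverse-sigmoid mapping, the `scale` constant, and the record layout are illustrative assumptions, not the formula or format any of the engines discussed here actually uses.

```python
import math

def q_to_centipawns(q, scale=1.0):
    """Map a win-probability-style value q in (-1, 1) to a rough
    centipawn score via an inverse sigmoid. The scale constant is a
    hypothetical choice, not one used by any real engine."""
    # Clamp to avoid infinities at the extremes.
    q = max(-0.9999, min(0.9999, q))
    p = (q + 1) / 2            # map (-1, 1) -> (0, 1) win probability
    return scale * 100 * math.log(p / (1 - p))

def convert_position(fen, q, result):
    """Bundle one position into an NNUE-style training record:
    (FEN, centipawn score, game result). Field layout is illustrative."""
    return {"fen": fen, "score_cp": round(q_to_centipawns(q)), "result": result}

record = convert_position(
    "rnbqkbnr/pppppppp/8/8/8/8/PPPPPPPP/RNBQKBNR w KQkq - 0 1", 0.0, 0)
print(record["score_cp"])  # 0 for a dead-even position
```

A dead-even position (q = 0) maps to 0 centipawns, and the mapping is monotone, so stronger winning chances yield larger scores; that monotonicity is all a supervised trainer really needs from such a conversion.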
AndrewGrant
Posts: 1960
Joined: Tue Apr 19, 2016 6:08 am
Location: U.S.A
Full name: Andrew Grant

Re: Time to rethink what Albert Silver has done?

Post by AndrewGrant »

Albert Silver wrote: Sun Jul 04, 2021 12:21 am
Sopel wrote: Sat Jul 03, 2021 1:44 pm
smatovic wrote: Sat Jul 03, 2021 11:48 am Yo, into the wasps' nest: I am not into the details, but I read that Stockfish now uses Lc0 data for training and has doubled its net size. Time to rethink what Albert Silver has done? As far as I understood it, he used FatFritz 1 (Lc0 derivative) data for training and doubled the net size of the FatFritz 2 (Stockfish derivative) network.

--
Srdja

PS: not interested in discussing the marketing of ChessBase for FF2.
1. He used leela (FF1) data because he had
... the idea of converting it into NNUE-usable data to train a network. He then rented 10 2080 Ti GPUs on Vast.ai for several months (at a cost of thousands of dollars) to generate the data needed, since the FF1 data he had was completely insufficient: maybe only 300 thousand games were from the final net at full strength, and the rest was much weaker and therefore of no interest, except to test the concept.
It just so happens that we independently discovered that lc0 data works well in our case.
A miracle! :lol:
2. Trying to increase the net size is a no brainer once the training and testing procedure is established and
... demonstrated by someone else first.
This was pretty much known from the beginning, as jjosh was training larger nets well before most people even knew about NNUE.
Quite true, and his result was some 50 Elo weaker. As a result, the popular belief was that it was a dead end, as anyone asking about larger net sizes in the SF Discord was repeatedly told. Vondele's attempt to reproduce my result was also much weaker, but of course I was using higher quality data.

Still, as I mentioned elsewhere, I felt it would inspire others to try and of course improve my ideas and the proof is there. I'm genuinely glad.
Are you really acting like there was some big movement of people going "No! It's not possible! You could not possibly increase the size of the network," and then you, the bold, brave, defiant Alberto, showed the world what they refused to believe? Delusional.
smatovic
Posts: 3450
Joined: Wed Mar 10, 2010 10:18 pm
Location: Hamburg, Germany
Full name: Srdja Matovic

Re: Time to rethink what Albert Silver has done?

Post by smatovic »

Alright guys, I see this is about ego, hence personal-thread moderation: we had our 5 minutes, let's lock the thread and move on.

--
Srdja
hgm
Posts: 28419
Joined: Fri Mar 10, 2006 10:06 am
Location: Amsterdam
Full name: H G Muller

Re: Time to rethink what Albert Silver has done?

Post by hgm »

[Moderation] Locked at the request of the OP.