
Re: Alphazero news

Posted: Sat Dec 08, 2018 12:45 pm
by matthewlai
glennsamuel32 wrote: Sat Dec 08, 2018 6:27 am Matthew, could you divulge the size of the network file that A0 used ?
The details are in supplementary materials:
Architecture
Apart from the representation of positions and actions described above, AlphaZero uses the
same network architecture as AlphaGo Zero (9), briefly recapitulated here.
The neural network consists of a “body” followed by both policy and value “heads”. The
body consists of a rectified batch-normalized convolutional layer followed by 19 residual blocks (48).
Each such block consists of two rectified batch-normalized convolutional layers with a skip connection.
Each convolution applies 256 filters of kernel size 3 × 3 with stride 1. The policy head
applies an additional rectified, batch-normalized convolutional layer, followed by a final convolution
of 73 filters for chess or 139 filters for shogi, or a linear layer of size 362 for Go,
representing the logits of the respective policies described above. The value head applies an
additional rectified, batch-normalized convolution of 1 filter of kernel size 1 × 1 with stride 1,
followed by a rectified linear layer of size 256 and a tanh-linear layer of size 1.
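For a concrete picture, here is a minimal tf.keras sketch of the architecture described in that excerpt. The 119 input planes for chess and the kernel sizes used inside the policy head are assumptions not fixed by the quoted text; the rest (a 256-filter 3 × 3 body convolution, 19 residual blocks, a 73-filter policy convolution, and a 1 × 1 value convolution followed by a 256-unit rectified layer and a tanh output) follows the description directly. It is only an illustration, not DeepMind's code.

```python
# Sketch of the AlphaZero-style network described above, using tf.keras.
# Assumptions (not in the quoted text): 119 input planes for chess and
# 3x3 kernels inside the policy head.
import tensorflow as tf
from tensorflow.keras import layers

def conv_bn_relu(x, filters, kernel_size):
    """Rectified, batch-normalized convolution with stride 1."""
    x = layers.Conv2D(filters, kernel_size, padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    return layers.ReLU()(x)

def residual_block(x):
    """Two rectified, batch-normalized 3x3 convolutions with a skip connection."""
    skip = x
    x = conv_bn_relu(x, 256, 3)
    x = layers.Conv2D(256, 3, padding="same", use_bias=False)(x)
    x = layers.BatchNormalization()(x)
    x = layers.Add()([x, skip])
    return layers.ReLU()(x)

def build_alphazero_like_net(input_planes=119):
    board = layers.Input(shape=(8, 8, input_planes))

    # Body: one rectified batch-normalized conv layer, then 19 residual blocks.
    x = conv_bn_relu(board, 256, 3)
    for _ in range(19):
        x = residual_block(x)

    # Policy head: extra conv-BN-ReLU, then a 73-filter convolution
    # producing 8x8x73 move logits for chess.
    p = conv_bn_relu(x, 256, 3)
    policy_logits = layers.Conv2D(73, 3, padding="same", name="policy")(p)

    # Value head: 1-filter 1x1 conv-BN-ReLU, a 256-unit rectified layer,
    # and a tanh-activated scalar output.
    v = conv_bn_relu(x, 1, 1)
    v = layers.Flatten()(v)
    v = layers.Dense(256, activation="relu")(v)
    value = layers.Dense(1, activation="tanh", name="value")(v)

    return tf.keras.Model(inputs=board, outputs=[policy_logits, value])

if __name__ == "__main__":
    build_alphazero_like_net().summary()  # prints layer shapes and the parameter count
```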

Re: Alphazero news

Posted: Sat Dec 08, 2018 1:16 pm
by Rein Halbersma
So why is it that A0's learning curve seems to flatten to almost no progress beyond its current level? If, e.g., the number of layers or channels were expanded, would you expect that a few hundred Elo more could be obtained? Or is A0 approaching perfection with its current network, and is an absolute upper bound of Elo in sight?

Re: Alphazero news

Posted: Sat Dec 08, 2018 1:38 pm
by nabildanial
matthewlai wrote: Sat Dec 08, 2018 12:45 pm
glennsamuel32 wrote: Sat Dec 08, 2018 6:27 am Matthew, could you divulge the size of the network file that A0 used ?
The details are in supplementary materials:
Architecture
Apart from the representation of positions and actions described above, AlphaZero uses the
same network architecture as AlphaGo Zero (9), briefly recapitulated here.
The neural network consists of a “body” followed by both policy and value “heads”. The
body consists of a rectified batch-normalized convolutional layer followed by 19 residual blocks (48).
Each such block consists of two rectified batch-normalized convolutional layers with a skip connection.
Each convolution applies 256 filters of kernel size 3 × 3 with stride 1. The policy head
applies an additional rectified, batch-normalized convolutional layer, followed by a final convolution
of 73 filters for chess or 139 filters for shogi, or a linear layer of size 362 for Go,
representing the logits of the respective policies described above. The value head applies an
additional rectified, batch-normalized convolution of 1 filter of kernel size 1 × 1 with stride 1,
followed by a rectified linear layer of size 256 and a tanh-linear layer of size 1.
I think what glenn meant by the question is how big the filesize is as in MB.

Re: Alphazero news

Posted: Sat Dec 08, 2018 1:48 pm
by Alexander Schmidt
nabildanial wrote: Sat Dec 08, 2018 12:39 pm
Alexander Schmidt wrote: Sat Dec 08, 2018 12:29 pm Maybe people will go to jail because an AI thinks they will someday commit a crime. One day autonomous robots will decide which person to kill, on a battlefield or to prevent a possible crime. Maybe one day an AI will press the red button.
We have the so-called "Ethics of artificial intelligence" to prevent those things from happening.
These things already happen:
http://www.israeltoday.co.il/NewsItem/t ... fault.aspx
https://www.datanami.com/2017/07/17/neu ... ion-banks/

Re: Alphazero news

Posted: Sat Dec 08, 2018 1:54 pm
by jp
nabildanial wrote: Sat Dec 08, 2018 1:38 pm
matthewlai wrote: Sat Dec 08, 2018 12:45 pm
glennsamuel32 wrote: Sat Dec 08, 2018 6:27 am Matthew, could you divulge the size of the network file that A0 used ?
The details are in supplementary materials
I think what glenn meant by the question is how big the filesize is as in MB.
Yep. How big is the filesize in MB?

Re: Alphazero news

Posted: Sat Dec 08, 2018 1:57 pm
by shrapnel
Any plans to commercialize AlphaZero?
Can't wait to get my hands on an AlphaZero engine, no matter what the price.

Re: Alphazero news

Posted: Sat Dec 08, 2018 2:55 pm
by matthewlai
jp wrote: Sat Dec 08, 2018 1:54 pm
nabildanial wrote: Sat Dec 08, 2018 1:38 pm
matthewlai wrote: Sat Dec 08, 2018 12:45 pm
glennsamuel32 wrote: Sat Dec 08, 2018 6:27 am Matthew, could you divulge the size of the network file that A0 used ?
The details are in supplementary materials
I think what glenn meant by the question is how big the filesize is as in MB.
Yep. How big is the filesize in MB?
Not sure. We don't store the networks locally. They are just TensorFlow SavedModels. If you construct the same network and save it you'll get the same size. https://www.tensorflow.org/guide/saved_model
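If anyone wants a ballpark figure, one way to get it is to build a network like the sketch above, export it with tf.saved_model.save, and add up what lands on disk; the parameter count times four bytes (float32) also gives a rough lower bound. build_alphazero_like_net below refers to that hypothetical sketch, not to DeepMind's actual code.

```python
# Export a tf.keras model as a TensorFlow SavedModel and measure it on disk.
# `build_alphazero_like_net` is the hypothetical helper sketched earlier in
# this thread; any tf.keras model works the same way.
import os
import tensorflow as tf

def saved_model_size_mb(model: tf.keras.Model, export_dir: str) -> float:
    """Save `model` to `export_dir` and return the total on-disk size in MB."""
    tf.saved_model.save(model, export_dir)
    total_bytes = sum(
        os.path.getsize(os.path.join(root, name))
        for root, _, files in os.walk(export_dir)
        for name in files
    )
    return total_bytes / 1e6

model = build_alphazero_like_net()  # from the earlier sketch
print(f"SavedModel size: {saved_model_size_mb(model, '/tmp/a0_like'):.1f} MB")

# Quick lower bound without saving anything: float32 weights take 4 bytes each.
print(f"Weights alone: {model.count_params() * 4 / 1e6:.1f} MB")
```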

Re: Alphazero news

Posted: Sat Dec 08, 2018 3:36 pm
by Jouni
Of course AlphaZero is a stunning achievement! But so far it is not better than Stockfish, only an alternative approach. BTW, in the 22-move loss game A0 is already lost after 12...Nb6, based on my short analysis. Maybe 12...Nf8 can still save the game?

Re: Alphazero news

Posted: Sat Dec 08, 2018 4:57 pm
by corres
Astatos wrote: Fri Dec 07, 2018 1:01 pm OK what we know :
...
2) LC0 guys did manage to reverse engineer A0 successfully
...
Do you think the developers of LC0 could get the binaries used by the A0 team?
They are only trying to develop an NN-based chess engine from the rather incomplete information the A0 team has made public about A0.
They are working hard and well, and by now the chess strength of LC0 comes close to A0's.
Many thanks to the LC0 team and their volunteer helpers!
I think the results of the LC0 developers may force the A0 team to make more information about A0 public.

Re: Alphazero news

Posted: Sat Dec 08, 2018 5:02 pm
by corres
shrapnel wrote: Sat Dec 08, 2018 1:57 pm Any plans to commercialize AlphaZero?
Can't wait to get my hands on an AlphaZero engine, no matter what the price.
You should become the main shareholder of Google to get that engine...