GPU rumors 2021

Discussion of anything and everything relating to chess playing software and machines.

Moderator: Ras

Vinvin
Posts: 5287
Joined: Thu Mar 09, 2006 9:40 am
Full name: Vincent Lejeune

Re: GPU rumors 2021

Post by Vinvin »

New video about Google’s NEW Ironwood Chip:
towforce
Posts: 12333
Joined: Thu Mar 09, 2006 12:57 am
Location: Birmingham UK
Full name: Graham Laight

Re: GPU rumors 2021

Post by towforce »

Vinvin wrote: Wed May 14, 2025 8:27 pm New video about Google’s NEW Ironwood Chip:

Impressive: a pod of Ironwood TPUs runs at 42.5 exaflops. For perspective, here are other devices at that performance level - link.
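
To put the pod figure into per-chip terms, a quick back-of-envelope sketch (the 9,216 chips-per-pod number is from Google's announcement; these are low-precision marketing FLOPS, not FP64 like classic top500 numbers):

# Back-of-envelope: per-chip throughput of a full Ironwood pod.
# Assumes the announced figures: 42.5 exaFLOPS per pod and
# 9,216 chips per full pod (treat both as approximate).
POD_EXAFLOPS = 42.5
CHIPS_PER_POD = 9_216

per_chip_pflops = POD_EXAFLOPS * 1_000 / CHIPS_PER_POD  # exa -> peta
print(f"~{per_chip_pflops:.2f} PFLOPS per chip")  # prints ~4.61

So each chip lands in the multi-petaFLOPS range on its own.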

According to the video, the use case for this device is not training models, but running trained models at high speed and low power consumption.

I've come up with a mnemonic for the name Ironwood: it's not made of iron, and there's no wood in it.
Human chess is partly about tactics and strategy, but mostly about memory
towforce
Posts: 12333
Joined: Thu Mar 09, 2006 12:57 am
Location: Birmingham UK
Full name: Graham Laight

Re: GPU rumors 2021

Post by towforce »

Just been searching for a few scraps of information: the thing that makes Ironwood particularly good for inference (but probably doesn't help much with training) is its built-in support for efficient processing of sparse models. This is not new, though: their TPU v5 (v5e) from 2023 had it, and the "GPU du jour", the Nvidia H100, can use the sparsity features in its tensor cores with some models.
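
For anyone curious what "sparsity support" means in practice, here's a minimal sketch (my own illustration, not Google's or Nvidia's actual format) of the 2:4 structured-sparsity idea used in Nvidia's tensor cores: in every group of four weights, at most two are nonzero, so the hardware stores only the two survivors plus tiny position indices and skips the zero multiplies.

import numpy as np

# Illustration of the 2:4 structured-sparsity pattern: in each
# group of 4 weights, keep the 2 largest-magnitude values and
# zero the rest. Hardware then stores only the survivors plus
# 2-bit positions and never performs the zero multiplies.
def prune_2_of_4(weights: np.ndarray) -> np.ndarray:
    w = weights.reshape(-1, 4).copy()
    drop = np.argsort(np.abs(w), axis=1)[:, :2]  # 2 smallest per group
    np.put_along_axis(w, drop, 0.0, axis=1)
    return w.reshape(weights.shape)

w = np.array([0.9, -0.1, 0.05, 0.7, 0.2, -0.8, 0.01, 0.3])
print(prune_2_of_4(w))  # [ 0.9  0.   0.   0.7  0.  -0.8  0.   0.3]

The fixed 2-of-4 pattern is what makes it hardware-friendly: the compressed layout always has the same size, so the multiply units know exactly where the next nonzero sits.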
Human chess is partly about tactics and strategy, but mostly about memory
smatovic
Posts: 3220
Joined: Wed Mar 10, 2010 10:18 pm
Location: Hamburg, Germany
Full name: Srdja Matovic

Re: GPU rumors 2021

Post by smatovic »

smatovic wrote: Tue Dec 17, 2024 6:34 am One on GPU clusters/grids...

Titan, supercomputer from 2012 to 2019, had 18,688 Nvidia K20X GPUs:
https://en.wikipedia.org/wiki/Titan_(supercomputer)

Summit, supercomputer from 2018, has 27,648 Nvidia V100 GPUs:
https://en.wikipedia.org/wiki/Summit_(supercomputer)

Folding@Home, grid, had over 650,000 GPUs in 2020, with >1 exaFLOPS (32-bit?) of compute:
https://archive.ph/20200412111010/https ... ome.org/os

Frontier, supercomputer from 2022 (first exaFLOPS machine, #2 on top500 11/2024), has 37,888 AMD MI250X GPUs:
https://en.wikipedia.org/wiki/Frontier_(supercomputer)

Colossus, supercomputer from 2024 by xAI (for the Grok AI), currently has 100,000 Nvidia HGX H100 GPUs (with an estimated 100 MW power draw), intended to be doubled to 200K.
https://www.tomshardware.com/desktops/s ... ts-secrets
https://www.youtube.com/watch?v=Jf8EPSBZU7Y

Lc0, grid, currently has 17 active daily contributors ;)
https://training.lczero.org/active_users

--
Srdja
Colossus AI Hits 200,000 GPUs as Musk Ramps Up AI Ambitions
https://www.hpcwire.com/2025/05/13/colo ... ambitions/
xAI has publicly stated plans to expand its Colossus supercomputer to over 1 million GPUs.
The road to powering a million GPUs started when Musk founded xAI in July 2023, with the stated goal of “understanding the true nature of the universe.” In more practical terms, Musk wanted an AI lab under his own direction, free from the influence of Microsoft, Google, or other major tech firms.
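
A back-of-envelope check on what those GPU counts mean for power (my own estimate, assuming Nvidia's ~700 W TDP rating for an H100 SXM; cooling, hosts and networking come on top):

# Rough power math: GPUs alone, at the ~700 W TDP Nvidia rates
# for an H100 SXM. Facility draw (cooling, hosts, networking)
# comes on top, so treat these as lower bounds.
H100_TDP_W = 700

for gpus in (100_000, 200_000, 1_000_000):
    megawatts = gpus * H100_TDP_W / 1e6
    print(f"{gpus:>9,} GPUs -> {megawatts:,.0f} MW for the GPUs alone")

That gives ~70 MW of GPU silicon for the first 100K, which lines up with the ~100 MW estimate quoted above once overhead is added; a million GPUs would be ~700 MW before overhead, i.e. a sizeable power plant.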
But honey, can it play chess?

--
Srdja