AI GPU development is running at breakneck speed, and Elon Musk wants to be at the forefront of the revolution. In an X post, the SpaceX and Tesla CEO revealed that he wants to buy 300,000 of Nvidia's latest Blackwell B200 GPUs by next summer. The new GPUs would replace X's existing AI GPU cluster, which currently consists of 100,000 previous-generation H100 GPUs.
100,000 H100 GPUs is already an enormous amount of computing power, but Musk argues that, given the pace of AI GPU development, it isn't worth keeping X's massive array of H100s around for long, mainly because of the cluster's energy consumption of roughly 1 gigawatt.
"Given the pace of technology improvement, it's not worth sinking 1GW of power into H100s. The @xAI 100k H100 liquid-cooled training cluster will be online in a few months. Next big step would probably be ~300k B200s with CX8 networking next summer." — Elon Musk, June 2, 2024
X uses the massive array of AI GPUs for Grok, an AI chatbot built on xAI's homegrown large language model, Grok-1. Compared to ChatGPT, Copilot, and Gemini, Grok is tuned to give wittier, more comedic, and less buttoned-up answers — essentially, it tries to take the "robot" out of the AI bot. The chatbot is available to X users right now; however, you'll need an X Premium subscription to gain access.
Musk's logic has merit. The AI GPU development race is one of the most heated the technology industry has seen in years, rivaling the CPU wars of the 1990s and 2000s. Nvidia's new Blackwell B200 is a massive upgrade over the H100, with Nvidia claiming up to four times the training performance and 30 times the inference performance.
Technically, each B200 does consume more power than an H100. However, the B200's colossal performance gains mean the chip is significantly more efficient per watt. In Musk's case, trading 100,000 H100s for three times as many GPUs that each draw more power is still a net win, because the additional AI performance far outpaces the additional power draw.
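A rough back-of-envelope calculation illustrates the tradeoff. The figures below are assumptions for illustration only: roughly 700 W per H100 and 1,000 W per B200 board power, plus Nvidia's headline 30x inference claim. Note that Musk's 1 GW figure likely covers total facility power (cooling, networking), while this sketch counts GPU power alone.

```python
# Back-of-envelope perf-per-watt comparison.
# All figures are assumptions for illustration: ~700 W per H100 and
# ~1,000 W per B200 (board power only, not facility power), and
# Nvidia's claimed ~30x per-GPU inference speedup for Blackwell.
H100_TDP_W = 700
B200_TDP_W = 1000
INFERENCE_SPEEDUP = 30

# Cluster-level GPU power draw, in megawatts.
h100_cluster_mw = 100_000 * H100_TDP_W / 1e6   # 100k H100s
b200_cluster_mw = 300_000 * B200_TDP_W / 1e6   # 300k B200s

# Normalized inference throughput (H100 = 1 unit per GPU).
h100_throughput = 100_000 * 1
b200_throughput = 300_000 * INFERENCE_SPEEDUP

gain_per_mw = (b200_throughput / b200_cluster_mw) / (h100_throughput / h100_cluster_mw)

print(f"H100 cluster: {h100_cluster_mw:.0f} MW GPU power")
print(f"B200 cluster: {b200_cluster_mw:.0f} MW GPU power")
print(f"Inference performance per megawatt improves ~{gain_per_mw:.0f}x")
```

Under these assumptions, the B200 cluster draws a few times more GPU power but delivers on the order of 20x more inference throughput per megawatt — which is the efficiency argument behind Musk's decision.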
It'll be interesting to see when Musk actually gets his hands on all 300,000 B200 GPUs. If Nvidia's H100 has taught us anything, it's that demand for AI GPUs always outstrips supply. We will probably see a repeat of 2023, with all the big AI customers — including X, Meta, Google, and Microsoft — fighting to grab as many B200s as Nvidia can pump out for at least the next several months.