Elon Musk has revealed that Tesla will source hardware not only from Nvidia but also from AMD for its next-generation Dojo supercomputers.
Tesla launched its first Dojo ExaPod supercomputer in 2023. Built using custom AI chips, it is reportedly capable of delivering up to 1.1 exaflops of computing power to train the machine learning models behind the company's self-driving technology.
Now, in a post on X (formerly Twitter), Musk confirmed that Tesla intends to spend over a billion dollars on Nvidia's H100 and AMD's Instinct MI300 hardware this year.
Preparing to spend billions
Following a discussion about Tesla's new $500 million project at its New York Gigafactory, Musk pointed out that while that seems like a lot of money (and it obviously is), it is only enough to buy around 10,000 Nvidia H100 AI GPUs. Tesla, he said, will spend much more than that on Nvidia hardware this year. "The table stakes for being competitive in AI are at least several billion dollars per year at this point," he said.
When asked about additional potential purchases of AI chips from AMD, Musk confirmed the plan without divulging further details. Tesla is expected to buy AI processors from the Instinct MI300 lineup, specifically the Instinct MI300X, which was unveiled alongside the Instinct MI300A, the world's first data center APU, in December last year.
While the Instinct MI300X is comparable to the Nvidia H100 in training tasks, it can be up to 1.6 times faster in AI inference scenarios, and it is designed to scale, supporting up to 192 GB of HBM3 memory.
In December, Tesla’s Dojo plans faced a major setback when project lead Ganesh Venkataramanan left the company. However, the role was swiftly filled when Peter Bannon, a former executive at Apple and a Tesla veteran, stepped in to head the project.
"The governor is correct that this is a Dojo Supercomputer, but $500M, while obviously a large sum of money, is only equivalent to a 10k H100 system from Nvidia. Tesla will spend more than that on Nvidia hardware this year. The table stakes for being competitive in AI are at…"

Elon Musk via X, January 26, 2024