Anthropic CEO Dario Amodei said on the In Good Company podcast that AI models in development today can cost up to $1 billion to train. Current models such as GPT-4o cost only around $100 million, but he expects the cost of training frontier models to climb to $10 billion or even $100 billion in as little as three years.
"Right now, 100 million. There are models in training today that are more like a billion." Amodei also added, "I think if we go to ten or a hundred billion, and I think that will happen in 2025, 2026, maybe 2027, and the algorithmic improvements continue a pace, and the chip improvements continue a pace, then I think there is in my mind a good chance that by that time we'll be able to get models that are better than most humans at most things."
The Anthropic CEO cited these figures while discussing the industry's progression from generative AI (like ChatGPT) to artificial general intelligence (AGI). He said there wouldn't be a single point at which we suddenly reach AGI; instead, development would be gradual, with each model building on the advances of its predecessors, much as a human child learns.
If AI models are to grow roughly ten times more capable each generation, the compute used to train them must scale up by a comparable factor, which makes hardware the biggest cost driver in AI training. Back in 2023, it was reported that ChatGPT required more than 30,000 GPUs to operate, and Sam Altman confirmed that GPT-4 cost $100 million to train.
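As a rough sanity check on that $100 million figure, the back-of-envelope sketch below multiplies GPU-hours by a cloud rental rate. The cluster size, run length, and hourly rate are illustrative assumptions, not reported numbers.

```python
# Back-of-envelope estimate of a frontier-model training bill.
# Every input below is an illustrative assumption, not a reported figure.
gpus = 25_000            # assumed training-cluster size
days = 90                # assumed length of the training run
usd_per_gpu_hour = 2.00  # assumed cloud rental rate per data-center GPU

gpu_hours = gpus * days * 24
cost_usd = gpu_hours * usd_per_gpu_hour
print(f"{gpu_hours:,} GPU-hours -> ~${cost_usd / 1e6:.0f}M")
# Output: 54,000,000 GPU-hours -> ~$108M
```

Even with generous error bars on each input, the result lands in the nine-figure range Altman described.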
Last year, over 3.8 million GPUs were delivered to data centers. With Nvidia's latest B200 AI chip costing around $30,000 to $40,000, Amodei's billion-dollar estimate looks on track for 2024. And if model scale keeps growing at its current exponential rate, hardware demand will likely keep pace, unless more efficient designs like the Sohu AI chip become prevalent.
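To see why those chip prices point toward a billion-dollar training run, here is a minimal sketch; the cluster size is a hypothetical assumption pegged to the scale of GPT-4's reported GPU footprint.

```python
# Hardware cost of a hypothetical frontier-scale training cluster.
# Cluster size and unit price are assumptions for illustration only.
b200_price_usd = 35_000  # midpoint of the reported $30,000-$40,000 range
cluster_gpus = 30_000    # assumed cluster on the scale of GPT-4's reported footprint

hardware_cost_usd = b200_price_usd * cluster_gpus
print(f"~${hardware_cost_usd / 1e9:.2f}B in accelerators alone")
# Output: ~$1.05B, before networking, power, and facility costs
```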
We can already see this exponential growth happening. Elon Musk reportedly wants to purchase 300,000 B200 AI chips, while OpenAI and Microsoft are said to be planning a $100 billion AI data center. With all this demand, deliveries of data center GPUs could balloon to 38 million next year, provided Nvidia and other suppliers can keep up with the market.
However, aside from the supply of the chips themselves, these AI firms need to worry about power supply and related infrastructure. The total estimated power consumption of all data center GPUs sold last year alone could power 1.3 million homes. If data center power requirements keep growing at this rate, we could run short of economically priced electricity. And while these data centers need new power plants, they also need an upgraded grid that can deliver all that energy to the power-hungry AI chips. For this reason, many tech companies, including Microsoft, are now considering small modular nuclear reactors for their data centers.
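To put that 1.3-million-home figure in perspective, the sketch below converts GPU shipments into an aggregate power draw; the per-GPU wattage and per-household load are assumed averages, not measured data.

```python
# Rough aggregate power draw of last year's data-center GPU shipments.
# Per-GPU wattage and household load are assumed averages, not measurements.
gpus_shipped = 3_800_000
avg_gpu_watts = 500      # assumed fleet average (A100 ~400 W, H100 ~700 W)
avg_home_watts = 1_200   # ~10,500 kWh/year US household, averaged over the year

total_watts = gpus_shipped * avg_gpu_watts
homes_millions = total_watts / avg_home_watts / 1e6
print(f"~{total_watts / 1e9:.1f} GW, enough for ~{homes_millions:.1f} million homes")
# Output: ~1.9 GW, enough for ~1.6 million homes, the same ballpark as the estimate above
```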
Artificial intelligence is quickly gathering steam, and hardware innovation seems to be keeping up. Amodei's $100 billion projection therefore looks plausible, especially if manufacturers like Nvidia, AMD, and Intel can deliver. But as AI models improve dramatically with each generation, one big question remains: how will they affect the future of our society?