US govt wants to talk to tech companies about AI electricity demands — eyes nuclear fusion and fission

U.S. President Joe Biden's administration is looking to fast-track talks with tech companies as their demands for electricity to power artificial intelligence data centers continue to grow. Axios, which discussed the issue with Energy Secretary Jennifer Granholm, reports that the answer may include more nuclear power.
The increasing need for power is a "problem," Granholm told Axios, though she was careful to distinguish that demand from AI itself. "AI itself isn't a problem, because AI could help to solve the problem," Granholm said last week, according to the publication.
One answer to those power needs may be nuclear energy. The report suggests the Department of Energy is floating the idea that tech companies running huge AI data centers could put "small nuclear plants" nearby.
In 2023, about 18.6% of electricity in the U.S. was generated from nuclear energy, according to the U.S. Energy Information Administration. Last week, Granholm was in Michigan following the Energy Department's approval of a $1.52 billion loan to restart a nuclear power plant. If the DOE can get tech companies on board with nuclear power, it could both address the power needs of AI computing and accelerate the use of clean energy in the United States.
Some of these big tech companies have already started investing heavily in nuclear fusion (yes, fusion, even though no commercially viable fusion reactor exists today). Last year, Microsoft signed a deal to purchase power from a nuclear fusion generator being developed by Helion Energy. Considering that Microsoft and OpenAI have reportedly been discussing a new supercomputer that could consume "at least several gigawatts" of power, the need for clean, accessible power is clearly a priority.
Axios suggests another option is nuclear fission using small modular reactors. That can be an expensive route, however, and Granholm said the DOE is "trying to crack the code" to lower costs so that companies are more willing to consider the reactors.
Companies like Microsoft, Google, OpenAI, Amazon, and Meta clearly aren't slowing down their data center buildouts or their focus on AI research as it becomes the next big thing. Nvidia and AMD chips, as well as custom silicon from other companies, require plenty of power: Nvidia's next-gen Blackwell GB200 NVL72, for example, could consume well over 100 kW per rack. When tens of thousands, or even millions, of chips are packed into a single data center, megawatts or even gigawatts of accessible power may be required. The need for power definitely isn't going away.
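For a rough sense of scale, here is a minimal back-of-envelope sketch in Python. The 72-GPU rack size and roughly 120 kW per-rack draw reflect the NVL72 figures mentioned above; the fleet sizes and the 1.2 power-overhead factor are illustrative assumptions, not reported numbers.

```python
# Back-of-envelope estimate of data center power draw for a large GPU fleet.
# Assumptions: 72 GPUs per GB200 NVL72 rack, ~120 kW per rack ("well over 100 kW"),
# and a 1.2 overhead factor for cooling and facility losses (assumed, not reported).

GPUS_PER_RACK = 72          # Nvidia GB200 NVL72: 72 Blackwell GPUs per rack
POWER_PER_RACK_KW = 120.0   # assumed round figure for per-rack draw
OVERHEAD_FACTOR = 1.2       # assumed cooling/facility overhead (PUE-style multiplier)

def estimated_site_power_mw(total_gpus: int) -> float:
    """Rough total site power in megawatts for a given number of GPUs."""
    racks = total_gpus / GPUS_PER_RACK
    it_power_kw = racks * POWER_PER_RACK_KW
    return it_power_kw * OVERHEAD_FACTOR / 1000.0  # kW -> MW

if __name__ == "__main__":
    for gpus in (10_000, 100_000, 1_000_000):
        print(f"{gpus:>9,} GPUs -> ~{estimated_site_power_mw(gpus):,.0f} MW")
    # Under these assumptions, about a million GPUs works out to roughly 2 GW,
    # consistent with the gigawatt-scale demand described above.
```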