In an interview cited by The Wall Street Journal earlier this week, Rene Haas, CEO of Arm, warned of AI's "insatiable" thirst for electricity, stating that AI data centers could grow from roughly 4% of current U.S. power grid usage to as much as 25%.
Haas may himself have been citing a January report from the International Energy Agency stating that ChatGPT consumes roughly 2.9 watt-hours of electricity per request, about 10 times as much as a standard Google search. Thus, if Google made the full hardware and software switch for its search engine, it would consume at least 11 terawatt-hours of electricity per year, up from roughly 1 TWh today.
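As a back-of-envelope check on those figures (assuming, as the tenfold ratio implies, roughly 0.3 Wh per standard Google search; the query-volume figure is derived, not from the report):

```python
# Back-of-envelope check of the IEA-derived search figures.
# Assumptions: ~0.3 Wh per standard search (implied by the tenfold
# ratio) and ~1 TWh/year for Google search today.
WH_PER_SEARCH = 0.3        # watt-hours, standard Google search
WH_PER_CHATGPT = 2.9       # watt-hours, ChatGPT-style request
CURRENT_TWH = 1.0          # Google search today, terawatt-hours/year

queries_per_year = CURRENT_TWH * 1e12 / WH_PER_SEARCH    # ~3.3 trillion
ai_search_twh = queries_per_year * WH_PER_CHATGPT / 1e12

print(f"~{queries_per_year:.2e} queries/year")
print(f"AI-powered search: ~{ai_search_twh:.1f} TWh/year")
```

That lands at roughly 9.7 TWh per year; the gap to the report's "at least 11 TWh" presumably reflects additional datacenter overhead or higher query-volume assumptions.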
The original report notes that 2.9 watt-hours is enough to run a 60-watt lightbulb for just under three minutes. Mirroring that roughly tenfold gap between ChatGPT queries and standard searches, industry-wide power demand for artificial intelligence is expected to increase tenfold.
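The lightbulb equivalence checks out arithmetically (time is simply energy divided by power):

```python
# 2.9 Wh pushed through a 60 W bulb: time = energy / power.
ENERGY_WH = 2.9
BULB_WATTS = 60

hours = ENERGY_WH / BULB_WATTS   # ~0.048 hours
minutes = hours * 60             # ~2.9 minutes, "just under three"

print(f"{minutes:.1f} minutes")
```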
These statements were made ahead of an expected U.S.-Japan partnership on AI and alongside recent developments like OpenAI's Sora, the current version of which Factorial Funds estimates requires one Nvidia H100 GPU running for an hour to generate five minutes of video. Grok 3 has also been estimated to require 100,000 Nvidia H100s just for training. A single 700-watt Nvidia H100 can consume roughly 3740 kilowatt-hours per year.
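The H100 figure can also be sanity-checked: a 700 W part running flat-out all year would draw about 6130 kWh, so 3740 kWh implies roughly 61% average utilization (the utilization factor and the fleet-scale extrapolation are inferences here, not figures from the reporting):

```python
# Annual energy for one Nvidia H100, and the utilization implied
# by the ~3740 kWh/year figure cited above.
TDP_KW = 0.7               # H100 board power, kilowatts
HOURS_PER_YEAR = 8760

max_kwh = TDP_KW * HOURS_PER_YEAR      # ~6132 kWh at 100% duty cycle
implied_util = 3740 / max_kwh          # ~0.61

# Extrapolated to a Grok-3-sized fleet of 100,000 H100s:
fleet_twh = 100_000 * 3740 / 1e9       # kWh -> TWh, ~0.37 TWh/year

print(f"max {max_kwh:.0f} kWh/yr, implied utilization {implied_util:.0%}")
print(f"100k-GPU fleet: ~{fleet_twh:.2f} TWh/year")
```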
Without major improvements in efficiency and/or much stricter government regulation, Haas says, the current trend is "hardly very sustainable," and he might be correct.
The U.S. Energy Information Administration (EIA) reports that the United States generated a total of 4.24 trillion kilowatt-hours, or 4240 terawatt-hours, of electricity in 2022, with only about 22% of that coming from renewables. Total consumption was 3.9 trillion kWh, or 3900 terawatt-hours, of the available ~4240.
That leaves roughly 340 terawatt-hours of headroom at current levels, of which AI-powered search alone could claim 11 within the next decade. Any sustainability calculus must also account for the likely growing demands of other industries and the balance of renewable to non-renewable resources. Given that the cost of power has nearly doubled since 1990 (per Statista), perhaps calls for more regulation are justified.
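Putting the EIA numbers together, a minimal sketch of the headroom arithmetic (using the 2022 figures above; both generation and demand will of course shift over the decade):

```python
# Grid headroom arithmetic from the 2022 EIA figures cited above.
GENERATED_TWH = 4240       # total U.S. generation, 2022
CONSUMED_TWH = 3900        # total U.S. consumption, 2022
AI_SEARCH_TWH = 11         # projected AI-powered search consumption

headroom_twh = GENERATED_TWH - CONSUMED_TWH    # ~340 TWh
share = AI_SEARCH_TWH / headroom_twh           # ~3.2% of headroom

print(f"headroom: {headroom_twh} TWh; AI search would claim {share:.1%}")
```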
Of course, outlets like The New York Times are outright suing OpenAI and Microsoft, so the current AI industry is hardly without legal challenges. Haas expressed hope that the international partnership between Japan and the U.S. may yet bring down these dramatically high power projections. However, corporate greed and compute demand are also international, so only time will tell.