Axios

Nvidia's race to outpace physics

Nvidia's chips are improving at such a staggering pace that it defies any historical comparison.

Why it matters: Without these gains — which are drawing increased attention as AI transforms society — physics would slam the brakes on the data center boom.

Driving the news: Nvidia CEO Jensen Huang said Monday he expects the company to reap "at least" $1 trillion in revenue for its newest chips through 2027.

  • It posted record sales and earnings last month, fueled by skyrocketing orders from Big Tech data center companies.

By the numbers: Nvidia has historically dominated the market. But its cumulative share has dropped from 100% in the first quarter of 2022 to 65% in the fourth quarter of last year, according to the research and consultancy firm SemiAnalysis.

Data: Epoch.ai; Note: Q4 2025 data for Nvidia and Google does not cover the entire quarter; Chart: Axios Visuals

The big picture: The AI boom runs on electricity — and Nvidia's chips determine how far that power goes.

  • Chips — flat, stamp-sized squares — are the beating heart of data centers.
  • Each new generation delivers dramatic gains in performance per chip — even as total AI energy demand keeps surging.

"Chips are being redesigned because efficiency determines how fast intelligence can scale," Huang wrote in a rare blog post published last week.

  • "Energy becomes central because it sets the ceiling on how much intelligence can be produced at all."

Yes, but: Nvidia faces a threat as the industry shifts from one type of AI — training — to another — inference. Its chips are optimized for the former.

  • "All this inference stuff is incredibly threatening to Jensen, because it's all efficiency-driven," Paul Kedrosky, a venture investor and fellow at MIT's Initiative on the Digital Economy, told the WSJ.
  • "He's desperately trying to find a way to extend the franchise into inference."

Zoom out: Energy efficiency has long been a boring but important component of technology.

  • We might think of energy-efficient appliances or a Toyota Prius — saving money and helping the planet.

But that's not what's happening with the AI boom. Energy efficiency is no longer a nice-to-have; it's a must-have.

  • Electricity is physically limited. AI demand appears unlimited.
  • That makes energy efficiency the backbone of society's astonishing growth in AI computing power.

The intrigue: It's like going from a Model T to a Tesla in under a decade — instead of more than a century.

  • If cars' fuel efficiency had improved as swiftly as chips, "we'd be driving to the moon and back in one gallon of gas," said Josh Parker, head of sustainability at Nvidia, the world's leader in AI computation.

Flashback: In 1865, British economist William Stanley Jevons observed that when England made coal steam engines more efficient, they actually used more — not less — coal.

  • This has become known as the Jevons paradox: gains in energy efficiency, a concept that inherently describes saving energy, often end up increasing energy demand.

AI is putting the Jevons paradox on steroids.

  • "The absolute footprint of AI, in terms of energy consumption, we do see it growing year over year, and we expect that trend to continue," Parker said.

How it works: Two essential factors in a chip's efficiency are how much energy it consumes and how it's cooled.

  • Physics dictates that the electricity powering a chip ultimately becomes heat, which then has to be removed.

Catch up fast: Cooling technologies broadly fall into two categories.

  • Traditional air-cooled data centers often rely on evaporative cooling, which can consume significant amounts of water.
  • Newer liquid-cooled systems can reduce that water use — though total demand still depends on design and location.

"If you have chips and servers, they're useless if you don't have power and you don't have cooling," said Rich Whitmore, who leads Motivair, Schneider Electric's liquid-cooling business, which works with Nvidia.

Zoom in: Each generation of Nvidia's chips — named after famous scientists — posts massive efficiency gains over the last.

  • The chip hitting the market today — called Blackwell — redesigned the whole architecture of computing to get more performance and efficiency, said Dion Harris, senior director of AI infrastructure at Nvidia.

What's next: If we went from a Model T to a Tesla in the past decade, imagining what comes next — in just the next few years — feels almost absurd.

  • "Maybe some kind of hovercraft," Parker said, joking (maybe).