Tom’s Hardware
Technology
Anton Shilov

Elon Musk's xAI Colossus 2 is nowhere near 1 gigawatt capacity, satellite imagery suggests — despite claims, site only has 350 megawatts of cooling capacity

xAI Colossus Memphis supercluster.

Despite Elon Musk's announcement on Monday that xAI's Colossus 2 data center had reached a 1 GW scale, the supercomputer is not even close to that, a satellite image published by Epoch AI researchers reveals.

Based on 550,000 Nvidia Blackwell AI accelerators, xAI's Colossus 2 is advertised as the industry's first AI facility to consume one gigawatt of power for AI inference and training. For now, though, the data center codenamed 'Macrohard' purportedly has only about 350 MW of cooling capacity, not nearly enough to cool 550,000 Blackwell GPUs at full power, even in winter. As a result, Musk's Jan. 19 announcement may have been premature, to put it mildly. Epoch AI expects the supercomputer to reach 1 GW by May.
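
For a rough sense of why 350 MW falls short, the back-of-the-envelope sketch below assumes an illustrative per-GPU draw of about 1.2 kW for a Blackwell-class accelerator and a 1.3x facility overhead factor; neither figure comes from xAI or Epoch AI, but any plausible values land well above 350 MW.

```python
# Back-of-the-envelope check of why 350 MW of cooling falls short of
# 550,000 Blackwell GPUs. The per-GPU draw and overhead factor are
# illustrative assumptions, not figures from xAI or Epoch AI.

GPU_COUNT = 550_000
WATTS_PER_GPU = 1_200      # assumed draw per Blackwell-class accelerator at full power
OVERHEAD_FACTOR = 1.3      # assumed extra load for CPUs, networking, storage, power losses

gpu_load_mw = GPU_COUNT * WATTS_PER_GPU / 1e6
total_load_mw = gpu_load_mw * OVERHEAD_FACTOR

print(f"GPU load alone:        ~{gpu_load_mw:.0f} MW")    # ~660 MW
print(f"With facility overhead: ~{total_load_mw:.0f} MW")  # ~860 MW, well above 350 MW of cooling
```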

Interestingly, when Grok, xAI's chatbot, was asked about Colossus 2, it acknowledged that the supercomputer's launch may be phased. It also recalled media reports claiming that xAI may be using unpermitted gas turbines for extra power.

At the pace at which Colossus 2 is currently being fitted with cooling systems, the new supercomputer will become a gigawatt-class machine sometime in May, according to the research. The machine was once expected to use a million GPUs, and Musk has since said it could scale to 1.5 GW or even 2 GW in time. When that might happen is unknown, as xAI still needs to obtain enough AI servers, procure enough power, and then install the matching cooling systems.

Even though xAI's Colossus 2 will hit its 1 GW milestone later than expected, it is still projected to be ahead of rival facilities from Amazon and OpenAI, according to a graph by Epoch AI. The company will have more resources for AI training, AI inference, and agentic AI workloads than its rivals for some time.

Based on the graph's reference lines, the whole city of San Diego averages around 800 MW, Amsterdam consumes roughly 1.6 GW, and Los Angeles about 2.4 GW, which puts modern frontier AI data centers in the same class as major cities. When fully equipped and ramped to roughly 1.3–1.4 GW, xAI's Colossus 2 would consume about 1.7 times San Diego's power, slightly less than Amsterdam's, and around 60% of Los Angeles'.
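
For readers who want to check those ratios, here is a minimal sketch using the approximate reference figures quoted above, with Colossus 2 taken at the midpoint of the roughly 1.3–1.4 GW estimate.

```python
# Quick check of the city comparisons, using the approximate reference
# values quoted above, all in megawatts.

colossus_2 = 1_350   # midpoint of the roughly 1.3-1.4 GW estimate
san_diego = 800
amsterdam = 1_600
los_angeles = 2_400

print(f"vs. San Diego:   {colossus_2 / san_diego:.1f}x")   # ~1.7x
print(f"vs. Amsterdam:   {colossus_2 / amsterdam:.2f}x")   # ~0.84x, slightly less
print(f"vs. Los Angeles: {colossus_2 / los_angeles:.0%}")  # ~56%, i.e. around 60%
```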
