SilverStone’s most powerful PSU, the HELA 2050R, just received a 450-watt upgrade. This new model is called the HELA 2500R and has four 12V-2x6 power connectors and a maximum rated continuous power output of 2,500 watts. This means you’ll need a special 16-amp, 240-volt outlet to run this beast, as the average 110-volt household outlet can only accommodate 1,800 watts.
You can attach up to four RTX 4090s to this power supply using its four native 16-pin connectors, making it easier to run these high-powered cards without using up the eight-pin slots. Even with Nvidia's top GPUs filling every 16-pin slot, the PSU can comfortably run them all: their combined 1,800-watt TDP (450 watts each) still leaves 700 watts of headroom for the rest of your PC. Aside from the four 12V-2x6 GPU connectors, you also get four SATA, seven 8-pin PCIe, and the motherboard connectors.
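The headroom figure works out as simple subtraction. A quick sketch using the article's numbers (the GPU figure is the stock RTX 4090 TDP):

```python
# Back-of-the-envelope power budget for a four-GPU HELA 2500R build.
PSU_CAPACITY_W = 2500   # rated continuous output
GPU_TDP_W = 450         # stock RTX 4090 TDP
NUM_GPUS = 4

gpu_draw = GPU_TDP_W * NUM_GPUS        # 1,800 W for four cards
headroom = PSU_CAPACITY_W - gpu_draw   # 700 W left for CPU, drives, fans

print(f"GPU draw: {gpu_draw} W, headroom: {headroom} W")
```

In practice transient spikes can exceed TDP, so real builds keep some of that 700 watts in reserve rather than budgeting it all.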
This is SilverStone’s most potent PSU ever, but it’s not the highest-output unit you can buy today: Super Flower recently launched a 2,800-watt PSU with the same number of connectors as the HELA 2500R, plus extra SATA/peripheral slots.
These new, high-powered PSUs aren’t really made for gamers with deep pockets (although they can certainly use them). Instead, they cater to institutions and professionals who need multiple high-end GPUs for compute workloads like AI training.
AI processing relies on specialized hardware that you typically won’t find in CPUs. And even though we recently saw the launch of Snapdragon X processors with onboard NPUs that can hit 45 TOPS, Nvidia says this isn’t enough performance to handle advanced AI tasks. The graphics card company claims its GPUs still perform better at AI processing, which is why Nvidia is seeing massive revenues and quickly becoming one of the most valuable companies in the world.
However, graphics cards are power-hungry, meaning companies that want on-device AI processing need these monster PSUs to power their workstations. In fact, Meta founder Mark Zuckerberg claims that AI growth will be constrained by power limits. If a business runs a 2,500-watt machine just 50% of the time, it will consume almost 11,000 kWh per year, which is about 500 kWh more than the average annual U.S. household power consumption.
Of course, any business that invests thousands of dollars in high-powered AI systems will want to get its money’s worth, so you can expect these PSUs to run 24/7, giving just one HELA 2500R the power requirements of roughly two households. And if a company invests in ten of these systems, we’re looking at 20 houses’ worth of power consumption on a single premises.
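The annual consumption figures above follow directly from the PSU's rating and a duty cycle. A minimal sketch of the arithmetic, using the article's ~10,500 kWh average-household figure for comparison:

```python
# Annual energy use of a 2,500 W system at different duty cycles.
PSU_W = 2500
HOURS_PER_YEAR = 24 * 365  # 8,760 hours

def annual_kwh(watts, duty_cycle):
    """Energy in kWh for a load running `duty_cycle` fraction of the year."""
    return watts * HOURS_PER_YEAR * duty_cycle / 1000

half_time = annual_kwh(PSU_W, 0.5)  # 10,950 kWh: the "almost 11,000 kWh"
full_time = annual_kwh(PSU_W, 1.0)  # 21,900 kWh: roughly two households

print(f"50% duty: {half_time:,.0f} kWh/yr; 24/7: {full_time:,.0f} kWh/yr")
```

Note this assumes the machine actually draws its full 2,500-watt rating whenever it runs; real workloads would average somewhat less.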
With some experts projecting that AI technologies and applications could consume a quarter of America’s power production, we might soon see data centers building their own nuclear power plants to meet their electricity needs. Unless the national grid keeps up with the jump in power demand, the U.S. risks falling behind other countries in the AI race.