AI processing can take a huge amount of computing power, but by the looks of this latest joint project from the Jülich Supercomputing Centre and French computing provider Eviden, power will not be in short supply. The two organisations have signed a deal to build a new data centre to house an exascale-class supercomputer that they say will be capable of one quintillion floating-point operations per second, or one exaFLOPS, of HPC (high performance computing) output.
The JUPITER supercomputer (if you're wondering, that stands for Joint Undertaking Pioneer for Innovative and Transformative Exascale Research, which as backronyms go is spectacular) was commissioned by Europe's supercomputing consortium, EuroHPC JU (via Techspot), in October 2023. The first of three such machines planned to be built, it will be housed in the new facility in Germany, with the aim of being operational within a year.
JUPITER will be powered by multiple Nvidia GH200 Grace Hopper Superchips and will be put to work primarily on AI training, where it is set to become the world's most powerful AI system, with a staggering 90 exaFLOPS of performance expected when training AI models.
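To put those headline numbers in rough context, here's a quick back-of-the-envelope sketch in Python (illustrative only, not official JUPITER specifications, and the 1 teraFLOPS "gaming GPU" is a hypothetical round number rather than any particular card). It's also worth keeping in mind that AI-performance figures like that 90 exaFLOPS one are generally quoted at lower numerical precision than the maths used for traditional HPC rankings, which is a large part of why the two figures differ so much.

```python
# Rough scale check (illustrative assumptions only, not official JUPITER specs).
# One exaFLOPS is 10**18 floating-point operations per second.
EXA = 10**18

hpc_flops = 1 * EXA    # ~1 exaFLOPS of traditional HPC output
ai_flops = 90 * EXA    # ~90 exaFLOPS quoted for (lower-precision) AI training

# How long would a hypothetical 1 teraFLOPS gaming GPU take to match
# a single second of JUPITER's HPC output?
gpu_flops = 1e12
seconds = hpc_flops / gpu_flops
print(f"{seconds:,.0f} seconds, or about {seconds / 86_400:.0f} days")  # ~12 days
```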
The facility will be built from around 50 container modules installed over 2,300 square metres, and this modular approach to housing the monster machine is hoped to cut the project's delivery time by 50 percent, as well as making future upgrades easier to implement.
The containers comprise 20 IT modules, 15 power modules and 10 logistics modules. The hope is that the JUPITER supercomputer contained within will be operational in under a year and delivered at a much lower cost, thanks to the modular nature of the site.
If construction goes ahead as planned, this will be the first exascale-class supercomputer in Europe. However, the title of world's fastest supercomputer in terms of raw HPC speed is hotly contested.
The current holder of that title is the Hewlett Packard Enterprise Frontier, a machine capable of 1.102 exaFLOPS, though it may not hold the top spot for much longer. El Capitan is currently under construction at Lawrence Livermore National Laboratory in California, and it's designed to be capable of 2 exaFLOPS when completed, which is currently estimated to be sometime in mid-2024.
Nevertheless, the figures for the new European supercomputer look impressive, and if the modular construction approach proves successful it may serve as a model for speeding up similar projects yet to begin construction, like the proposed exascale system planned for Edinburgh.
As we move into a world where huge amounts of processing power become not just a nice-to-have but a necessity for AI development, quantum computing and more, it seems construction techniques are beginning to advance in tandem with the power of the computers themselves.
Now, who's going to let us benchmark one? I already have some ideas…