TechRadar
Ross Kelly
Want your server to access more than 100,000 DIMM slots in one go? This Korean startup claims its CXL 3.1-based technology can help you scale to more than 100PB of RAM — but it could cost nearly $5 billion

Image: V-color RGB DDR5 CUDIMM memory sticks lying flat on a surface.

Ever imagined drawing on up to 100 petabytes of RAM? Well, this startup could be the key to unlocking groundbreaking memory capabilities.

Korean fabless startup Panmnesia unveiled what it described as the world’s first CXL-enabled AI cluster featuring CXL 3.1 switches during the recent 2024 OCP Global Summit.

The solution, according to Panmnesia, has the potential to markedly improve the cost-effectiveness of AI data centers by harnessing Compute Express Link (CXL) technology.

Scalable - but costly

In an announcement, the startup revealed the CXL-enabled AI cluster is built around its main products, the CXL 3.1 switch and CXL 3.1 IP, both of which support connections between the CXL memory nodes and GPU nodes responsible for storing large datasets and accelerating machine learning.

Essentially, this will enable enterprises to expand memory capacity by adding extra memory and CXL devices without having to purchase costly server components.

The cluster can also be scaled to data center levels, the company said, thereby reducing overall costs. The solution also supports connectivity between different types of CXL devices and is able to connect hundreds of devices within a single system.

The cost of such an endeavor could be untenable

While drawing upon 100PB of RAM may seem like overkill, in the age of increasingly cumbersome AI workloads, it’s not exactly out of the question.

In 2023, Samsung revealed it planned to use its 32GB DDR5 DRAM memory die to create a whopping 1TB DRAM module. The motivation behind this move was to help contend with increasingly large AI workloads.

While Samsung has yet to provide a development update, we do know the largest RAM modules Samsung has previously produced were 512GB in size.

First unveiled in 2021, these were intended for next-generation servers powered by top-of-the-range CPUs (at least by 2021 standards), including AMD EPYC ‘Genoa’ CPUs and Intel Xeon Scalable ‘Sapphire Rapids’ processors.

This is where cost could be a major inhibiting factor for the Panmnesia cluster, however. Pricing on comparable products, such as Dell’s 370-AHHL 512GB memory module, currently stands at just under $2,400.

That would require significant investment from an enterprise by any standards. If one were to harness Samsung’s top-end 1TB DRAM module instead, the costs would simply skyrocket, given its expected price last year stood at around $15,000.
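As a rough sanity check on those numbers, the DRAM bill alone for a 100PB pool can be sketched from the module prices cited above (binary units assumed; this is an illustrative estimate only, and the near-$5 billion headline figure would presumably also cover switches, servers, and networking):

```python
# Back-of-envelope DRAM cost for a 100PB memory pool, using the
# module prices mentioned above: Dell 512GB at ~$2,400 and
# Samsung's 1TB module at ~$15,000. Illustrative only.

PB = 2**50  # bytes per pebibyte
GB = 2**30  # bytes per gibibyte

capacity = 100 * PB

modules_512 = capacity // (512 * GB)   # modules needed at 512GB each
cost_512 = modules_512 * 2_400         # DRAM cost at ~$2,400/module

modules_1tb = capacity // (1024 * GB)  # modules needed at 1TB each
cost_1tb = modules_1tb * 15_000        # DRAM cost at ~$15,000/module

print(f"512GB modules: {modules_512:,} -> ${cost_512:,}")
print(f"1TB modules:   {modules_1tb:,} -> ${cost_1tb:,}")
# 512GB modules: 204,800 -> $491,520,000
# 1TB modules:   102,400 -> $1,536,000,000
```

Notably, the 512GB route needs over 200,000 modules — consistent with the headline’s claim of more than 100,000 DIMM slots — and runs to roughly half a billion dollars in memory alone, before any of the surrounding infrastructure.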
