TechRadar
Keumars Afifi-Sabet

Dell cosies up with AMD to expand its generative AI portfolio to offer more choice to customers — it's also embracing standards-based networking, a thinly veiled dig at Nvidia

Dell PowerEdge XE9680.

Dell has added an AMD-powered server to its high-performance computing portfolio for AI workloads.

Alongside the Nvidia-powered version, customers will soon be able to snap up a new version of the Dell PowerEdge XE9680 fitted with eight AMD Instinct MI300X accelerators. The configuration offers 1.5TB of high-bandwidth memory (HBM3) and over 21 petaFLOPS of performance, helping businesses train and run their own in-house large language models (LLMs).

Customers can also scale the systems they deploy via AMD's Global Memory Interconnect (xGMI) standard, and connect AMD's GPUs over an Ethernet-based AI fabric with a Dell PowerSwitch Z9664F-ON. The launch follows Dell's release of a version of the unit fitted with Nvidia H100 GPUs earlier this year.

Dell adopts an open approach to AI infrastructure

This comes alongside a new framework called Dell Validated Design for Generative AI with AMD. It's a mouthful, but it is essentially a blueprint for organizations that want to run LLMs on their own hardware and networking architecture.

Dell customers can use the validated design to build generative AI platforms, with guidance covering integration of the technology as well as physical installation and performance optimization.

It also draws on AI frameworks powered by AMD ROCm, an open source stack of drivers, development toolkits and APIs compatible with AMD Instinct accelerators. These include popular frameworks such as PyTorch, TensorFlow and OpenAI Triton, all of which are natively supported on the PowerEdge XE9680 fitted with the AMD accelerators.

Dell's continued push for standards-based networking, as a member of the Ultra Ethernet Consortium (UEC), means it has adopted a more open approach than Nvidia's.

Although Nvidia is an AI infrastructure industry leader, AMD takes a different tack, advocating an open Ethernet approach to AI in which switches from different vendors can interoperate on the same system. Likewise, Dell wants businesses to take an open approach across the compute, fabric and storage components needed to power generative AI models in-house.

The new hardware and services that comprise Dell's latest AI push are set to be available during the first half of next year, the company said. 
