With the launch of DGX Cloud, Nvidia wants to bring AI development to firms everywhere at a predictable monthly price, offering immediate access to the infrastructure and software required for advanced model training.
Nvidia hopes DGX Cloud will prove a valuable tool for companies looking to train their own generative AI models; it is an evolution of the same infrastructure OpenAI used to train the hugely popular ChatGPT.
According to the company's press release, every enterprise will be able to access its own AI supercomputer through a web browser, eliminating the need to acquire costly on-premises hardware.
Nvidia DGX Cloud for AI training
Nvidia CEO Jensen Huang described this era as “the iPhone moment of AI”, one in which companies are scrambling to be the first to offer something new. Renting DGX Cloud clusters by the month gives startups a more cost-effective and accessible way to get on board.
Each DGX Cloud instance comprises eight Nvidia H100 or A100 Tensor Core GPUs with 80GB of memory apiece, for a total of 640GB of GPU memory per node.
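For readers curious what that looks like from inside a training job, here is a minimal sketch (illustrative only, not Nvidia-specific tooling) that uses PyTorch to enumerate the GPUs visible on a node and total their memory; on a DGX Cloud instance it should report eight devices and roughly 640GB in aggregate.

```python
# Illustrative sketch: list the GPUs visible to this process and sum their memory.
# Assumes a CUDA-enabled PyTorch install on the node.
import torch

def summarize_gpus():
    if not torch.cuda.is_available():
        print("No CUDA devices visible to this process.")
        return

    total_bytes = 0
    for i in range(torch.cuda.device_count()):
        props = torch.cuda.get_device_properties(i)
        total_bytes += props.total_memory
        print(f"GPU {i}: {props.name}, {props.total_memory / 1024**3:.0f} GB")

    # On an 8 x 80GB H100/A100 node this should come to roughly 640 GB.
    print(f"Total GPU memory: {total_bytes / 1024**3:.0f} GB")

if __name__ == "__main__":
    summarize_gpus()
```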
In due course, the company hopes to offer the service through various cloud providers; initially, however, DGX Cloud launches on Oracle Cloud Infrastructure, which supports superclusters scaling to more than 32,000 GPUs.
Microsoft Azure is lined up to host DGX Cloud as early as next quarter, with Google Cloud and other providers to follow.
Biotechnology company Amgen, insurance technology company CCC Intelligent Solutions, and digital business platform provider ServiceNow are cited as some of DGX Cloud’s earliest adopters.
US pricing has been announced, with individual instances starting at $36,999 per month, though those looking to access DGX Cloud through a provider other than Oracle will need to wait. Official pricing for other regions, including the UK and Australia, is yet to be announced.