GamesRadar
Technology
Duncan Robertson

Micron wants your next GPU to have 96GB of VRAM in it, but I don't really know who it's expecting will make it for you

A review photo of Crucial's DDR5 Pro RAM next to an RTX 5080 review image.

"The next era of PC performance will be defined not by more compute, but by memory scale," begins a new blog post from Micron, formerly the maker of some of the best RAM for gaming.

The blog post really begins as it means to go on, and by that I mean it's horrifically tone-deaf. For context, we're in the middle of a global RAM shortage, and about two months ago, Micron decided to shut down its consumer brand, Crucial, meaning gamers and PC builders now have even fewer products to try to get their hands on.

The blog post, entitled "The new performance bottleneck: How more GPU memory unlocks next-gen gaming and AI PCs," essentially shines a light on the importance of memory for consumers who are trying to get true next-generation performance out of their machines. It talks about Micron's latest evolution of GDDR7 VRAM density and bandwidth in the best graphics cards, and it's pretty much Micron admitting that RAM and VRAM are going to be pivotal for the best gaming PCs going forward:

(Image credit: Future / Duncan Robertson)

"Memory capacity and efficiency now determine how smoothly a system can deliver immersive gameplay, intelligent creation tools, and real time simulation, making memory a foundational enabler of next generation visual computing," the blog post summarizes.

If you were in any doubt about how screwed up the current memory demand situation is, it's being heavily speculated that when Nvidia launches its next generation of graphics cards, it won't even supply the VRAM for board partners (Asus, Gigabyte, PNY, MSI, and other companies that manufacture GPUs) to sell onward to consumers. That will almost certainly mean less VRAM in each new GPU SKU, but it will also mean the cost of manufacturing lands at the door of those smaller brands instead of Nvidia, and consumers will then need to pay even more for a new graphics card.

(Image credit: Future / Phil Hayton)

To be frank, Micron's blog post seems completely deluded and reads more like a B2B appeal to discrete GPU makers to use its VRAM over Samsung and SK Hynix. For us consumers, it's talking about a future that no one will be able to afford, given the current climate: "Micron's new 24Gb density enables up to 96GB of graphics memory, giving GPUs significantly more space for high-resolution textures, expansive worlds, and advanced visual effects," it continues.
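For what it's worth, the arithmetic behind that 96GB figure does check out, but only for a workstation-class board. A quick back-of-envelope sketch, assuming 24Gb (3GB) dies, 32-bit-wide GDDR7 devices, a 512-bit memory bus, and clamshell mounting (two dies per placement, the setup used on top-end professional cards):

```python
# Rough check of the "24Gb density enables up to 96GB" claim.
# Assumptions (not from Micron's post): 512-bit bus, x32 GDDR7 devices,
# clamshell mounting with two dies sharing each 32-bit channel.
GBIT_PER_DIE = 24
GB_PER_DIE = GBIT_PER_DIE / 8          # 24 gigabits = 3 gigabytes per die
BUS_WIDTH_BITS = 512                   # flagship-class memory bus
BITS_PER_DEVICE = 32                   # GDDR7 devices are typically x32

placements = BUS_WIDTH_BITS // BITS_PER_DEVICE  # 16 device placements
dies = placements * 2                           # clamshell doubles to 32 dies
total_gb = dies * GB_PER_DIE

print(f"{dies} dies x {GB_PER_DIE} GB = {total_gb} GB")  # 32 dies x 3.0 GB = 96.0 GB
```

Note what it takes to get there: a 512-bit bus and 32 memory dies. A consumer card like the RTX 5080, with its 256-bit bus and no clamshell, would top out at a quarter of that with the same dies, which is rather the point of the paragraphs that follow.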

Again, just to pull our feet back down to earth before we start dreaming up a reality where 96GB of VRAM is anywhere close to standard for graphics cards, let me put things in context. The RTX 5080, one of Nvidia's highest-end GPUs from this generation, only has 16GB of GDDR7 VRAM. The mid-range RTX 5070 drew plenty of controversy for only having 12GB. The AMD Radeon RX 7900 XTX and RTX 4090 came out before all of that and went extremely gung-ho for native performance, and even they only have 24GB of VRAM.

(Image credit: Nvidia)

In its latest range of 50 Series GPUs, Nvidia made it seem like it was fighting tooth and nail to part with as little VRAM as possible, which resulted in an emphasis on DLSS upscaling, anger over "fake frames", and a lot of disgruntled PC builders switching to AMD graphics cards.

In other words, the thought of a GPU that uses even 30GB of VRAM is laughable on its own, and that's before you take into account that this is being dreamed up by Micron, the brand that abandoned 30 years' worth of consumer loyalty to make a quick buck building data centers for AI companies.

And none of that mentions the other elephant in the room, which is that hardly any games today actually utilize that much VRAM. Maybe they would if the majority of game optimization today wasn't ignored in favor of AI upscaling slapping a bandage over performance issues, but that isn't the case. Most games today are horrendously optimized for PC, and my personal take is that it's because DLSS and FSR are a much less costly way to ensure a game runs well than the launch delays and polish time game devs would otherwise need.

(Image credit: Future / Duncan Robertson)

But Micron doesn't seem to realize that: "To keep these visual pipelines running efficiently, the memory subsystem must deliver data rapidly and consistently," the blog post says. Yes, Micron, couldn't agree more, but even in an ideal world where Nvidia wants to give us more native performance, who is going to make that memory for consumers while you're busy servicing AI companies?

Micron seems to be particularly idealistic about the future of computing technology. The other week, another Micron blog post detailed that the company is working on producing the world's first Gen 6 SSD. Of course, this is only aimed at AI data centers for now. I've reviewed multiple Gen 5 SSDs, and for years, those have still been pointlessly fast (and expensive) for the majority of applications and games. We're not even close to that type of storage being the norm for consumers yet, and Micron seems intent on moving on to the next big thing.

Well, Micron, it's nice to dream, but as someone currently covering a memory shortage that likely won't be going away for the next three years, I'd suggest we maybe keep expectations a bit more down to earth so as not to incite gamer rage.


For more on the best gaming PCs, check out the best CPU for gaming, the best computer speakers, and the best gaming PCs in the UK.
