Unless you've been living under a rock, you'll likely have noticed a trend in PC gaming in recent years—and that trend is upscaling. While AMD's compute-based FSR has improved over its various iterations, as we recently found in our testing, it's still not a patch on Nvidia's AI-powered DLSS for image quality. And according to Nvidia CEO Jensen Huang, we're now at a point where AI has become all but essential for next-generation graphics tech.
Speaking at the Goldman Sachs Communacopia + Technology Conference in San Francisco this week, Huang covered a range of topics, including the benefits of AI across various industries. When asked which of these use cases he was most excited about, he said:
"Well, in our company, we use it for computer graphics. We can't do computer graphics anymore without artificial intelligence. We compute one pixel, we infer the other 32. I mean, it's incredible.
"And so we hallucinate, if you will, the other 32, and it looks temporally stable, it looks photorealistic, and the image quality is incredible, the performance is incredible."
"Can't" is an interesting term to use. While you can of course create computer graphics without any AI involvement, what Huang is likely referring to here is the top end of graphics technology, which has become increasingly demanding on hardware—and thereby increasingly reliant on hardware-based AI upscaling like DLSS to run at acceptable frame rates.
"Computing one pixel takes a lot of energy. That's computation. Inferencing the other 32 takes very little energy, and you can do it incredibly fast. So one of the takeaways there isn't just about training the model...it's about using the model."
Reducing the load on a GPU by inferring pixels and creating extra frames via the use of AI, à la DLSS Frame Generation, has certainly changed gaming since its introduction. AMD's competing solution, FSR, is still a compute-based upscaling approach—although there has been speculation that RDNA 4 may introduce a hardware-based AI method to rival Nvidia and Intel's solutions.
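To get a feel for the arithmetic behind "compute one pixel, infer the rest", here's a minimal sketch of how the rendered-pixel fraction falls as upscaling and frame generation are combined. The specific ratios below are our own illustrative assumptions, not Nvidia's published math:

```python
# Illustrative sketch: what fraction of displayed pixels the GPU
# natively renders when upscaling and frame generation are combined.
# The factors used below are assumptions for illustration only.

def rendered_fraction(upscale_pixel_ratio: float, frame_multiplier: float) -> float:
    """Fraction of final displayed pixels the GPU computes directly.

    upscale_pixel_ratio: output pixels per rendered pixel (e.g. 4.0 for
        1080p -> 4K, since 3840 * 2160 / (1920 * 1080) == 4).
    frame_multiplier: displayed frames per rendered frame (e.g. 2.0 when
        every other frame is AI-generated).
    """
    return 1.0 / (upscale_pixel_ratio * frame_multiplier)

# Example: 4x pixel upscaling plus 2x frame generation means only
# one in every eight displayed pixels is rendered the traditional way.
print(rendered_fraction(4.0, 2.0))  # 0.125
```

Push either factor higher—more aggressive upscaling or more generated frames—and you quickly approach the sort of "one computed, 32 inferred" ratio Huang describes.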
Looking at the wider picture, however, AI of course has more practical benefits than simply making our games look better and run faster:
"If not for AI, the work that we're doing in robotics, digital biology...just about every tech bio company I meet these days [is] built on top of Nvidia", he continued.
"Small molecule generation, virtual screening. I mean, just that whole space is going to get reinvented for the very first time with computer-aided drug discovery because of artificial intelligence. So, incredible work being done there."
While it's easy to criticize many aspects of the AI boom—from the massive power requirements of the vast amounts of hardware used to train it, to AI search suggestions gone awry—when it comes to our beloved computer graphics, it's now difficult to imagine games without AI or AI-based upscaling in some form or fashion.
And as for things like biotech? Well, by the looks of what Jen-Hsun is saying here, AI may play a key role in the next generation of healthcare, too.
Continued graphical improvements, and better medical technologies? Perhaps this whole AI shebang has some real-world benefits after all. Now, if we could only make it more accurate instead of better at hallucinating, that'd be dandy.