Tom’s Hardware
Technology
Jowi Morales

Nvidia's first AI NPC debuts in Mecha BREAK — Minitron 4B model only requires 2GB of VRAM

A player talking to an AI NPC in Mecha BREAK.

Nvidia’s AI-powered NPC engine debuted in Mecha BREAK, a multiplayer mech-based combat game. The technology behind this AI NPC is Nvidia’s Avatar Cloud Engine (ACE) for games, which launched at Computex 2023. ACE uses custom AI models to let developers create intelligent NPCs (non-player characters) that players can interact with naturally.

What’s more interesting about the Mecha BREAK demo is that the AI processing is done locally if you have any RTX GPU, from the entry-level RTX 2060 up to the current flagship RTX 4090 and beyond. To accommodate this wide range of graphics cards, the ACE implementation in Mecha BREAK uses Nvidia’s Minitron 4B model, which requires only 2GB of VRAM. Minitron 4B is derived from the much larger Nemotron model, which requires four A100 GPUs to run. That makes Minitron 4B extremely lightweight compared to the original, which most likely needs at least 160GB of VRAM (four A100s at 40GB apiece).
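
To put those figures in perspective, here is a rough back-of-envelope estimate of weight memory (our own sketch, not anything Nvidia has published; the 4-bit precision and the 80-billion-parameter size used for the larger model are assumptions chosen purely to illustrate the gap):

# Rough weight-memory estimate; the precision and the larger model's
# parameter count are assumptions, not official Nvidia figures.
def weight_memory_gb(num_params: float, bits_per_param: float) -> float:
    """Approximate memory needed just to hold the weights, in gigabytes."""
    return num_params * bits_per_param / 8 / 1e9

# Minitron 4B at an assumed 4 bits per weight comes out to ~2GB,
# which lines up with the stated VRAM requirement.
print(f"Minitron 4B @ 4-bit: {weight_memory_gb(4e9, 4):.1f} GB")

# An illustrative 80B-parameter model at FP16 needs ~160GB,
# roughly four 40GB A100s' worth of memory.
print(f"80B model @ FP16: {weight_memory_gb(80e9, 16):.1f} GB")

Note that this counts only the weights; activations and the rest of the game still need their own memory, so real-world requirements run a bit higher.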

Although Mecha BREAK is the first game to use Nvidia ACE technology, it’s more of a proof of concept, as the tech is applied to only a single NPC rather than across the entire game. The local model is also less powerful and robust than the cloud-based version built on the full Nemotron model. However, because it runs locally, it responds much faster (in about 300 ms) than the cloud-based AI, which takes a few seconds to answer.

We’re also unsure how the NPC would behave without an Nvidia RTX GPU in your gaming PC. It’s possible the game would fall back to cloud-based ACE if you don’t have a Team Green graphics card, or a non-AI version could take over if the game detects an AMD or Intel GPU (or a system that relies solely on integrated graphics).

With Nvidia ACE, the AI NPC in Mecha BREAK can understand the player’s voice input, so conversations are no longer limited to a list of preprogrammed responses. Nvidia’s demo video also showed the AI NPC understanding players with different accents, making it accessible to players from a wider range of backgrounds. However, the AI NPC’s responses still feel like those of a typical NPC rather than a real human being, likely because it relies on a lightweight model that can run on many devices instead of a more capable cloud-based LLM.

This is the first real-world deployment of Nvidia ACE, and we’re sure many players will want to experiment with the AI and see how far they can push it. The technology should gain some traction, especially since Nvidia made it available even on older RTX GPUs. If this deployment proves successful and players start demanding AI NPCs in more games, we might see a future where NPCs develop their own backstories and personalities through the power of AI.
