When the Dreamcast came out in the fall of 1999, most people were enamored with Soulcalibur's mind-blowing visuals and the fact that Sega eclipsed EA's long-running Madden series with a single release. But while I certainly enjoyed hours of Soulcalibur, I spent hundreds of hours breeding and evolving Chao, the cute little virtual pets cleverly hidden in Sonic Adventure's Chao Garden. And 25 years later, virtual pets are making a major comeback in new and interesting ways.
In his weekly column, Android Central Senior Content Producer Nick Sutrich delves into all things VR, from new hardware to new games, upcoming technologies, and so much more.
The experience was made even better by the fact that I could put my Chao on the Dreamcast's Visual Memory Unit—VMU for short—and take them along with me for the day. I spent more than a few hours earning points to further evolve my pets during many of my 8th grade classes, and the following year, Sega released Seaman—one of the weirdest virtual pets of all time—which you could talk to (and which would talk back to you) via a controller-mounted microphone. They're still some of my favorite gaming memories of all time.
Fast forward more than two decades, and Sega might be a shell of its former self, but virtual pets are just as popular as they were back in the days of the Tamagotchi. Niantic, the folks behind the ever-popular Pokemon Go, released Peridot last year and recently followed up with Hello Dot, a VR tie-in on the Meta Quest that lets you play with your virtual pets in a palpable new way.
And some of the folks behind Job Simulator and Vacation Simulator are bringing modern LLM-based AI engines to NPCs and even virtual pets. That means the era of getting Leonard Nimoy to record hundreds or thousands of voice lines—as Sega did with Seaman—is no longer necessary. Virtual pets are back, and I couldn't be happier about it.
Give me a pet
My renewed interest in virtual pets began when my then-8-year-old son got a Tamagotchi for Christmas. A year later, Neko Atsume Kitty Collector made its mixed reality debut right after the Meta Quest 3 came out, and it had me dreaming of taking virtual pets on the go throughout the day and then coming home to play with them in VR.
That's where Peridot and Hello Dot come in. I reached out to Asim Ahmed, the global product marketing leader at Niantic, to find out more about the company's goals for Peridot and Hello Dot and whether the company was planning to further integrate both games into a cohesive virtual pet ecosystem.
I was elated with what I found out.
While most of Niantic's games are focused on getting people to "get outside more and explore the world around them," as Ahmed put it, Peridot does things a little differently. There's still that classic Niantic formula that's been around since the Ingress days where you're encouraged to walk around in the real world and discover new virtual creatures and items, but Peridot blends the real and virtual worlds in a way the company hasn't pushed since the original Pokemon Go days: through your smartphone's camera.
Peridot uses your phone's spatial understanding of the real world to make it look like your virtual pet is walking alongside you, viewable only through the magic window of your phone's screen. When you're out and about, Peridot encourages you to take out your phone and lets you play with your pet no matter where you are, and new places might reveal items or evolutions you haven't seen before.
Imagine, then, what will happen when Google and Magic Leap launch their first pair of Android XR-based smart glasses and you no longer have to hold a device to see your pet. You'd have it with you all day instead of only when you remember to pull out your phone and launch the app.
Rec Room's My Little Monsters update already does something similar, albeit in a roundabout way. Rec Room is one of the most popular multiplayer VR games available, and it boosts that popularity by allowing non-VR players to join in on the fun. That means you can grow and evolve your pet on the go and then play around with it in VR when you get home. But it's a little more involved than what I'm hoping a true Chao Garden successor will bring.
For now, Hello Dot is probably the closest thing I can get, but in its early state it's a bit of a barebones experience that's mostly fun to play for a few minutes at a time. It's the same basic gameplay loop as Peridot but offers a far more immersive way to interact with your pet. Instead of just tapping your phone's screen, you can actually reach down and pat its head, stroke its chin, or even pick it up for a cuddle.
Until Hello Dot and Neko Atsume, few games gave you the ability to do this. Most of the time, that restriction was simply because the pet was trapped inside some kind of electronic device. In a way, this is still the same restriction—the difference is that it feels more real to your brain because you can naturally move and interact with it as if it were real.
Ironically, given Niantic's penchant for getting folks outdoors, Hello Dot and Peridot don't have a good way to let you take your pet on the go—not yet, anyway. Ahmed says Niantic has a full roadmap for Hello Dot and hopes to build the game's future alongside its mobile counterpart, Peridot, creating ways to interact with virtual pets not seen before.
The next evolution
While virtual pets almost always "evolve" to the next level instead of growing like normal real-life pets, the way we interact with virtual pets hasn't evolved since the 1990s. Tamagotchi let you care for your pet on the go, while games like Creatures on the PC gave players a far more detailed world to play around in. But we've always been stuck clicking or tapping to interact with them.
That's all about to change with the advent of companies like AstroBeam, which are using the power of LLM-based AI engines so that in-game NPCs—short for non-player characters—feel more like interacting with real humans or creatures.
"The future of NPCs is coming! Introducing our multiplayer LLM powered NPCs. Multiple people can chat and interact with the same NPC in VR, in real-time!" — August 22, 2024 (pic.twitter.com/A3dpxGRNoQ)
Using LLM-based AI models to power NPC interactions isn't extraordinarily new—Skyrim players have been using ChatGPT to make smarter NPCs for over a year now—but AstroBeam aims to take this concept and use it to make multiplayer games feel fuller.
I spoke with Devin Reimer, the founder of AstroBeam, former CEO of Owlchemy Labs, and one of the brains behind Job Simulator, to find out more about how the NPC paradigm is about to change.
As Reimer puts it, the current way of implementing ChatGPT-style AI models into NPCs in existing games largely feels awkward and "really horrible." A big part of this is that, while it's cool to hear an NPC respond in what feels like a strangely intelligent human way, most implementations lack one key expectation: actions.
That's because all of these implementations are mods of existing games rather than games designed from the ground up around AI. Reimer describes the work his company is doing as similar in concept to how Job Simulator was created.
"Early in VR, the first stuff that is tried is 'let's take what we currently have and jam them together'...but it turned out that almost everything that was successful was built from the ground up for VR."
Talking to our phones or computers has yet to feel like talking to a person, but Reimer seems to believe that this can change with realistic reactions, movements, and contextual actions and responses—particularly when presented in VR. He points out how the robot in the video example above not only looks at the person it's talking to but also uses hand gestures and other expressions to make it feel "right."
When I think about the hours I spent on the Dreamcast version of Seaman, one of the most impressive (and disturbing) things about it was the emotional responses the fish-men delivered while you talked to them. Of course, the game didn't really "understand" what I was saying in a human way; it was mostly listening for keywords and playing back canned lines to fake a human-like response.
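That keyword trick is simple enough to sketch in a few lines. This isn't Seaman's actual code—the keywords and responses here are invented for illustration—but it captures the basic idea: scan the player's input for known words and play back a pre-recorded line, with a fallback for everything else.

```python
# A toy sketch of the keyword-spotting trick games of Seaman's era used
# to fake understanding. All keywords and lines below are made up.
RESPONSES = {
    "hello": "Ah, you again. What do you want?",
    "food": "Feeding time already? Fine.",
    "weather": "I live in a tank. Why would I care about the weather?",
}
FALLBACK = "...I have no idea what you're talking about."

def respond(player_input: str) -> str:
    """Return the first canned line whose keyword appears in the input."""
    lowered = player_input.lower()
    for keyword, line in RESPONSES.items():
        if keyword in lowered:
            return line
    return FALLBACK
```

Say "What's the weather like today?" and you get the weather line; say anything without a recognized keyword and you hit the fallback. It feels smart for exactly as long as you stay on script—which is precisely the limitation an LLM doesn't have.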
What's better at that sort of thing than modern AI? Nothing to date, of course! LLMs like Google Gemini have proven able to understand not only what we say to them but also drawings and things they can see through a camera. Imagine, then, how much better the concept of a virtual pet could be if it could actually understand you.
I don't need my pet to talk back to me the way the fish do in Seaman, but how cool would it be if it could understand more than basic commands and, instead, evolve to understand language to its fullest extent? That's the real next generation of virtual pets, without a doubt.