Tom’s Guide
Technology
Alan Martin

Ray-Ban Meta smart glasses just got three big upgrades — here are the new features

Ray-Ban Meta Smart glasses.

We were rather impressed with the second-gen Ray-Ban Meta smart glasses when we reviewed them, and thanks to Meta AI they’ve steadily improved over time. Even something as mundane as grocery shopping can be transformed by the AI if you’re wearing them.

But the latest update is the most significant yet. Teased at Meta Connect back in September, three new updates have arrived at the same time, making the glasses all the more useful. However, two of the features are only available to those on the early access program, and all three are currently for those in the U.S. and Canada only.

As detailed on the Meta blog, the three upgrades arrive with the v11 software update. The one that’s available outside of early access is Shazam integration.

If you hear a track you want to know more about amidst all the Christmas muzak currently on repeat, you can simply say “Hey Meta, what is this song?” and your glasses will use the microphone to listen and come up with the answer for you to stream at your leisure. Again, this is only available in North America, so if you’re elsewhere, you’ll just have to rely on the Android and iOS Shazam apps.

Live AI: Meta AI can now 'see'

Then there are the two extras that require you to be a member of the early access program. Live AI is the first of these, adding video to Meta AI on your glasses. When activated, Meta AI can "see" what you’re looking at and converse naturally about what’s happening before your very eyes.

Meta thinks this will be hugely useful for activities when your hands are busy (think cooking or gardening), or just when you're out and about. You’ll be able to ask Meta AI questions about what you’re looking at — for example, how to make a meal out of the ingredients in front of you. There will be battery drain, though, with Meta suggesting you’ll get around half an hour of Live AI use on a full charge.

Still, it’s an exciting development, and one that Meta teases will improve over time: “Eventually live AI will, at the right moment, give useful suggestions even before you ask,” the post reads.

Live translation


Finally, and to me the most exciting, is live translation, which promises to let you understand foreign languages without ever learning them. When enabled, if someone is speaking to you in French, Italian or Spanish, you’ll get a real-time translation through the open-ear speakers or as text on your phone.

We’ve seen this kind of thing before, of course. The first-generation Pixel Buds attempted it seven years ago, and Samsung’s Galaxy AI does something similar with live phone calls. But this feels a bit more natural than both, given Meta’s Ray-Bans are designed to keep the tech largely invisible.

Again, these last two AI features require you to be part of the early access program. If you’re in North America, you can sign up here, though Meta describes the process as joining a waitlist, implying acceptance isn’t guaranteed.

Still, these will roll out to all users eventually, and Meta has hinted that more will be coming soon. “We’ll be back with more software updates—and maybe some surprises—in 2025,” the post concludes, cryptically.
