Meta's Orion AR glasses may still be in the prototype stage, but they've already managed to pull off something few other mixed reality headgear devices have been able to do. This is the first time in a long time that I've slipped on this kind of device and didn't start the mental countdown on how long it would be before I could take it off.
Spatial computing headsets may be the future, but if so, it's a future full of devices that I do not like to wear. As a rule I find headsets uncomfortable — even the few that aren't too heavy on my head make me sweat around the strap on the back and the viewfinder up front. I also don't care for the sensation of feeling cut off from the world around me, even on visors with pass-through capabilities. AR and VR glasses certainly don't feel as heavy, but they can still feel hot the longer I wear them, and they tend to slide down my nose at inopportune times. It's just not a fun experience no matter the device.
But the Meta Orion glasses didn't pose these problems at all during a recent hands-on demo I experienced at Meta headquarters in Menlo Park, California. The prototype I wore felt fairly lightweight and certainly didn't heat up as my 20-minute demo went on. I could always see and hear the world around me, without my peripheral vision getting cut off. Even better, the glasses didn't slide down my face, keeping the AR visuals right in front of me at all times.
That's encouraging since Meta would be the first to tell you that the version of the Orion glasses that I wore isn't quite ready for prime time. The glasses themselves are still pretty thick — not Buddy Holly-thick like the Snap Spectacles AR glasses I tried out last month, but still bulky enough to make people think you'd been cast in a "Revenge of the Nerds" reboot if you wore them in public. Among other improvements, Meta wants to slim down the form factor before its AR glasses launch commercially.
"For all of this to work well, it really has to be socially acceptable," said Ming Hua, Meta's vice president of wearable devices. "You feel comfortable wearing the glasses all day long, with your friends and also when you're going out and about."
That's a ways down the road, though maybe not as far off as you might think. (Meta has merely said that a shipping version will be ready in "the near future.") And while the company works on perfecting the design of the glasses, improving the display and bringing the cost down to what you'd spend on a high-end smartphone, Meta can at least take comfort in knowing that it's already accomplished a much trickier task — it's actually come up with some compelling use cases for AR glasses.
Meet Orion
The world got its first look at the Orion AR glasses during an introduction at September's Meta Connect conference. What we saw at that time was the culmination of five years of work in which a series of ridiculously oversized prototypes gradually got shrunken down into the current Orion setup — a pair of glasses that connects wirelessly to a small puck and a wristband that's used to help navigate through the AR interface. (More on those controls in a bit.)
Instead of glass, Meta uses silicon carbide for Orion's lenses. The lightweight material reduces optical artifacts and also provides a high refractive index, which Meta says is necessary for a wide field of view. The Orion glasses boast around a 70-degree field of view, which felt a lot less cramped than other mixed reality eyeglasses I've used, where images often get cut off.
For instance, when I had a video chat via the Messenger app on my Orion glasses, I could see a full-screen view of the person I was conversing with, as easily as if the chat were taking place on a phone or computer screen. The big difference is that with the glasses, the conversation took place right in front of me, making it feel more immersive than your typical video chat on a flat display.
There's another nice effect to the silicon carbide lenses. While people looking at me might see some bluish-purple streaking on the lenses from the AR images appearing before me, they can still see my eyes, instead of an opaque screen — or worse, a digital recreation of my eyeballs. It should allow more natural interactions with anyone sporting a pair of Orion glasses.
The glasses' frames are made out of magnesium — a material that's rigid as well as lightweight. Hua tells me the rigidity is important to prevent any misalignment between the two display engines on the glasses. Magnesium also helps with heat dissipation, which is one of the things that helped the glasses feel so comfortable on my head even after prolonged periods.
And that's important because there are a lot of components hidden in the frames of those glasses. Besides uLED projectors — they're very small and power efficient — you've got multiple custom chips and seven cameras embedded in the frame.
"We roughly need to reduce the power consumption for each of those operations to, like, a tenth of what's in the form," Hua said.
The processors on board the glasses handle things like simultaneous localization and mapping, eye tracking, hand tracking and AR world-locking graphics algorithms, while the elongated puck uses dual processors to take care of apps. The puck also manages low-latency graphics rendering and AI tasks.
The glasses and puck connect wirelessly, and they don't need to be right next to each other. Meta says you can slip the puck into a backpack as you're using the glasses without having to worry about a loss of connectivity. During my demos, I never needed to carry around the puck as I was using the glasses.
At this stage the puck has enough battery power to get through a day, while the glasses are good for three to four hours of use, depending on what kind of activities you're engaging in.
Controlling the Meta Orion glasses
There's no input device for Orion other than the gaze of your eyes and the fingers on your hand, though that wristband I mentioned earlier is there to help with controls. It's an electromyography, or EMG, wristband, and it senses the electrical signals of your muscle movements, beaming those gestures to the glasses.
"When you're making gestures, your brain is sending electric signals to the hand," Hua said. "So we use sensors... to capture the voltage change when you're making gestures."
The idea behind the wristband is that you can keep your hand at your side, making subtle gestures to control the glasses instead of waving your hands in your field of view and attracting the stares of curious onlookers. Did that stop me from raising my hand into my field of view and making those gestures anyhow? It did not, but perhaps with more practice I'd get used to keeping my hand out of sight while working the controls.
The wristband fit snugly but comfortably on my arm, and I really took no notice of it while using the Orion glasses. That's significant since I hate having anything strapped to my wrist — I won't even wear a smartwatch for this reason — so the fact that I could don Meta's EMG band without a whimper of complaint implies that the company's done a pretty good job at making it feel light and natural.
The gesture controls are pretty natural, too. You pinch with your index finger and thumb to select things, with your eyes acting as a sort of cursor, as the glasses detect what button you're looking at. A middle finger/thumb pinch takes you to the app launcher, and repeating that gesture hides the same control. Make a fist and flick your thumb forward and back when you want to scroll through something like an Instagram Reel. There was a brief tutorial at the start of my demo session to acquaint me with those controls, but the fact that Meta has kept things so simple makes it easy to retain what it is you're supposed to be doing to find your way around Orion's menus.
It's ideal that the controls require nothing more than hand gestures and a steady gaze, as the big appeal of the Orion glasses is the ability to use them hands free. That's what struck me during a cooking demo meant to showcase the image recognition features of the glasses by having Orion identify different ingredients and whip up something incorporating those same ingredients.
That's all well and good, but as someone who does a lot of cooking and has to refer back to recipes from time to time, I appreciate having the instructions floating in front of my vision while I use my hands to scroll forward and back. Following a recipe can be problematic if you're spatchcocking a chicken, for example, and then have to touch your iPad screen to advance to the next step. With Orion, you won't have to wash off those chicken-covered hands — just flick your thumb forward, and keep cooking.
Augmented worlds you can share
If there's something I dislike about headsets and glasses almost as much as their comfort level — or lack thereof — it's how they cut me off from the rest of the world. Just as the Orion glasses proved to be surprisingly comfortable, they also raise the possibility of a more collaborative experience with AR.
One of Meta's demos allowed me to play pong with another Orion-wearing participant. We stood a few feet apart, sending a virtual ball ricocheting back and forth by moving our paddles up and down or left and right. It was certainly a fun way to pass the time, but it also illustrates how those augmented images appearing in front of you don't have to be for just you alone.
To be fair, the Snap Spectacles also featured a collaborative demo involving finger painting when I tried out those AR glasses prior to my Orion test drive. But I don't think it's unfair to Snap to say that the fifth generation of its smart glasses aren't as far along, graphics-wise, as Orion is at this point. And the wider field of view for the Orion glasses makes collaboration and cooperation a little bit easier.
This fits in with Meta's overall hope for Orion, as it wants its AR glasses to be the successor to the smartphone as the device we use to interact with the world around us. "We're hoping to make it so that with glasses, a lot of what you're doing today with your phone, like checking messages, notifications, making a phone call, can be more seamless and hands free," Hua said.
Certainly, that makes sense on some levels. Use a phone, and your gaze is locked on a screen, limiting your ability to be a part of what's happening around you. AR glasses let things unfold in front of you — and Meta argues that its approach would let you still see your surroundings and remain a part of conversations and interaction with the world around you.
But not every demo I saw made me ready to trade in my phone for a pair of souped-up specs. To show off Orion's multitasking capabilities, Meta had me watch Instagram Reels until a message came in. I switched over to the Messages app and fielded a video call, with all three panels appearing relatively clearly in front of me.
The demo's supposed to showcase not only the wide field of view on the Orion glasses, but also how you can use the glasses to multitask. To me, however, it felt a bit overwhelming. I get the same feeling when Apple shows off all the floating workspaces that can hover around your head when you're wearing a Vision Pro headset. Maybe some people find that convenient, but to me it's just a reminder of my ever-expanding to-do list, only floating directly in my face. Less of that, please.
Meta Orion AR Glasses: What's next
It'll be some time before Orion glasses are ready to be worn out in the field by civilians like you and me. Meta is showing off the glasses at this point so that the company's devices team can get feedback from both Meta employees and external partners on what features to develop and which functions to leave on the cutting room floor. I'd also wager that app makers are getting a chance to build AR versions of their apps optimized for Orion so that there will be plenty of options ready to run once the glasses do hit the market.
I don't know how long it will take Meta to improve the Orion displays and shrink down the form factor of the current version, but judging by the progress the company has made evolving from prototype to prototype, it may not be all that long. As recently as 2019, Meta's stab at holographic AR glasses featured a backpack and a headset that looked like you were about to perform some heavy welding.
Meta's desire to get the glasses to the same price level as a high-end phone may be a tougher roadblock. Assuming we're talking about conventional phones, that would be in the $1,199 to $1,299 range of the iPhone 16 Pro Max and Galaxy S24 Ultra; your top foldable phones are in the neighborhood of $1,799 to $1,899. I haven't priced components for smart glasses lately, but given the kind of tech Meta is packing into the Orion specs, I imagine it's going to be a challenge getting to that range.
But — and I never thought I'd be saying this about any kind of AR product — I hope Meta gets there. The Orion glasses in their current form hold a lot of promise — not just for fit, but for functionality as well.