The Apple Vision Pro may have divided opinions since its WWDC 2023 announcement, but a recent visionOS beta has just given us an exciting glimpse of a killer feature: Visual Search.
With it, users can look at objects in the real world and interact with them in much the same way as Apple’s Visual Look Up feature on the iPhone.
Googling the world around you
So how does Visual Search work? Simply put, it’s Google for the world around you. The Vision Pro’s cameras and sensors can detect objects and text around you, and that data can then be used in apps across visionOS.
That includes copying and pasting printed text from the real world, translating between 17 languages, converting units such as grams to ounces, and much more. These abilities have already proven immensely useful in the AR features built into smartphones, but putting them into a headset could be a game changer.
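Apple hasn’t published the APIs behind Visual Search, but its existing Vision framework on iOS and macOS hints at the kind of on-device text recognition that likely underpins it. Here’s a minimal Swift sketch, assuming a hypothetical CGImage grabbed from a camera frame:

```swift
import Vision
import CoreGraphics

// A rough sketch of the on-device OCR a feature like Visual Search would
// rely on. `frame` is a hypothetical CGImage captured from a camera.
func recognizeText(in frame: CGImage) throws -> [String] {
    var lines: [String] = []

    // Vision's text recognizer runs entirely on device.
    let request = VNRecognizeTextRequest { request, _ in
        guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
        // Keep the top candidate string for each detected text region.
        lines = observations.compactMap { $0.topCandidates(1).first?.string }
    }
    request.recognitionLevel = .accurate   // favor accuracy over speed
    request.usesLanguageCorrection = true  // clean up obvious OCR slips

    // perform(_:) is synchronous, so the handler above runs before we return.
    try VNImageRequestHandler(cgImage: frame, options: [:]).perform([request])
    return lines
}
```

From there, the recognized strings could feed copy and paste, translation or unit conversion, which is essentially the pipeline the beta suggests Visual Search exposes system-wide.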
That’s not to say other companies haven’t done something similar. This will feel mighty familiar to anyone who has used Google Lens on Project Iris, Snap Scan or Bing’s Visual Search.
But this seems to be the first time we’re seeing it integrated into a headset that isn’t a conceptual project. Plus, you can expect it to come with all the fluidity and ease of use that Apple is known for.
Outlook
visionOS is currently available to developers through the latest Xcode beta. Among the key features being discovered in its code, such as the various environments you can experience in full VR mode, Visual Search is one of the more impactful parts of the whole OS.
Of course, there’s still the headset’s bulk and battery life to navigate when it comes to wearing something like the Vision Pro out in the real world. But it’s clear Apple is starting to make some big plans for the future of visionOS.