TechRadar
John-Anthony Disotto

I tried iPhone 16's Visual Intelligence and here's how you can too with iOS 18.2

Visual Intelligence on iOS 18.2 developer beta on an iPhone 16 Pro Max.

I have been waiting to try Visual Intelligence since Apple first unveiled the iPhone 16 in September. After all, there’s a new button (Camera Control) on my iPhone, and I haven’t been using it for photography.

Camera Control hasn't clicked with me after a few months with the iPhone 16 Pro Max. I take a decent number of photos with my iPhone, but any time I try to use the dedicated button, it feels cumbersome and confusing – so much so that I’ve reverted to my trusty touchscreen and Lock Screen shortcut.

Enter iOS 18.2, which is now installed on my iPhone 16 Pro Max – and so far, I’m a big fan of what Visual Intelligence could become.

What is Visual Intelligence? I hear you ask. Well, it’s an Apple Intelligence feature exclusive to the iPhone 16 lineup that takes full advantage of Camera Control. You launch it by long-pressing Camera Control, then snap a photo of whatever you’re looking at. From there, you can ask ChatGPT for information, search Google, or highlight any text in the photo. Think of Visual Intelligence as Apple’s version of Google Lens, with its own hardware button to access it on the fly.

My first impressions of Visual Intelligence


You can launch Visual Intelligence from anywhere, even the Lock Screen, which makes it incredibly useful whenever you want to do a quick search. My first test was taking a picture of my Game Boy Camera on my desk. As mentioned above, Visual Intelligence gives you a few options, so I first used Google Search to find the product. Then, I asked ChatGPT for information, and it was able to tell me all about the Game Boy Camera’s history. From there, you can ask follow-up questions, so I asked, “When did the Game Boy Camera launch in Europe?” ChatGPT obliged with the correct answer.

While it’s still technically in beta, despite the official iOS 18.2 launch, Visual Intelligence worked a treat with a recognizable product like the Game Boy Camera – I’m not sure how often I’d use it to search for an item, but considering it’s just a simple long press away, it might become my go-to way of searching the web for things.

Another great use for Visual Intelligence is when you’re out and about and want to see information about a shop, cafe, bar, or restaurant. I tested it with a local coffee shop, and while it didn’t work quite as Apple showed in its demo, I think that’s more down to the early beta version I’m testing than the feature itself.

In that demo, Apple showed that Visual Intelligence could determine a dog breed. I tried this with my French Bulldog, and while I could search Google for similar dogs, it couldn’t give me a straight-up answer.

That kind of sums up Visual Intelligence in its current form. It has huge potential: I love the way it gives Camera Control a genuine purpose, and when it works, it’s fantastic. But it’s still cooking in the oven, and sometimes it fails to pick up on what I'm asking for (for example, the text summary option sometimes appears, and other times it doesn't).

One thing is for sure, however: Visual Intelligence makes total sense to me now, and I finally understand why Apple added Camera Control to the new iPhones. It’s the kind of Apple Intelligence feature that I can see people turning to when they need a quick answer, as long as it works smoothly, and the ChatGPT and Google integrations make it multi-faceted.

I love testing new iOS features, and iOS 18.2 might just be the most exciting update we've seen in years. After using the software for a month or so and with access to Genmoji and Image Playground, I can confidently say that iOS 18.2 feels like the iOS 18 and Apple Intelligence we've been waiting for.

Visual Intelligence has a lot to offer, and I’m incredibly excited to see where Apple takes this Apple Intelligence feature over the next year. Exclusive to the best iPhones, this could be the reason to buy an iPhone 16 – who would’ve thought it could be Camera Control?

How to use Visual Intelligence

1. Hold down on Camera Control

To launch Visual Intelligence, simply hold down on Camera Control on the right side of your iPhone 16.

2. Take a photo

With Visual Intelligence now on your screen, point your camera and take a photo of whatever you want to ask about.


3. Ask or Search

Now choose whether to "Ask" ChatGPT or "Search" Google for information on the image.

4. Get answers

Now you'll see Visual Intelligence work its magic and give you a detailed answer on whatever you took a photo of. In this example, ChatGPT explains that the image is a logo of soccer club FC Union Berlin.

From there, you can ask follow-up questions by simply typing in the ChatGPT box.
