Visual Intelligence: How to look up info with iPhone AI tool

The new Visual Intelligence feature in iOS 18.2 provides a quick way to find information just by pointing the iPhone 16’s camera at an object in the real world. Then you can ask ChatGPT to explain what you’re looking at, do a reverse image search to find products and look things up visually, get information on a business as you walk down the street, and quickly add events to your calendar.

Here’s how it works.

How to use Visual Intelligence on iPhone 16

While Visual Intelligence relies heavily on third parties like ChatGPT and Google, its integration into the iPhone 16 makes it a powerful tool for everyday use. Apple’s new AI tool puts those incredibly useful information sources directly at your fingertips.

However, you will need the right combo of hardware and software. Apple built Visual Intelligence exclusively for iPhone 16 owners, because the AI feature uses the new Camera Control button only available on those phones. And, as mentioned, you’ll need to be running iOS 18.2 or later.

Table of contents: How to use Visual Intelligence on the iPhone 16

  1. Install iOS 18.2 and enable Apple Intelligence
  2. Click and hold the Camera Control
  3. Look things up by snapping a picture
  4. Other smart context-dependent features
  5. How does Visual Intelligence work?
  6. More Apple Intelligence features

Install iOS 18.2 and enable Apple Intelligence

The first Apple Intelligence features became available in iOS 18.1. Visual Intelligence is part of the second round of Apple’s AI features, currently in beta testing in iOS 18.2. You can (and should!) install the iOS 18.2 public beta by going to Settings > General > Software Update > Beta Updates. After you update, head to Settings > Apple Intelligence & Siri to enable Apple Intelligence.

Click and hold the Camera Control

You launch Visual Intelligence by clicking and holding the Camera Control, a gesture that mimics clicking and holding the side button to activate Siri. Because the Camera Control is only available on the iPhone 16, the feature is limited to those phones. Apple could presumably offer a Control Center button to bring the feature to iPhone 15 Pro models, but has not chosen to do so.

After clicking and holding the Camera Control, you’ll see a brief animation and an interface that looks similar to the iPhone camera, but with fewer controls.

Look things up by snapping a picture

Looking up information on a Macintosh using Visual Intelligence
Look up information on stuff around you with Visual Intelligence.
Screenshot: D. Griffin Jones/Cult of Mac

Click the Camera Control again, or tap one of the on-screen buttons, to look up whatever’s in view. The following two options are always available:

  • Ask will send the picture to ChatGPT. OpenAI’s chatbot might be able to explain what you’re looking at, and you can ask it follow-up questions for more information. Trying this out with a bunch of weird objects around my office, I came away pretty impressed by what ChatGPT got right, but of course, I caught a few mistakes. You can’t treat ChatGPT as your sole source of information; you should always fact-check anything important.
  • Search uses a Google reverse-image search to identify the object. This proves useful if you want to find a product or object online. Tap on a result to open the link in Safari (or your preferred web browser).

Other smart context-dependent Visual Intelligence features

Looking up a restaurant with Visual Intelligence
Find the menu for a restaurant you’re passing by with a tap.
Screenshot: D. Griffin Jones/Cult of Mac

Visual Intelligence’s other smart features are more context-dependent:

  • Events: You can point your iPhone 16 camera at something with event information on it, like a poster or document, and quickly add the event to your calendar. If it’s a music festival or concert, the tool might match it to a known event and fill in details. My testing so far has yielded spotty results, though I haven’t been able to try it thoroughly yet. It could become super-handy.
  • Businesses: Take a picture of a restaurant or shop, and Visual Intelligence will cross-reference your location with Apple Maps data to identify the business you’re looking at. You can see a phone number, website, menu and more. This seems like it could prove incredibly useful. If you’re walking down the street deciding where to eat, you might be able to quickly get information without manually searching for every name you see.
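To make the business-lookup idea concrete, here is a rough, hypothetical sketch of how a device could match what the camera is pointed at to a nearby place: compute the compass bearing from the phone to each known business and pick the one closest to the camera’s heading. This is an illustration only, not Apple’s actual implementation, and the place names and coordinates are made up.

```python
import math

def bearing_deg(lat1, lon1, lat2, lon2):
    """Initial great-circle bearing from point 1 to point 2, in degrees [0, 360)."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dl = math.radians(lon2 - lon1)
    x = math.sin(dl) * math.cos(p2)
    y = math.cos(p1) * math.sin(p2) - math.sin(p1) * math.cos(p2) * math.cos(dl)
    return math.degrees(math.atan2(x, y)) % 360

def match_business(device_lat, device_lon, heading, places):
    """Pick the place whose bearing from the device best matches the camera heading."""
    def angular_diff(place):
        d = abs(bearing_deg(device_lat, device_lon, place["lat"], place["lon"]) - heading)
        return min(d, 360 - d)  # wrap around the compass
    return min(places, key=angular_diff)

# Hypothetical nearby businesses (made-up names and coordinates)
places = [
    {"name": "Cafe A", "lat": 37.3350, "lon": -122.0090},
    {"name": "Diner B", "lat": 37.3348, "lon": -122.0110},
]

# Camera pointing roughly east (heading 90 degrees) picks the business to the east
print(match_business(37.3349, -122.0100, 90, places)["name"])
```

A real implementation would also weigh distance, map data quality and camera field of view, but the core idea is simple geometry rather than machine learning.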

How does Visual Intelligence work?

Technically speaking, none of these Visual Intelligence features are super-smart. ChatGPT and Google reverse image search are both third-party services. Pulling information like events and phone numbers out of the camera view is based on Live Text, a feature the iPhone has had for years. And Visual Intelligence barely uses AI when identifying businesses; it mostly combines the iPhone’s GPS and compass with Apple Maps data — something any iPhone can do.

However, in practice, they’re still useful, practical features. I think if Apple Intelligence weren’t such a big marketing and branding push, Apple would be more upfront about this being a convenient repackaging of different third-party services, rather than framing it as a new AI technology.

More Apple Intelligence features
