iPhone 15 Pro and iPhone 15 Pro Max – Expected to Receive AI-Powered Visual Intelligence Features Soon

Apple may soon bring visual intelligence features to the iPhone 15 Pro and iPhone 15 Pro Max, according to a new leak. Currently, these features are exclusive to the iPhone 16 series, despite the iPhone 15 Pro models supporting on-device AI processing. The decision to expand support reportedly came after Apple found a way to integrate these AI capabilities into the newly launched iPhone 16e, even without the Camera Control button.

Visual Intelligence Features

John Gruber, the co-creator of the Markdown markup language, recently claimed in a blog post that Apple could roll out visual intelligence features to the iPhone 15 Pro models as soon as April. Citing unnamed Apple representatives, he stated that users will be able to activate the feature via the Action Button. The update is expected to arrive with iOS 18.4.

Visual intelligence, part of Apple Intelligence, relies on on-device computer vision to perform various tasks. On the iPhone 16 series, users can long-press the Camera Control button to open the camera viewfinder and perform functions such as the following (a rough code sketch of this kind of recognition appears after the list):

  • Identifying businesses
  • Translating text in real time
  • Summarizing written content
  • Reading handwritten text aloud
  • Recognizing plants, animals, and objects
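
Apple does not expose visual intelligence as a public API, so the exact pipeline behind these functions is not documented. As a rough, hypothetical illustration of the kind of on-device recognition described above, the Swift sketch below uses Apple's public Vision framework to pull text out of a photo; it is not Apple's visual intelligence implementation, and the `recognizeText(in:completion:)` function name is invented for this example.

```swift
import UIKit
import Vision

// Illustrative sketch only: on-device text recognition with Apple's Vision framework.
// This is NOT the visual intelligence API (which has no public interface); it merely
// demonstrates the kind of computer-vision task described in the article.
func recognizeText(in image: UIImage, completion: @escaping ([String]) -> Void) {
    guard let cgImage = image.cgImage else {
        completion([])
        return
    }

    let request = VNRecognizeTextRequest { request, error in
        guard error == nil,
              let observations = request.results as? [VNRecognizedTextObservation] else {
            completion([])
            return
        }
        // Keep the best candidate string for each detected text region.
        let lines = observations.compactMap { $0.topCandidates(1).first?.string }
        completion(lines)
    }
    request.recognitionLevel = .accurate
    request.usesLanguageCorrection = true

    // Run the request off the main thread, since recognition can take a moment.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    DispatchQueue.global(qos: .userInitiated).async {
        do {
            try handler.perform([request])
        } catch {
            completion([])
        }
    }
}
```

Vision also ships requests such as VNClassifyImageRequest for general image classification, which maps loosely onto the plant, animal, and object recognition tasks listed above.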

Previously, this feature was exclusive to the iPhone 16 series, as the only way to access it was through the Camera Control button. However, with the introduction of the iPhone 16e, Apple enabled visual intelligence through the Control Center and allowed users to bind the feature to the Action Button. Now, it seems Apple is extending the same functionality to the iPhone 15 Pro models.

Expanding Apple Intelligence

When Apple rolled out its Apple Intelligence suite with iOS 18.2, visual intelligence was the only feature not made available to last year’s Pro models. Many assumed this was a deliberate move to differentiate the iPhone 15 Pro series from the newer iPhone 16 lineup. However, Gruber suggests that Apple was simply waiting for the iPhone 16e launch before enabling the feature on older models.

Interestingly, Apple has also integrated visual intelligence with OpenAI's ChatGPT: users can ask the chatbot questions about objects the camera detects, extending what the feature can do on its own.

AI-Powered iPhone

With the rumored iOS 18.4 update, iPhone 15 Pro and 15 Pro Max users may soon experience Apple’s visual intelligence features, bridging the gap between the previous and current flagship models. As Apple continues to refine its AI-powered tools, this expansion could be a sign that more Apple Intelligence features might eventually make their way to older devices in future updates.

FAQs

When will iPhone 15 Pro get visual intelligence?

The feature is expected to arrive with iOS 18.4 in April.

What is visual intelligence in Apple Intelligence?

It uses computer vision to recognize text, objects, businesses, and more.

How will iPhone 15 Pro users access visual intelligence?

They can bind the feature to the Action Button or use the Control Center.

Was visual intelligence exclusive to iPhone 16?

Initially, yes, as it was only accessible via the Camera Control button.

Is visual intelligence linked to ChatGPT?

Yes, users can ask ChatGPT questions about detected objects.
