Apple will introduce Google Lens-like ‘visual intelligence’ feature to iPhone 16

The tool will be accessible via Apple's new Camera Control button

A hand holding a black iPhone with three large camera lenses on the back, featuring the Apple logo in the center.

Highlights:

  • The iPhone 16 and 16 Plus will feature a Camera Control button, launching Apple’s ‘visual intelligence’ for advanced reverse image searches and text recognition.

  • The feature integrates third-party search models like ChatGPT, allowing users to obtain more results.

Apple is set to introduce a new feature called ‘visual intelligence’ on its upcoming iPhone 16 models. The feature will allow users to perform visual searches and text recognition from their devices. 

At an Apple event, Craig Federighi, Apple’s Senior Vice President of Software Engineering, said that with ‘visual intelligence,’ users can “instantly learn about everything” they see.

Apple’s ‘visual intelligence’ tool resembles Google Lens, which launched in 2017. Google Lens provides search results and information when users point their phone's camera at any object.

What is Apple’s ‘visual intelligence’ and how will it be accessible?

The ‘visual intelligence’ feature is a tool that brings advanced image-based search capabilities to iPhone 16 users. According to Apple, this feature allows users to carry out visual searches that can analyze objects, landmarks, text, and more using the new Camera Control button.

To activate the feature, users can press and hold the button while aiming their phone’s camera at any object. Federighi said it can capture “details like title, date, and location” on event flyers.

Integration with ChatGPT for advanced queries

Apple's ‘visual intelligence’ uses search tools like ChatGPT to deliver search results. This integration will allow users to send their visual queries directly to ChatGPT for more complex searches. Federighi explained that the feature is “also your gateway to third-party” models. 

Privacy and data protection

Apple has emphasized that ‘visual intelligence’ will protect user privacy: images processed by the feature will not be stored on its servers. According to Apple, user data will remain private and secure.

When will Apple roll out its ‘visual intelligence’ feature?  

Apple says that the ‘visual intelligence’ feature will roll out in October 2024 for users in the United States, with expansion to other regions planned by December.

09/10/2024

© Copyright 2024, All Rights Reserved