Apple’s Visual Intelligence is a built-in take on Google Lens

Apple announced a new feature called Visual Intelligence that will be part of the company’s Apple Intelligence suite of AI features “later this year.” It works much like the multimodal AI features already offered by Google and OpenAI.

Visual Intelligence lets you “instantly learn about everything you see,” Apple’s Craig Federighi said during the company’s September event today. Federighi said the feature is “enabled by Camera Control,” which is the company’s name for a new capacitive camera button that’s now on the side of the iPhone 16 and 16 Pro phones. To trigger it, users will need to click and hold the button, then point the phone’s camera at whatever they’re curious about.

The feature can search Google or send an image to ChatGPT to do things like identify a dog breed or pull up a restaurant’s hours, just by pointing the phone at the subject. Apple didn’t say when the feature would debut beyond confirming that it’s coming this year.

Developing… check out our live blog for the latest details. 
