Apple’s new accessibility features let you control an iPhone or iPad with your eyes



Apple just announced a slew of new accessibility features coming to its software platforms in the months ahead, including eye tracking, which the company says uses artificial intelligence to let people with physical disabilities navigate iOS and iPadOS more easily.

A new “music haptics” option will use the iPhone’s Taptic Engine vibration system to “play taps, textures, and refined vibrations to the audio of the music” for supported Apple Music tracks. Apple is also adding features to reduce motion sickness for those susceptible to it when using an iPhone in a moving vehicle.

All of these new accessibility options are likely to debut in iOS and iPadOS 18, though Apple is only saying “later this year” ahead of its WWDC event next month. The eye tracking feature “uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple.” The company says it’s been designed to work across iOS and iPadOS apps without requiring any extra hardware or accessories.

Music haptics will let those who are deaf or hard of hearing “experience music on iPhone” by producing a range of vibrations, taps, and other effects in rhythm with millions of tracks on Apple Music. Apple says developers will also be able to add the feature to their own apps through a new API.
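Apple hasn't published that API yet, so its shape is unknown ahead of WWDC. But the Taptic Engine is already programmable today through the Core Haptics framework, which gives a sense of the kind of tap-and-texture playback music haptics describes. Here's a minimal, illustrative sketch that plays a short rhythmic tap pattern; the timing and intensity values are made up for the example and this is not Apple's forthcoming music haptics API:

```swift
import CoreHaptics

// Illustrative Core Haptics sketch: play four rhythmic taps on the
// Taptic Engine. Not Apple's music haptics API, which is unannounced.
func playTapPattern() throws {
    // Bail out on hardware without haptics support (e.g., some iPads).
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // Four transient taps, a quarter second apart, with rising intensity.
    let events: [CHHapticEvent] = (0..<4).map { beat in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity,
                                       value: 0.4 + 0.15 * Float(beat)),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.6)
            ],
            relativeTime: 0.25 * Double(beat)
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```

Whatever form the new API takes, presumably it will handle the hard part this sketch doesn't: generating patterns like this in sync with the actual audio of a track.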

These animated dots could help some people avoid sensory conflict and thus reduce motion sickness. (GIF: Apple)

Other upcoming accessibility features include vocal shortcuts, which will let anyone “assign custom utterances that Siri can understand to launch shortcuts and complete complex tasks.” A new “Listen for Atypical Speech” feature uses machine learning to recognize someone’s unique speech patterns; this one is “designed for users with acquired or progressive conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), or stroke.”

If you’re someone who often gets motion sickness when using your tech in a moving vehicle, Apple’s got a new way to help reduce those unpleasant feelings:

With vehicle motion cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content. Using sensors built into iPhone and iPad, vehicle motion cues recognizes when a user is in a moving vehicle and responds accordingly. The feature can be set to show automatically on iPhone, or can be turned on and off in Control Center.
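Apple hasn't said exactly how the feature decides you're in a vehicle, but iOS already exposes this kind of signal to apps through Core Motion's activity API. As a rough illustration only, and not Apple's implementation, an app can be told when the device appears to be in automotive motion:

```swift
import CoreMotion

// Illustrative only: Core Motion can report when the device appears
// to be in a vehicle. A motion-cues-style feature could use a signal
// like this to decide when to show its on-screen dots.
let activityManager = CMMotionActivityManager()

func startVehicleDetection() {
    guard CMMotionActivityManager.isActivityAvailable() else { return }

    activityManager.startActivityUpdates(to: .main) { activity in
        guard let activity, activity.automotive else { return }
        // The user appears to be in a moving vehicle.
        print("Automotive motion detected (confidence: \(activity.confidence.rawValue))")
    }
}
```

(Using this API in a real app also requires a motion usage description in Info.plist and the user's permission.)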

The company’s full press release contains a longer list of other accessibility capabilities that are coming to Apple’s platforms in a few months. AI and machine learning appear throughout the text, offering yet more confirmation that iOS 18, iPadOS 18, and the company’s other software platforms will go heavy on AI-powered features. Apple is reportedly in discussions with both OpenAI and Google about collaborating on some generative AI functionality.

But even outside all that, these are great steps for making Apple’s products more accessible to as many people as possible. The company announced them one day before Global Accessibility Awareness Day, which is on May 16th.



