Apple is bringing new accessibility features to iPad and iPhone, designed to meet a wide range of user needs. These include the ability to control your device with eye-tracking technology, create custom shortcuts using your voice, experience music through the haptic engine, and more. The company made the announcements ahead of Global Accessibility Awareness Day on Thursday.
Apple already supported eye tracking in iOS and iPadOS, but it required additional eye-tracking hardware. This is the first time Apple has offered the ability to control iPads and iPhones without extra hardware or accessories. The new built-in eye-tracking option lets users navigate apps using the front-facing camera. It leverages artificial intelligence to understand what the user is looking at and which gesture they want to perform, such as swiping and tapping. There’s also Dwell Control, a feature that can sense when a person’s gaze rests on an item, indicating they want to select it.
“Vocal Shortcuts,” another useful new feature, enhances Apple’s voice controls. It allows people to assign custom sounds or words to initiate shortcuts and complete tasks. For example, Siri could launch an app after the user says something as simple as “Ah!” The company also developed “Listen for Atypical Speech,” which uses machine learning to recognize unique speech patterns and is designed for users with conditions that affect speech, such as cerebral palsy, amyotrophic lateral sclerosis (ALS), and stroke, among others.
Other speech enhancements Apple has made in the past include “Personal Voice,” which was launched last year to give users an automated voice that sounds just like them.
For people who are deaf or hard of hearing, “Music Haptics” is a new feature that lets users experience the millions of songs on Apple Music through a series of taps, textures and vibrations. It will also be available as an API, so music app developers can soon provide users with a new and accessible way to experience audio.
Apple also announced a new feature to help with motion sickness in cars. Instead of looking at static content, which can cause nausea, users can turn on the “Vehicle Motion Cues” setting. This feature places animated dots at the edges of the screen that oscillate and move in the direction of the vehicle’s motion.
CarPlay is also getting an update, including Voice Control; “Color Filters,” which gives colorblind users bolder and larger text; and “Sound Recognition,” which alerts users who are deaf or hard of hearing to car horns and sirens.
Apple also revealed an accessibility feature coming to visionOS that will enable live captions during FaceTime calls.