Apple announced a platter of new accessibility features that will make their way to its devices later this year. For starters, the company is readying a feature called Eye Tracking for iPhone and iPad that people can use to navigate their devices "with just their eyes."
Eye Tracking uses the front-facing camera as the tracking hardware and leverages on-device machine learning to get the job done. In other words, it doesn't need specialized hardware to work across iPadOS and iOS apps. You can move through the elements of an app and use Dwell Control to activate each one, accessing functions such as physical buttons, swipes, and other gestures.
Apple said its Music Haptics feature is meant to let people who are deaf or hard of hearing experience music on iPhone. The accessibility feature will command the iPhone's Taptic Engine to play taps, textures, and refined vibrations in sync with songs on Apple Music. It will work with millions of songs in the Apple Music catalog, and Apple will also make it available to developers as an API.
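Apple hasn't shared details of that API yet, but the public Core Haptics framework already illustrates the general idea of haptics synced to audio. Here's a minimal sketch, assuming a hypothetical list of beat timestamps, that plays one sharp tap per beat (this is not Apple's implementation of Music Haptics):

```swift
import Foundation
import CoreHaptics

// A minimal sketch: one transient tap per beat, via the public Core Haptics
// framework. The beat timestamps are hypothetical inputs; Apple has not said
// how Music Haptics maps songs to taps, textures, and vibrations.
func playBeatHaptics(beatTimes: [TimeInterval]) throws {
    // Bail out on devices without a Taptic Engine.
    guard CHHapticEngine.capabilitiesForHardware().supportsHaptics else { return }

    let engine = try CHHapticEngine()
    try engine.start()

    // One sharp transient "tap" event per beat timestamp.
    let events = beatTimes.map { time in
        CHHapticEvent(
            eventType: .hapticTransient,
            parameters: [
                CHHapticEventParameter(parameterID: .hapticIntensity, value: 1.0),
                CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.7)
            ],
            relativeTime: time
        )
    }

    let pattern = try CHHapticPattern(events: events, parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```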
Apple is also working to reduce motion sickness for people using iPhones or iPads in a moving vehicle. Vehicle Motion Cues will use the device's built-in sensors to recognize when a user is in a moving vehicle and display animated dots on the edges of the screen. The dots will change direction as the vehicle accelerates, brakes, or takes a turn.
Explaining what causes motion sickness, Apple said:
Research shows that motion sickness is commonly caused by a sensory conflict between what a person sees and what they feel, which can prevent some users from comfortably using iPhone or iPad while riding in a moving vehicle. With Vehicle Motion Cues, animated dots on the edges of the screen represent changes in vehicle motion to help reduce sensory conflict without interfering with the main content.
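Vehicle Motion Cues itself is a system feature with no public API, but the underlying idea (deriving a cue direction from the built-in sensors) can be sketched with Apple's Core Motion framework. The mapping below from acceleration to dot direction is an assumption for illustration only:

```swift
import CoreGraphics
import CoreMotion

// Illustrative sketch only: reads the built-in motion sensors and derives
// a drift direction for on-screen dots. How Apple actually maps vehicle
// motion to the animated dots is not documented.
let motionManager = CMMotionManager()

func startMotionCues(update: @escaping (CGVector) -> Void) {
    guard motionManager.isDeviceMotionAvailable else { return }
    motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
    motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
        guard let accel = motion?.userAcceleration else { return }
        // Drift the dots opposite to the felt acceleration, mirroring what
        // the inner ear senses when the vehicle brakes or turns.
        update(CGVector(dx: -accel.x, dy: accel.y))
    }
}
```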
Furthermore, users can assign custom sounds and utterances to launch shortcuts and perform complex tasks through a feature called Vocal Shortcuts. Another feature, Atypical Speech, uses on-device machine learning to recognize a wider range of speech patterns and enhance speech recognition accordingly.
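Vocal Shortcuts is configured in Settings rather than through a public API, but the general pattern (live recognition listening for a custom trigger phrase) can be sketched with Apple's Speech framework. The runShortcut closure and the audio wiring that feeds the request are assumed for illustration:

```swift
import Speech

// A hedged sketch of the trigger-phrase pattern, not Apple's Vocal Shortcuts
// implementation. The caller is assumed to feed the request from an
// AVAudioEngine input tap; runShortcut stands in for whatever action or
// shortcut should fire.
func listen(for phrase: String,
            recognizer: SFSpeechRecognizer,
            request: SFSpeechAudioBufferRecognitionRequest,
            runShortcut: @escaping () -> Void) {
    _ = recognizer.recognitionTask(with: request) { result, _ in
        guard let spoken = result?.bestTranscription.formattedString else { return }
        // Fire when the custom utterance shows up in the live transcription.
        if spoken.localizedCaseInsensitiveContains(phrase) {
            runShortcut()
        }
    }
}
```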
What"s on the table for Apple Vision Pro and CarPlay?
The Cupertino giant is also readying some accessibility features for the Vision Pro headset and CarPlay. Vision Pro will get support for system-wide Live Captions to help users "follow along with spoken dialogue in live conversations and in audio from apps."
Users can move the captions using the window bar during Apple Immersive Video. Vision Pro will also support MFi (Made for iPhone) hearing accessories and vision accessibility features such as Reduce Transparency, Smart Invert, and Dim Flashing Lights.
Meanwhile, CarPlay will get support for three new accessibility features. Voice Control lets users control the CarPlay interface using their voice, while Sound Recognition displays visual alerts to notify users about car horns and sirens. Color Filters, designed for colorblind users, makes the CarPlay interface visually easier to use.
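Sound Recognition in CarPlay is likewise a system feature, but apps can do similar detection today with Apple's SoundAnalysis framework and its built-in classifier. A sketch, assuming "siren" is among the classifier's label identifiers:

```swift
import AVFAudio
import SoundAnalysis

// Sketch under assumptions: this is not CarPlay's implementation, just the
// same kind of detection via SoundAnalysis. The "siren" label is assumed to
// be among the built-in classifier's identifiers.
final class SirenDetector: NSObject, SNResultsObserving {
    private let analyzer: SNAudioStreamAnalyzer

    init(format: AVAudioFormat) throws {
        analyzer = SNAudioStreamAnalyzer(format: format)
        super.init()
        let request = try SNClassifySoundRequest(classifierIdentifier: .version1)
        try analyzer.add(request, withObserver: self) // the analyzer holds observers weakly
    }

    // Feed audio buffers here, e.g. from an AVAudioEngine input tap.
    func process(_ buffer: AVAudioPCMBuffer, at time: AVAudioTime) {
        analyzer.analyze(buffer, atAudioFramePosition: time.sampleTime)
    }

    func request(_ request: SNRequest, didProduce result: SNResult) {
        guard let result = result as? SNClassificationResult,
              let top = result.classifications.first,
              top.identifier == "siren", top.confidence > 0.8 else { return }
        print("Siren detected: show a visual alert")
    }
}
```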
Apple also announced updates to existing accessibility features, such as new voices for VoiceOver, Hover Typing, Personal Voice in Mandarin, and a new Reader Mode in Magnifier. You can read the company's blog post to learn about these updates.