Apple Announces New Accessibility Features: Eye Tracking, Music Haptics, and More

Important updates

  • Eye Tracking: Control your iPad or iPhone with your eyes.
  • Music Haptics: Experience music through vibrations.
  • Vocal Shortcuts: Perform tasks with custom sounds.
  • Vehicle Motion Cues: Reduce motion sickness in vehicles.
  • Accessibility features for CarPlay and visionOS: Improved accessibility across more devices.

Apple has announced new accessibility features launching later this year that are designed to improve the experience for users with a range of disabilities. These include Eye Tracking, which lets users control iPads and iPhones with their eyes, making navigation easier for people with physical disabilities. The technology uses the front-facing camera and on-device machine learning, so the data stays on the device for privacy and security. In addition, Music Haptics will let deaf and hard-of-hearing music lovers experience music through the iPhone’s Taptic Engine, which provides haptic feedback synchronized with the audio.

The image below shows a person using eye tracking on their iPad.

The image below shows concentric rectangles animating in a loop to indicate haptic feedback synchronized with the music.

Another major update is Vocal Shortcuts, which lets users assign custom sounds to Siri commands, making it easier to complete tasks with unique vocalizations. This feature, along with Listen for Atypical Speech, expands speech recognition for people with conditions such as cerebral palsy or ALS, giving a wider range of speech patterns reliable device control. Apple is also introducing Vehicle Motion Cues, which helps reduce motion sickness for passengers using iPhones or iPads in moving vehicles by displaying animated dots that align with the vehicle’s movement.

The image below shows animated dots that move to the left or right of the Apple device when the vehicle turns left or right.

CarPlay and visionOS will also receive accessibility updates. CarPlay will add Voice Control, Color Filters, and Sound Recognition, helping deaf and hard-of-hearing users and those with color blindness. visionOS will include Live Captions for live conversations and app audio, along with support for Made for iPhone hearing devices.

These new accessibility features underscore Apple’s dedication to inclusive design and its commitment to using advanced technology to improve the lives of all users. Announced in conjunction with Global Accessibility Awareness Day, the updates highlight Apple’s ongoing effort to make accessibility a priority. With innovations such as Eye Tracking, Music Haptics, and Vocal Shortcuts, Apple continues to create products that meet the diverse needs of its global user base. This focus not only enriches the user experience but also sets a benchmark for the industry, emphasizing the importance of designing technology that truly works for everyone.

Source: Apple

ChatGPT, itself a potential tool for greater accessibility, was used as a research and writing aid for this blog post. Do you think this is an appropriate use of ChatGPT? Why or why not? Let me know!
