
Beethoven Would've Loved This: Apple Is About to Launch Music Haptics, Eye Tracking & Vocal Shortcuts

Photo by Zhiyue / Unsplash

Following the release of the updated Logic Pro with AI-powered bandmates and built-in stem separation, Apple has announced a slew of new features set to debut later this year, among them Eye Tracking, Music Haptics, and Vocal Shortcuts — all focused on improving the accessibility of Apple's devices and apps.

Songs will become more accessible & immersive with Music Haptics

The feature that caught our attention more than any other is Music Haptics, an accessibility improvement to Apple Music that offers a novel way for users who are deaf or hard of hearing to experience music on their iPhones. Using the iPhone's Taptic Engine, the feature translates a song's audio into tactile sensations, turning regular listening on Apple Music into a more immersive musical experience.

Credit: Apple

According to the press release, "Music Haptics works across millions of songs in the Apple Music catalog, and will be available as an API for developers to make music more accessible in their apps."

Definitely something Beethoven would've loved!

Eye Tracking to control iPads and iPhones

The Eye Tracking feature will allow users with physical disabilities to control an iPad or iPhone through eye movements alone. The feature — like everything today — is powered by AI: it uses the device's front-facing camera to set up and calibrate in seconds, and thanks to on-device machine learning, all the data used is kept securely on device and isn't shared with Apple.

Eye Tracking will enable users to navigate through any app's elements and use Dwell Control to access additional functions, such as physical buttons, swipes, and other gestures, using just their eyes, Apple shares.

According to the press release, no additional accessories or hardware are required to activate the functionality.

Vocal Shortcuts & Listen for Atypical Speech

Vocal Shortcuts and Listen for Atypical Speech further enhance the user experience by offering customisable voice commands and improved speech recognition, particularly beneficial for individuals with conditions affecting speech, including cerebral palsy, amyotrophic lateral sclerosis, or stroke. Vocal Shortcuts specifically allow users to assign "custom utterances" that Siri can understand to launch shortcuts and complete complex tasks.

Listen for Atypical Speech improves speech recognition by using on-device machine learning that adapts to the user's individual speech patterns.

An additional update, made specifically for users who are blind or have low vision, brings new voices to VoiceOver, along with a flexible Voice Rotor, custom volume control, and the ability to customise VoiceOver keyboard shortcuts on Mac.

For individuals at risk of losing their ability to speak, Personal Voice will soon be accessible in Mandarin Chinese. Users who find it challenging to pronounce or read complete sentences can generate a Personal Voice using abbreviated phrases.

For those who are nonspeaking, Live Speech will feature categorised options and will be compatible with Live Captions in real time.