
Apple Introduces Eye-Tracking in iPhones and iPads

By Hilary Ong

May 18, 2024


Apple has announced the introduction of eye-tracking capabilities to recent iPhone and iPad models, enhancing accessibility features across its devices. This update arrives just in time for Global Accessibility Awareness Day, demonstrating Apple’s ongoing commitment to making its technology more accessible to users with disabilities.

The new features, which include built-in eye-tracking, customizable vocal shortcuts, and music haptics, aim to improve user interaction without the need for additional hardware, leveraging the latest in on-device AI technology.

Eye-Tracking Technology in iOS and iPadOS

The newly integrated eye-tracking technology allows users of newer iPhone and iPad models (those with an A12 chip or later) to navigate their devices using only their gaze. The feature relies on the front-facing camera to track where a user is looking, so they can move through apps and menus and select on-screen elements via a mechanism Apple calls “Dwell Control.”

This feature, already part of Apple’s accessibility settings on devices such as the Mac, enables interaction without physical touch: pausing briefly on a screen item triggers the corresponding action. Because the eye-tracking functionality is built directly into the operating systems, it is compatible with third-party applications at launch.
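
For readers curious how dwell-based selection works in principle, the sketch below shows the core idea in Swift: a gaze point that stays inside a target’s bounds long enough triggers a selection. The class name, threshold, and logic are illustrative assumptions, not Apple’s implementation.

```swift
import CoreGraphics
import Foundation

// A minimal sketch of dwell-based selection (hypothetical; not Apple's
// Dwell Control API): if the gaze point stays inside a target's bounds
// for `dwellThreshold` seconds, report a selection.
final class DwellSelector {
    private let dwellThreshold: TimeInterval
    private var dwellStart: Date?
    private var currentTarget: CGRect?

    init(dwellThreshold: TimeInterval = 1.0) {
        self.dwellThreshold = dwellThreshold
    }

    // Feed gaze samples (e.g. estimated from the front-facing camera).
    // Returns true once the gaze has dwelled on `target` long enough.
    func update(gazePoint: CGPoint, target: CGRect, now: Date = Date()) -> Bool {
        guard target.contains(gazePoint) else {
            dwellStart = nil            // gaze left the target: reset the timer
            currentTarget = nil
            return false
        }
        if currentTarget != target {
            currentTarget = target      // gaze landed on a new target: start timing
            dwellStart = now
            return false
        }
        if let start = dwellStart, now.timeIntervalSince(start) >= dwellThreshold {
            dwellStart = nil            // fire once per dwell
            return true
        }
        return false
    }
}
```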

Enhancements to Vocal Interactions

In addition to eye-tracking, Apple is refining its voice control capabilities. Users can now set up customized vocal shortcuts for hands-free control over their devices. These shortcuts can be triggered by specific words, phrases, or even distinct sounds, which the device recognizes and processes through Siri without the need for preliminary commands.
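
Apple has not published a developer API for Vocal Shortcuts, but the underlying pattern, matching a trigger phrase against live speech, can be approximated with the existing Speech framework. The sketch below is a rough illustration under that assumption; the class, trigger phrase, and action are hypothetical, and a real app would first request microphone and speech-recognition permissions.

```swift
import Speech
import AVFoundation

// A rough sketch of a custom vocal shortcut using Apple's Speech framework
// (an assumption about the general approach; not how Apple's feature works
// internally). Requires SFSpeechRecognizer.requestAuthorization(_:) and
// microphone permission before use.
final class VocalShortcut {
    private let recognizer = SFSpeechRecognizer()
    private let audioEngine = AVAudioEngine()
    private let request = SFSpeechAudioBufferRecognitionRequest()
    private var task: SFSpeechRecognitionTask?
    private let triggerPhrase: String
    private let action: () -> Void

    init(triggerPhrase: String, action: @escaping () -> Void) {
        self.triggerPhrase = triggerPhrase.lowercased()
        self.action = action
        request.requiresOnDeviceRecognition = true  // keep audio processing on-device
    }

    func start() throws {
        let input = audioEngine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
            self.request.append(buffer)             // stream mic audio to the recognizer
        }
        audioEngine.prepare()
        try audioEngine.start()

        task = recognizer?.recognitionTask(with: request) { result, _ in
            guard let text = result?.bestTranscription.formattedString.lowercased() else { return }
            if text.contains(self.triggerPhrase) {
                self.action()                       // phrase heard: run the shortcut
            }
        }
    }
}
```

A caller might write, for example, `try VocalShortcut(triggerPhrase: "open mail") { /* open the app */ }.start()`, where both the phrase and the action are placeholders.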

Furthermore, Apple has introduced “Listen for Atypical Speech,” a tool designed to adapt voice recognition to unique speech patterns, enhancing accessibility for users with speech impairments.

Music Haptics and Vehicle Assistance

For users who are deaf or hard of hearing, Apple is introducing haptic feedback for music playback, starting with its Apple Music app. This feature will allow users to experience the music through vibrations that accompany the audio, adding a tactile dimension to the listening experience.
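
Apple has not described how Music Haptics is implemented, but the general idea of translating audio into vibration can be sketched with the public Core Haptics framework. Below is a minimal, assumption-laden illustration that maps a beat’s loudness to the strength of a haptic tap; the function and its scaling are hypothetical.

```swift
import CoreHaptics

// A minimal sketch mapping audio loudness (0...1) to a haptic "tap" via
// Core Haptics. This is an assumption about the general technique, not
// Apple's Music Haptics code; a real app would reuse one engine rather
// than creating one per beat.
func playBeatHaptic(loudness: Float) throws {
    let engine = try CHHapticEngine()
    try engine.start()

    let value = max(0, min(1, loudness))            // clamp to the valid range
    let intensity = CHHapticEventParameter(parameterID: .hapticIntensity, value: value)
    let sharpness = CHHapticEventParameter(parameterID: .hapticSharpness, value: 0.5)
    let event = CHHapticEvent(eventType: .hapticTransient,
                              parameters: [intensity, sharpness],
                              relativeTime: 0)

    let pattern = try CHHapticPattern(events: [event], parameters: [])
    let player = try engine.makePlayer(with: pattern)
    try player.start(atTime: CHHapticTimeImmediate)
}
```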

Additionally, Apple has made improvements to CarPlay, incorporating voice control, color filters, and enhanced text visibility. A new feature for iPhone and iPad, “Vehicle Motion Cues,” is designed to help passengers who experience motion sickness by displaying on-screen visual cues that align with the vehicle’s movements, potentially reducing discomfort.
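
Apple has not detailed how Vehicle Motion Cues works under the hood, but one plausible reading of “cues that align with the vehicle’s movements” is that device motion data drives the on-screen indicators. The Swift sketch below illustrates that assumption with Core Motion; the class, offset mapping, and 40-point scale are invented for illustration.

```swift
import CoreMotion

// A speculative sketch: drive hypothetical on-screen cue dots from device
// motion via Core Motion. Apple has not published how Vehicle Motion Cues
// is actually implemented.
final class MotionCueDriver {
    private let motion = CMMotionManager()

    // `onUpdate` receives a lateral offset (in points) for the cue dots,
    // proportional to sensed sideways acceleration.
    func start(onUpdate: @escaping (_ lateralOffset: Double) -> Void) {
        guard motion.isDeviceMotionAvailable else { return }
        motion.deviceMotionUpdateInterval = 1.0 / 60.0   // refresh at ~60 Hz
        motion.startDeviceMotionUpdates(to: .main) { data, _ in
            guard let accel = data?.userAcceleration else { return }
            onUpdate(accel.x * 40)  // map sideways g-force to a small offset (arbitrary scale)
        }
    }

    func stop() { motion.stopDeviceMotionUpdates() }
}
```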

Collaborations and Further Innovations

The speech-related features were developed in collaboration with the Speech Accessibility Project at the Beckman Institute for Advanced Science and Technology at the University of Illinois Urbana-Champaign. Apple’s work in this area is part of a broader effort that includes other technology leaders, such as Google and Amazon, to push accessibility enhancements forward across platforms and devices.

Apple has not specified exact release dates for these features but has indicated that they will arrive in upcoming updates to iOS and iPadOS. With its developer conference, WWDC, on the horizon, more details, and possibly the features themselves, are expected to coincide with the event.


Featured Image courtesy of Apple

