Apple has announced a groundbreaking feature that will change how iPhone and iPad users interact with their devices: Eye Tracking. This tool harnesses the power of artificial intelligence (AI) to let users control their Apple devices using only their eyes. Once the stuff of science fiction, the technology is now set to become a reality for millions of users worldwide.
The Eye Tracking functionality uses the device's front-facing camera to set up and calibrate in seconds. Because it is powered by on-device machine learning, all data used for setup and control remains securely stored on the device and is not shared with Apple. The feature is part of a suite of new accessibility features introduced by Apple, reaffirming the company's commitment to inclusive design.
Apple CEO Tim Cook emphasized the company's dedication to innovation and accessibility, highlighting its continuous efforts to enrich the lives of all users. Eye Tracking will work across iPadOS and iOS apps, requiring no additional hardware or accessories. Using a capability called Dwell Control, users can navigate through app elements and activate each one by holding their gaze on it, accessing functions such as physical buttons, swipes, and other gestures solely with their eyes.
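Apple has not published implementation details, but the core idea behind dwell-based activation is simple: an on-screen element triggers once gaze samples stay within a small radius for a set duration. The sketch below illustrates that concept only; all names, types, and thresholds (DwellDetector, dwellThreshold, and so on) are hypothetical and are not Apple's actual API.

```swift
import Foundation

// Illustrative only: a minimal dwell-activation concept, not Apple's code.
struct GazeSample {
    let x: Double
    let y: Double
    let timestamp: TimeInterval
}

final class DwellDetector {
    private let dwellThreshold: TimeInterval   // seconds the gaze must hold
    private let tolerance: Double              // allowed gaze jitter, in points
    private var dwellStart: GazeSample?        // where the current dwell began

    init(dwellThreshold: TimeInterval = 1.0, tolerance: Double = 30.0) {
        self.dwellThreshold = dwellThreshold
        self.tolerance = tolerance
    }

    // Feed gaze samples in time order; returns true when a dwell completes.
    func process(_ sample: GazeSample) -> Bool {
        guard let start = dwellStart else {
            dwellStart = sample
            return false
        }
        let dx = sample.x - start.x
        let dy = sample.y - start.y
        if (dx * dx + dy * dy).squareRoot() > tolerance {
            // Gaze drifted too far: restart the dwell timer at the new point.
            dwellStart = sample
            return false
        }
        if sample.timestamp - start.timestamp >= dwellThreshold {
            dwellStart = nil  // reset so the element is not re-triggered
            return true
        }
        return false
    }
}

// Simulated gaze stream: a steady fixation on one point for 1.2 seconds.
let detector = DwellDetector()
for i in 0...12 {
    let sample = GazeSample(x: 100, y: 200, timestamp: Double(i) * 0.1)
    if detector.process(sample) {
        print("Dwell complete at t=\(sample.timestamp)s: activate element")
    }
}
```

In practice, a system like this would also need gaze smoothing and accidental-trigger safeguards; the tolerance and threshold values above are arbitrary placeholders chosen for the demonstration.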
The highly anticipated feature is scheduled to launch later this year and has already generated significant attention on social media platforms like X (formerly Twitter). While many applaud its potential to assist people with disabilities, some humorously speculate that it will encourage laziness, drawing comparisons to episodes of the show “Black Mirror.” Nonetheless, Eye Tracking represents a significant step forward in accessibility and technology.
Alongside Eye Tracking, Apple introduced other notable features aimed at enhancing user experiences, such as Vehicle Motion Cues to reduce motion sickness and Music Haptics for those who are deaf or hard of hearing. These advancements underscore Apple’s commitment to leveraging technology for the benefit of all users.