CUPERTINO, CALIFORNIA - Powered by artificial intelligence, Eye Tracking gives users a built-in option for navigating iPad and iPhone with just their eyes.
Designed for users with physical disabilities, Eye Tracking uses the front-facing camera to set up and calibrate in seconds. Because the feature relies on on-device machine learning, all data used to set up and control it is kept securely on device and isn’t shared with Apple.
“We believe deeply in the transformative power of innovation to enrich lives,” said Tim Cook, Apple’s CEO.
“That’s why for nearly 40 years, Apple has championed inclusive design by embedding accessibility at the core of our hardware and software. We’re continuously pushing the boundaries of technology, and these new features reflect our long-standing commitment to delivering the best possible experience to all of our users.”
Eye Tracking works across iPadOS and iOS apps, and doesn’t require additional hardware or accessories.
With Eye Tracking, users can navigate through the elements of an app and use Dwell Control to activate each one, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes.
Eye Tracking is one of several new accessibility features designed to serve users with a wide range of needs.