WWDC might be a month away, but Apple has revealed a bunch of new accessibility features ahead of the developer conference – and one in particular has taken the internet by storm. No longer confined to the realm of Vision Pro, eye tracking has come to iPad and iPhone, letting users control the devices using their eyes.
Likely to be released as part of iOS 18, which we'll see debuted at WWDC, the new features use Apple silicon and AI to "further Apple's decades-long commitment to designing products for everyone."
Without the need for any additional hardware, eye tracking uses the front-facing camera to set up and calibrate in seconds. According to Apple, the tool lets users navigate through the elements of an app and use Dwell Control to "activate each element, accessing additional functions such as physical buttons, swipes, and other gestures solely with their eyes."
"Jokes and funsies aside, this is a game changer and a legit humane thing I am not kidding when I say that this is the most apple thing aside from their behaviour in other stuff just more humane and useful technology in good use" https://t.co/toLz6DX24j (15 May 2024)
"Sometimes the accessibility features are the most coolest/impressive. A lot of us aren't in the target audience for these features but I know many people, who don't need these features, will still try or even use them anyways 😂" https://t.co/6hcCVEJsjv (15 May 2024)
Along with eye tracking, Apple also revealed Music Haptics, which plays taps, textures, and refined vibrations alongside the audio of a song to help users with hearing impairments enjoy music. And Vehicle Motion Cues can help reduce motion sickness, with animated dots on the edges of the screen representing changes in vehicle motion "to help reduce sensory conflict without interfering with the main content."
It's curious that Apple opted to launch these features via a press release rather than waiting for WWDC, but it could mean there's a whole host of super impressive updates on the way. And we'd bet money on them involving AI.