

Apple continues to innovate with every new iPhone, and although we have to wait a year for the next device, the tech giant keeps things ticking along with regular iOS updates.
There was plenty of backlash against the release of iOS 18, especially over the disappointing and slow rollout of Apple Intelligence. With over 1.56 billion iPhone users around the world, Apple has a captive market.
Still, things sometimes move so fast that we miss out on some of these important innovations. iOS 18.5 was branded a 'nonsense' update that didn't deserve its .5 moniker, and while every update tends to attract the same critiques about being underwhelming and draining our batteries, at least Apple is trying to do something.
In particular, the iPhone is packed with important accessibility features that tend to fly under the radar.
Apple is already changing things up with the revamped release of iOS 26 and the upcoming Liquid Glass update, but heading back to iOS 18, it seems many of you have missed out on a futuristic tracking feature. That's right, you can control your iPhone with your eyes.
Over on Reddit, u/No-Cartoonist-9838 was shocked to uncover the 'Eye Tracking feature'.
Announced in March 2024, Eye Tracking is an accessibility feature designed for those with physical disabilities, allowing them to navigate their iPhone or iPad using just their eyes. Of course, you don't need to have issues with your sight to try it out.
Apple itself writes: "Eye Tracking uses the front-facing camera to set up and calibrate in seconds, and with on-device machine learning, all data used to set up and control this feature is kept securely on device, and isn’t shared with Apple."
Those who've stumped up for the pricey Vision Pro will know about Eye Tracking, although the iPhone version apparently isn't as reliable. Responding to the OP, one person said: "It's hard because neither your head nor your phone is in a fixed position. That's why it's sometimes off, but works so great on the Vision Pro."
Others were similarly unimpressed and warned that it's not quite the Minority Report future you might hope it is.
Another added: "Tried it on my 15 pro and it’s still pretty sh*t. It works for about 2 minutes but if you move your head or the phone really at all it starts having issues. Also doesn’t really work with glasses."
Also discovering Eye Tracking for the first time, someone else explained: "Just tried it out. It sucks. Big time. iPhone 13. Perhaps it’s better on phones with the most recent FaceTime camera module. Not really surprised. It sounds like a difficult problem to solve outside of a controlled environment (like the Vision Pro)."
Another defended the feature and concluded: "It's made to be used in a controlled environment, like mounted to a wheelchair for people without mobility to touch their phones/iPad, thus it being an accessibility option, not a parlor trick."
Other accessibility features you might not know about include music accessibility with the Taptic Engine, Vocal Shortcuts, and Vehicle Motion Cues. iOS 26 looks to be even smarter, improving Braille support and Personal Voice, and adding Accessibility Reader.