
Photo: Ryan Anson/AFP (Getty Images)

There’s no lack of speculation around Apple’s rumored AR headset, but the latest is the most sci-fi of them all. According to reliable Apple analyst Ming-chi Kuo, it’s possible the headset will eschew handheld controllers in favor of eye tracking and iris recognition.


Per AppleInsider, Kuo’s investor note on the topic says the headset will use a “specialized transmitter” to track eye movement and blinking. According to Kuo, the transmitter emits “wavelengths of invisible light” that bounce off your eyeball; a receiver then picks up the reflected light, and the changes in light patterns are analyzed to determine where you’re looking.
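That description lines up roughly with how existing infrared eye trackers work: find the pupil and the glint of the emitter on the cornea, and the offset between the two tells you where the eye is pointed. Here’s a rough sketch of that general idea in Swift (everything here is hypothetical and illustrative, not Apple’s actual hardware or software):

```swift
import Foundation

// Purely illustrative sketch of pupil-center/corneal-reflection style gaze
// estimation, the general technique Kuo's description resembles. Nothing here
// reflects Apple's actual hardware or software.

struct EyeFrame {
    // 2D positions (in sensor pixels) detected from the reflected invisible light.
    let pupilCenter: SIMD2<Double>
    let cornealGlint: SIMD2<Double>  // reflection of the emitter on the cornea
}

struct GazeEstimator {
    // Hypothetical calibration gains mapping the pupil-glint offset to display space.
    var gainX = 1.0
    var gainY = 1.0

    // The pupil moves relative to the (mostly stationary) glint as the eye rotates,
    // so the offset between the two is a simple proxy for gaze direction.
    func gazePoint(from frame: EyeFrame) -> SIMD2<Double> {
        let offset = frame.pupilCenter - frame.cornealGlint
        return SIMD2(offset.x * gainX, offset.y * gainY)
    }

    // If the pupil disappears from the sensor for a short stretch, call it a blink.
    func isBlink(pupilMissingFor interval: TimeInterval) -> Bool {
        interval > 0.1 && interval < 0.4  // rough human blink duration, for illustration only
    }
}

// Example with one synthetic frame.
let frame = EyeFrame(pupilCenter: SIMD2(312.0, 240.0), cornealGlint: SIMD2(300.0, 238.0))
print("Estimated gaze offset:", GazeEstimator().gazePoint(from: frame))
```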

That data could then be used to better tailor a user’s interactions within an AR environment. It could also allow people to control menus by blinking, or perhaps learn more about an object by staring at it for a certain period of time. And it could free up processing power, since anything in your peripheral vision could be rendered at a reduced resolution.
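That last trick is the idea behind what’s known as foveated rendering. Here’s a toy sketch of how a renderer might pick a quality level based on how far a region of the display sits from the gaze point (again, hypothetical names and thresholds, not any real Apple API):

```swift
// Toy sketch of gaze-based ("foveated") quality selection: regions far from where
// the user is looking get rendered at a lower resolution scale. Hypothetical
// names and thresholds, not a real Apple API.

enum RenderQuality: Double {
    case full = 1.0      // around the gaze point: full resolution
    case half = 0.5      // near periphery
    case quarter = 0.25  // far periphery
}

func quality(forRegionAt region: SIMD2<Double>, gaze: SIMD2<Double>) -> RenderQuality {
    // On a headset this would be angular distance from the gaze ray;
    // plain pixel distance stands in for it here.
    let offset = region - gaze
    let distance = (offset.x * offset.x + offset.y * offset.y).squareRoot()
    switch distance {
    case ..<200.0: return .full
    case ..<600.0: return .half
    default:       return .quarter
    }
}

let gaze = SIMD2(960.0, 540.0)  // the user is looking at the center of the view
print(quality(forRegionAt: SIMD2(1000.0, 560.0), gaze: gaze))  // full
print(quality(forRegionAt: SIMD2(1800.0, 100.0), gaze: gaze))  // quarter
```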

Where this gets kicked up another notch is iris recognition. While Kuo isn’t sure this is a bona fide feature, he says the “hardware specifications suggest the HMD’s [head-mounted display] eye-tracking system can support this function.” Iris recognition is a big deal; we’ve all seen spy movies where it’s used as a form of biometric identification. It could add an extra layer of security, making sure no one else can use your device (and these devices will not be cheap). In a more everyday sense, it could also be used for services like Apple Pay.
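If iris recognition does show up, the payment case would presumably work the way Face ID does today: compare a fresh scan against an enrolled template and only approve the transaction on a strong match. A purely hypothetical sketch of that gating logic (made-up types, scoring, and threshold; no such Apple API has been announced):

```swift
// Entirely hypothetical sketch of iris recognition gating a payment. Apple has
// announced no such API; the feature vector, scoring, and threshold are made up.

struct IrisTemplate { let features: [Double] }

// Toy similarity score: 1.0 means the two feature vectors are identical.
func matchScore(_ a: IrisTemplate, _ b: IrisTemplate) -> Double {
    guard a.features.count == b.features.count, !a.features.isEmpty else { return 0 }
    let totalError = zip(a.features, b.features).map { abs($0.0 - $0.1) }.reduce(0, +)
    return max(0, 1 - totalError / Double(a.features.count))
}

func authorizePayment(enrolled: IrisTemplate, scanned: IrisTemplate) -> Bool {
    matchScore(enrolled, scanned) > 0.95  // arbitrary confidence threshold
}

let enrolled = IrisTemplate(features: [0.12, 0.80, 0.33])
let scanned = IrisTemplate(features: [0.11, 0.81, 0.34])
print(authorizePayment(enrolled: enrolled, scanned: scanned) ? "Payment approved" : "Try again")
```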

One of the biggest problems with mixed reality and virtual reality is that there’s no great way to interact with what you’re seeing. Enterprise headsets like Microsoft’s HoloLens 2 and the Google Glass Enterprise Edition 2, as well as previous consumer versions like Focals by North, all relied on some iteration of hand controls or finger loops. They work, but calibration is an issue and the process can be clunky. Eye tracking, if done well, is a potential game changer, as you don’t have to keep track of another accessory or memorize a set of controls.

This interface problem is well known among companies trying to create AR gadgets for consumers, and Apple isn’t the only one looking for a novel solution. Facebook recently revealed that it’s envisioning wrist-based wearables that could let you control AR with your mind. It’s far too early to tell which of these methods (or one we haven’t even heard of yet) will win out in the end. Previously, Kuo noted that Apple’s mixed reality headset is likely to come in 2022, with smart glasses following in 2025. Facebook is expected to launch some kind of smart glasses this year, but the futuristic control methods it’s described are likely further down the line. That said, I will definitely take eye tracking over haptic socks any day.
