Look Away: How Apple Vision Pro’s Eye Tracking Tech Could Be a Hacker’s Dream Come True
- The Apple Vision Pro is Apple’s first headset that supports mixed reality (MR).
- One of the Apple Vision Pro’s features is eye tracking that uses your gaze in place of a cursor.
- GAZEploit uses two biometrics, eye aspect ratio and gaze estimation, extracted from recordings of the user’s virtual avatar (Persona).
Apple Vision Pro Vulnerability: GAZEploit Tracks User’s Eye Movements

What is GAZEploit?
The Apple Vision Pro is Apple’s first headset that supports mixed reality (MR). While it is an expensive device, there have been reports of inconveniences, such as being unable to boot again after forgetting the unlock password, or the front glass cracking on its own. Now, researchers report that a new attack on the Apple Vision Pro, called GAZEploit, can track the user’s eye movements as they type.
One of the Apple Vision Pro’s features is eye tracking that uses your gaze in place of a cursor. For example, you can answer a prompt by looking at Yes on the left side of your field of vision or No on the right, or type by simply looking at the keys of a keyboard displayed in front of you. Gaze input not only improves the user experience, it also makes it harder for bystanders or people on a video call to see what you are entering, compared with hand gestures or voice input when typing a password.
How Does GAZEploit Work?
GAZEploit uses two biometrics, eye aspect ratio and gaze estimation, extracted from recordings of the user’s virtual avatar (Persona). It analyzes these readings to distinguish typing from other Apple Vision Pro activities, such as watching videos or playing games, and then maps the extracted gaze movements onto a virtual keyboard layout to recover likely keystrokes. The study found that while typing, gaze direction tends to be focused and cyclical, and blinking decreases.
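To make the two biometrics concrete, here is a minimal sketch in Python. The eye aspect ratio follows the standard six-landmark EAR formula (vertical landmark distances over the horizontal distance); the keyboard layout and the normalized gaze coordinates are illustrative assumptions, not details from the GAZEploit paper.

```python
import math

def eye_aspect_ratio(landmarks):
    """landmarks: six (x, y) points around one eye, ordered p1..p6 as in
    the common 6-point eye model. A low EAR means the eye is nearly
    closed (a blink); typing sessions show fewer blinks."""
    p1, p2, p3, p4, p5, p6 = landmarks
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])
    # Two vertical distances averaged, normalized by eye width.
    return (dist(p2, p6) + dist(p3, p5)) / (2.0 * dist(p1, p4))

# Hypothetical QWERTY row layout, used only for illustration.
ROWS = ["qwertyuiop", "asdfghjkl", "zxcvbnm"]

def gaze_to_key(gx, gy):
    """Map a normalized gaze point (0..1, 0..1) on the virtual
    keyboard plane to the nearest key in the grid above."""
    row = min(int(gy * len(ROWS)), len(ROWS) - 1)
    keys = ROWS[row]
    col = min(int(gx * len(keys)), len(keys) - 1)
    return keys[col]
```

For example, a gaze point near the top-left of the keyboard plane, `gaze_to_key(0.0, 0.0)`, resolves to "q", while one near the top-right resolves to "p".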
The GAZEploit study was evaluated on data from 30 individuals and showed high accuracy in detecting when users typed messages, passwords, URLs, email addresses, and passcodes. The researchers developed an algorithm that calculates gaze stability and sets thresholds for classifying gaze movements, and reported that on their dataset it achieved a precision of 85.9% and a recall of 96.8% for identifying keystrokes within a typing session.
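The gaze-stability idea above can be sketched as a simple threshold classifier: during a keystroke the gaze fixates on one key, so the spread of recent gaze samples drops. The window size and threshold below are illustrative assumptions, not values from the paper.

```python
from statistics import pvariance

def gaze_stability(window):
    """Spread of gaze samples in a window: sum of the x and y variances.
    Lower values mean a steadier gaze (a likely fixation on a key)."""
    xs = [p[0] for p in window]
    ys = [p[1] for p in window]
    return pvariance(xs) + pvariance(ys)

def detect_fixations(samples, window=5, threshold=0.0004):
    """Return indices where the trailing window of gaze samples is
    stable enough to count as a fixation (a candidate keystroke)."""
    hits = []
    for i in range(window, len(samples) + 1):
        if gaze_stability(samples[i - window:i]) < threshold:
            hits.append(i - 1)
    return hits
```

A steady run of identical samples registers as a fixation, while a moving (saccadic) run does not; an attacker would then pass each detected fixation through a gaze-to-key mapping to recover the keystroke sequence.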
Apple’s Response to GAZEploit
Apple acknowledged GAZEploit as a bug that allowed virtual-keyboard input to be inferred from a user’s Persona, and announced in the visionOS 1.3 release that the issue was addressed by suspending Persona while the virtual keyboard is active. In June 2024, researchers also reported what they called the first spatial computing hack, in which simply visiting a website on an Apple Vision Pro could make spiders, bats, and other objects appear in the user’s field of view; Apple quickly released a patch for that issue as well.
The report notes that Apple fixed the vulnerability quickly, but that new attack vectors are likely to keep emerging as VR and AR capabilities expand. The study emphasizes that strong privacy protections will become essential as VR technology spreads, and that as immersive systems collect increasingly rich behavioral data, balancing user experience with data protection will be key to widespread adoption.
