A Month with Eye-Tracking Technology: My Experience

Eye Tracking on Smartphones: A Look at the Past, Present, and Future
Smartphones, while incredibly powerful, have become somewhat homogeneous in their design and interaction methods. We used to see a wider variety of phone designs, unique navigation systems, and interesting materials. But a nearly forgotten technology might change that: eye tracking. This article explores the history and current state of eye-tracking technology in smartphones, focusing on its potential and limitations.
A Brief History of Eye Tracking on Phones
The concept of eye tracking is far from new; methods for tracking eye movement were being developed as far back as 1879. However, its application to smartphones is relatively recent.
One of the earliest examples of eye tracking in a smartphone was the Samsung Galaxy S4, with its Smart Scroll feature. This basic system let users scroll through content by tilting their heads while following the text with their eyes: looking toward the bottom of the screen scrolled the content upward, and vice versa. It was a rudimentary implementation, but it demonstrated the concept's potential.
Other manufacturers experimented with gesture-based control using front-facing sensors: LG's Air Motion on the G8 used a time-of-flight camera, while Google's Motion Sense on the Pixel 4 relied on a miniature radar chip. Honor, however, has recently taken a bolder step, integrating eye tracking more centrally into its user interface.
Honor Magic Series: Eye-Gaze Control as a Core Feature
Honor has integrated eye tracking technology into its Magic OS, making it a more significant part of the user experience than previous attempts. The system allows users to interact with notifications and, potentially, other aspects of the phone’s interface using only their eyes.
The process involves a simple calibration step where the user follows points on the screen to teach the system their gaze patterns. Once calibrated, a user can expand a notification by looking at it; maintaining their gaze for a short period opens the associated app.
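Honor has not published the details of its calibration math, but conceptually a step like this can be sketched as fitting a mapping from raw gaze estimates to the known on-screen points the user was asked to look at. The sketch below (all values hypothetical, and the affine model a simplifying assumption) uses a least-squares fit:

```python
import numpy as np

# Hypothetical calibration sketch: the user looks at known on-screen
# points, and we fit an affine map from raw gaze estimates (e.g.
# normalized pupil positions) to screen coordinates. The raw gaze
# values below are invented for illustration.

raw_gaze = np.array([[0.1, 0.2], [0.8, 0.2], [0.1, 0.9], [0.8, 0.9]])
screen_pts = np.array([[50, 100], [1030, 100], [50, 2300], [1030, 2300]])

# Solve screen ≈ [raw, 1] @ A  (least-squares affine fit)
X = np.hstack([raw_gaze, np.ones((len(raw_gaze), 1))])
A, *_ = np.linalg.lstsq(X, screen_pts, rcond=None)

def gaze_to_screen(g):
    """Map a raw gaze estimate to predicted screen coordinates."""
    return np.array([g[0], g[1], 1.0]) @ A

print(gaze_to_screen([0.45, 0.55]))  # -> approximately [540., 1200.] (mid-screen)
```

Real systems are far more involved (head pose, per-eye models, drift correction), but the principle is the same: a short supervised fit that personalizes the gaze-to-screen mapping.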
While the full potential of Honor’s system is still in development, the ability to control notifications with eye gaze is already available in Magic OS 9. This capability is particularly useful in situations like driving, where hands-free operation is crucial. Imagine glancing at a notification, then having it automatically expand to reveal its contents—all without taking your hands off the wheel.
Practical Applications and Usefulness
Eye-tracking technology on smartphones currently has some niche but powerful applications. It is especially useful when users want to interact with their phone as little as possible: a quick glance can expand a notification without reaching for the device or breaking focus on the task at hand, helping maintain workflow.
The accuracy of existing eye-tracking systems is impressive, even at a slight distance from the phone. The potential for future development, encompassing scrolling, app opening and closing, and more comprehensive navigation commands with eye movements, is exciting.
Eye Tracking on iPhones and Android Phones: Accessibility Features
It’s important to note that basic eye-tracking capabilities already exist on many smartphones, though they’re typically relegated to accessibility features. Apple added an Eye Tracking option to the accessibility settings in iOS 18, supported on iPhones as far back as the iPhone XS. Similarly, most Android devices offer Switch Access, an accessibility feature whose camera-based switches use face and eye gestures for navigation. While some devices have this built-in, others, such as some Samsung Galaxy phones, may require installing the Switch Access app.
Enabling these features usually requires navigating through several accessibility menus. Once enabled, the system uses the phone’s front camera to track your eyes and control an on-screen pointer; holding a focused gaze on an item for a few seconds (a "dwell") registers as a tap. This differs from Honor’s more seamless integration: activation requires explicit menu taps or button holds, which undercuts the effortless hands-free experience. It’s also worth remembering that these native features are intended primarily to assist users with disabilities, not to provide a fully hands-free experience for general use.
Enabling Eye Tracking on iPhones:
- Go to Settings > Accessibility > Eye Tracking, then turn on Eye Tracking.
- Follow the onscreen instructions to calibrate Eye Tracking.
- An onscreen pointer will follow your eye movements. An outline appears around the item you are looking at.
- Hold your gaze steady to activate a tap action.
Enabling Eye Tracking on Android Phones:
- Go to Settings > Accessibility > Switch Access (install the app if needed on Samsung devices).
- Grant the necessary permissions.
- In “Camera Switch Settings,” turn on “Use Camera Switches.”
- Now you can use various head and eye gestures (looking left, raising eyebrows, etc.).
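The dwell-based "gaze tap" these accessibility features describe can be sketched as a simple timer over a stream of gaze samples. The sample data, target names, and two-second threshold below are all hypothetical:

```python
# Simplified dwell-time "gaze tap" logic. In a real system the
# (timestamp, target) samples would come from a gaze-estimation
# pipeline; here they are hand-written for illustration.

DWELL_SECONDS = 2.0  # how long the gaze must rest on one item to count as a tap

def detect_dwell_taps(samples, dwell_seconds=DWELL_SECONDS):
    """Given (timestamp, target_id) gaze samples, return the targets
    that were fixated continuously for at least dwell_seconds."""
    taps = []
    current = None   # item currently being looked at
    start = None     # when the current fixation began
    fired = False    # whether this fixation already produced a tap
    for t, target in samples:
        if target != current:
            # Gaze moved to a new item (or off-screen): restart the timer.
            current, start, fired = target, t, False
        elif not fired and target is not None and t - start >= dwell_seconds:
            taps.append(target)
            fired = True  # one tap per fixation
    return taps

samples = [
    (0.0, "notification"), (0.5, "notification"), (1.0, "notification"),
    (2.1, "notification"),                      # held long enough -> tap
    (2.5, None), (3.0, "home"), (3.4, "home"),  # too brief -> no tap
]
print(detect_dwell_taps(samples))  # -> ['notification']
```

The dwell threshold is the key usability trade-off: too short and stray glances trigger accidental taps, too long and the feature feels sluggish, which is why both Apple and Google expose it as an adjustable setting.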
Key Differences and Potential
The primary difference between Honor’s approach and that of Apple and Google lies in the integration. Native eye-tracking features on iPhones and Android phones live within the accessibility settings and prioritize assistance for users with disabilities. Honor, by contrast, is promoting eye tracking as a potential core feature for all users: an intuitive, hands-free interaction style.
Many believe that eye tracking has significantly more potential than voice control. Voice commands can be slow and disruptive, especially in busy or shared environments. Eye tracking, by contrast, could offer a seamless, natural alternative to touch interaction, saving time and keeping the screen free of smudges.
Conclusion
Eye tracking is a technology that is advancing rapidly. While its current applications are limited, its potential is huge. Honor’s implementation shows a more compelling vision of how this technology could be used daily, potentially revolutionizing how we interact with our smartphones. Whether it’s a "gimmick" or a "game-changer" remains to be seen, but the future of eye tracking on mobile devices looks promising. What are your thoughts on this technology and its possible impact?