February 5, 2020
The latest (high-end) enterprise VR headsets have something in common: The integration of eye tracking. It seems eye tracking is becoming the standard for second-generation virtual reality headsets, and there are a number of compelling applications that should be on your radar.
Gaze Matters
Eye tracking isn't new; the technology has long been used in research, including marketing and medical research. With recent advances, however, the data has become fast and easy to collect, and as a result it's becoming easier and faster to train algorithmic models on eye tracking data. Eye (or gaze) tracking measures where a person is looking. Combined with head tracking and, potentially, additional biometric sensors (e.g., EEG), it can tell you all kinds of things about the user.
In VR, eye tracking has both user interface and performance implications that make the experience more immersive. For one, it can serve as an input method, a way of interacting with content in the virtual world: Gaze is a natural user interface, letting you select or grab a virtual object simply by looking at it, and it enhances social interaction between users in multiuser VR simulations. Input aside, eye tracking improves headset performance, making it possible to reduce GPU load and power consumption, which in turn lets device manufacturers boost screen resolution and frame rates.
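To make the input idea concrete, here is a minimal sketch of gaze-based selection: cast a ray from the tracked eye position along the gaze direction and pick whichever object lies closest to that ray. The object format and angular tolerance are illustrative assumptions for this sketch, not any particular vendor's API.

```python
# Minimal sketch of gaze-based selection: cast a ray from the tracked eye
# origin along the gaze direction and pick the object closest to that ray.
# The scene/object representation here is hypothetical, not a vendor SDK.
import numpy as np

def pick_by_gaze(gaze_origin, gaze_dir, objects, max_angle_deg=2.0):
    """Return the object whose center lies closest to the gaze ray,
    within a small angular tolerance (dwell/confirm logic omitted)."""
    gaze_dir = np.asarray(gaze_dir, dtype=float)
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best, best_angle = None, max_angle_deg
    for obj in objects:  # obj = {"name": str, "center": (x, y, z)}
        to_obj = np.asarray(obj["center"], dtype=float) - np.asarray(gaze_origin, dtype=float)
        to_obj = to_obj / np.linalg.norm(to_obj)
        angle = np.degrees(np.arccos(np.clip(np.dot(gaze_dir, to_obj), -1.0, 1.0)))
        if angle < best_angle:
            best, best_angle = obj, angle
    return best

# Example: the user looks slightly to the right and selects the valve.
scene = [{"name": "valve", "center": (0.2, 0.0, 2.0)},
         {"name": "gauge", "center": (-0.5, 0.3, 2.0)}]
print(pick_by_gaze((0, 0, 0), (0.1, 0.0, 1.0), scene))  # -> the "valve" object
```

A production implementation would typically add a dwell timer or a confirmation gesture so users don't select everything they merely glance at.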
To explain, I have to get technical for a moment: Eye tracking enables foveated rendering, whereby only the portion of the image the user is looking at (more specifically, the part falling on the fovea, the region of the retina where visual acuity is highest) is rendered at full detail. The rest of the scene is rendered at lower detail, but you can't tell in your peripheral vision. Essentially, foveated rendering imitates the way we actually see, keeping the center of vision in sharpest focus and blurring (reducing the number of pixels shaded) the periphery, which greatly reduces the processing power required. In addition, VR headsets with eye tracking can measure interpupillary distance (IPD), a value unique to each person, and move the lenses into the optimal position for the user. The result of automatic IPD adjustment is reduced eye strain and a more comfortable virtual experience.
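As a rough illustration of the idea (a sketch, not any engine's actual implementation), a renderer with eye tracking can choose a shading rate per screen region based on how far that region sits from the gaze point; the zone radii and rates below are made-up values.

```python
# Rough sketch of foveated rendering's core decision: pick less shading
# detail the farther a screen region is from the tracked gaze point.
# Zone radii, rates and the pixel-to-degree factor are illustrative.
import math

def shading_rate(pixel_xy, gaze_xy, deg_per_pixel=0.05):
    """Map a pixel's angular distance (eccentricity) from the gaze point
    to a coarse shading rate: 1 = full rate, 4 = one sample per 4x4 block."""
    dx = pixel_xy[0] - gaze_xy[0]
    dy = pixel_xy[1] - gaze_xy[1]
    eccentricity = math.hypot(dx, dy) * deg_per_pixel  # degrees from the fovea
    if eccentricity < 5.0:      # foveal zone: render at full resolution
        return 1
    elif eccentricity < 15.0:   # near periphery: moderate reduction
        return 2
    else:                       # far periphery: aggressive reduction
        return 4
```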
Eye Tracking Market
One of the better-known eye tracking companies is Tobii, which, in addition to a wearable eye tracker that captures viewing behavior in the real world (and can be combined with other biometric devices for deeper insights), has an eye-tracking SDK and a VR analytics tool. The SDK enables gaze-based selection and can mirror a user’s eye movements in a VR avatar, while the analytics tool collects and analyzes eye data from virtual environments. WorldViz is another company, maker of Vizard VR Eye Tracking, which supports a range of VR headsets, projection systems, and other hardware. Vizard records gaze direction, pupil size, fixation timing and other parameters, and is aimed primarily at researchers. Pupil Labs sells eye tracking glasses (Pupil Invisible) along with binocular accessories that add eye tracking to leading AR/VR platforms like the Vive, Epson Moverio BT-300 and HoloLens. You might also remember Eyefluence, the eye-tracking startup focused on controlling devices with the human eye that was acquired by Google back in 2016.
Headsets with Eye Tracking
There are several devices marketed as eye tracking headsets, including FOVE (eye control in VR); LooxidVR, a research-oriented, mobile-powered headset with eye tracking and EEG; and StarVR, an enterprise headset with a near-human FOV that takes in peripheral vision for more lifelike and immersive VR scenarios (think flight simulation, heavy machinery training, etc.). Eye tracking is now becoming as ubiquitous as head and hand tracking in the most popular and talked-about XR devices. Interestingly, this group does not include the Oculus Rift S.
HTC VIVE Pro Eye
Price: $1,599 for a full kit; $999 for headset only
The business-oriented HTC VIVE Pro Eye has built-in Tobii eye tracking, which means you can implement gaze-oriented menu navigation, making hand controllers optional. You can see where a user is looking (gaze origin/direction) as well as for how long (heatmapping) and collect data on pupil position, pupil size and eye openness. Of the benefits of eye tracking in VR, HTC’s website mentions improving training simulations, enhancing collaboration (by reflecting eye movements/blinks in avatars), and gaining insights into user performance, interaction and intent.
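To give a sense of what that data looks like in practice, here is an illustrative per-frame gaze sample and a simple dwell-time helper. The field names are assumptions for this sketch, not the actual VIVE or Tobii SDK data structures.

```python
# Illustrative per-frame gaze sample covering the kinds of signals HTC lists
# (gaze origin/direction, pupil position/size, eye openness). Field names are
# assumptions for this sketch, not real VIVE/Tobii SDK structures.
from dataclasses import dataclass

@dataclass
class GazeSample:
    timestamp_ms: int
    gaze_origin: tuple       # (x, y, z) in headset space
    gaze_direction: tuple    # unit vector
    pupil_position: tuple    # normalized (x, y) within the eye image
    pupil_diameter_mm: float
    eye_openness: float      # 0.0 = closed, 1.0 = fully open

def dwell_time_ms(samples, hit_test, frame_ms=11.1):
    """Accumulate how long the gaze hits a target region (heatmap-style),
    given a hit_test(sample) predicate supplied by the application."""
    return sum(frame_ms for s in samples if hit_test(s))
```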
Varjo VR-1
Price: €9,995 for the XR-1 Developer Edition
Varjo's human-eye-resolution, professional-grade headset has both integrated eye tracking and hand tracking by Ultraleap. The company says its 20/20 Eye Tracker is the most advanced stereo eye tracking technology integrated into a VR headset, with fast, accurate calibration and the ability to collect precise eye data for research, training, industrial design and more. Varjo says the tech can be used with glasses or contacts, and it's private: The data belongs to you, to share with third parties of your choice.
Varjo recommends its HMDs for research in VR/MR environments that would be “too costly, impractical or impossible” to create in the real world. You can track users’ eyes as they explore a virtual scene, capturing behavioral insights as they interact with objects and stimuli.
Pico Neo 2 Eye
Estimated shipping by March 31; for businesses only
At CES 2020, Pico Interactive announced its latest 4K VR headset, the Neo 2, an all-in-one device with inside-out, six-degrees-of-freedom (6DoF) head and controller tracking and support for wireless content streaming from a VR-ready PC. À la HTC, there's also a second version of the headset, the Neo 2 Eye, with built-in eye tracking from Tobii (enabling foveated rendering).
Nreal Light
Price: $1,199 to preorder the Nreal Light Developer Kit; $499 for the consumer kit expected to become available in early 2020
These lightweight mixed reality glasses could be a hit with consumers once they go on sale, but they're also for business (Nreal exhibited at EWTS 2019). Nreal recently unveiled Nebula, an Android-based operating system for the glasses, and announced a partnership with 7invensun to add eye tracking to the device, which will enable real-time gaze controls.
Of note: NextMind
This Paris-based brain-computer interface (BCI) startup debuted a $400 neural interface developer kit at CES last month. The company advocates using EEG data in concert with eye tracking to get a more complete picture of what a user really intends.
Applications of Eye-tracking VR Headsets
Academic researchers already use real-time or recorded eye movement data to assess attention, compare group behavior, and more. We could write a lot about scientific VR, but eye tracking opens a number of possibilities in VR for enterprise, such as iris recognition for increased security and more realistic social avatars (see applications below).
Training & Performance
Not only does eye tracking make for more immersive and realistic VR training simulations, it also enables real-time performance feedback and gaze-based triggers. For high-precision training – think flight simulations, surgical training and complex industrial training – the natural interaction and visual fidelity possible with eye tracking are key, on top of the inherent safety and repeatability of VR training in general. Consider the nuclear power industry: There are about 450 nuclear power reactors in the world, all of which operate under strict procedures. Training is difficult and physical simulators are expensive, especially for emergency scenarios, for tasks you might perform only once a year (or even less frequently), and for areas of the plant that are inaccessible much of the time due to factors like radiation. This is why Fortum, a Finnish energy company, has been building a dedicated, Varjo headset-equipped training room for control room operations.
Trainees can read virtual procedure manuals and control room displays as they would in the real world, and with eye tracking their performance can be assessed. Trainers can follow operators' eyes as they locate and read the correct manual in VR, know whether they're checking the right value on a panel, and more. With attention tracking (including blinking, pupil dilation and other eye movements) plus additional biosensors (skin, sweat, heart rate, etc.), proficiency testing becomes possible: Trainers can determine how alert and engaged a user is and how he or she performs under stress, and gain insight into the design of the VR simulation itself.
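A hedged sketch of what such a check might look like in code: before crediting a procedure step, confirm the trainee actually fixated on the correct display (an "area of interest," or AOI) for a minimum dwell time. The AOI names, log format and threshold are illustrative, not Fortum's or Varjo's implementation.

```python
# Gaze-based training check: credit a procedure step only if the trainee
# actually looked at the required area of interest long enough.
# AOI names, log format and the 500 ms threshold are illustrative.
def step_verified(gaze_log, required_aoi, min_dwell_ms=500):
    """gaze_log: time-ordered (timestamp_ms, aoi_name) pairs from the
    simulation's per-frame gaze hit tests."""
    dwell, prev_t, prev_aoi = 0, None, None
    for t, aoi in gaze_log:
        if prev_aoi == required_aoi:          # attribute the elapsed interval
            dwell += t - prev_t               # to the AOI being looked at
        prev_t, prev_aoi = t, aoi
    return dwell >= min_dwell_ms

# Example: was the pressure readout actually checked before acting?
log = [(0, "wall"), (16, "panel_pressure"), (700, "panel_pressure"), (716, "manual_rack")]
print(step_verified(log, "panel_pressure"))   # True: ~0.7 s spent on the readout
```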
Consumer Research & Marketing
In an age of ad blockers and bots, traditional advertising metrics like impressions and clicks are unreliable. So, how do you measure the effectiveness of a campaign? Attention. Marketers are interested in attention metrics, data about gaze point and duration of view correlated with other information like timing and demographics.
There's actually a lot of research that goes into product placement, even in ordinary supermarkets. Traditionally, this involves in-store and online surveys and testing in mock stores, which geographically and logistically limit the depth of research. It's hard to get unprompted behavioral data from a survey, but with (mobile) VR you can research faster, cheaper and at a larger scale, quickly testing different scenarios and collecting rich data you'd otherwise miss. The addition of eye tracking turns consumer behavior into quantitative, objective data, and it's revolutionizing how brands do research.
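As an illustration (the product IDs and log format are made up, not any vendor's actual pipeline), turning raw gaze hits into attention metrics can be as simple as accumulating dwell time and first-look timing per product:

```python
# Turn raw gaze hits into shelf-research attention metrics: total dwell time
# and when each product was first looked at. Log format is illustrative.
from collections import defaultdict

def attention_metrics(gaze_log):
    """gaze_log: time-ordered (timestamp_ms, product_id or None) samples.
    Returns {product_id: {"dwell_ms": ..., "first_seen_ms": ...}}."""
    metrics = defaultdict(lambda: {"dwell_ms": 0, "first_seen_ms": None})
    prev_t, prev_p = None, None
    for t, product in gaze_log:
        if prev_p is not None:                     # credit elapsed time to the
            metrics[prev_p]["dwell_ms"] += t - prev_t  # previously fixated product
        if product is not None and metrics[product]["first_seen_ms"] is None:
            metrics[product]["first_seen_ms"] = t
        prev_t, prev_p = t, product
    return dict(metrics)
```

Ranking products by dwell time or first-look timing gives the kind of comparison a shelf-placement study is after.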
Exemplifying this in a great case study at EWTS 2019, Accenture, Qualcomm and Cognitive3D talked about working with Kellogg's to determine shelf placement of a new product using VR headsets and eye tracking. As the team attested, the combination allowed them to map consumer behavior to specific products, looking through shoppers' eyes as they perused the shelves and chose items. Expectations were overturned: Eye-level placement had been considered prime real estate, and higher shelves were where consumers expected to find new products, but it turned out that placing Kellogg's new product on a lower shelf directed attention to surrounding products, increasing the brand's total sales by 18%.
Eye tracking also came up in other event sessions: GlaxoSmithKline has been looking at extended reality and eye tracking to conduct large-scale A/B testing of promotional displays and store layouts. Lockheed Martin's Shelley Peterson expressed interest in using the technology plus EEG to test employees and accurately determine intent: Is the worker certain in her action or is she guessing? Volvo, too, is using VR and eye tracking for user experience (UX) studies (e.g., testing a new interior), and ZeroLight partnered with BMW to use the VIVE Pro Eye in a virtual experience in which customers could configure, customize and explore a car (also UX).
Security, Safety and Communication
Eye tracking in VR training can provide insights into proficiency and task confidence, both important to safety. As mentioned briefly, it might also add a layer of authentication and security: Retinal scanning, for example, to turn on a VR (or mixed reality) device and "eyes only" access to information. Moving away from VR for a minute, incorporating eye tracking in augmented reality glasses could help detect drowsy or not-fully-alert workers before they get into an accident. In turn, the data could have implications for improving VR training sims (more effective training = safer employees).
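One simple way such a drowsiness signal could be derived from eye-openness data is a PERCLOS-style measure: the share of time the eyes are mostly closed over a rolling window. The thresholds below are illustrative, not validated safety limits.

```python
# Simple drowsiness signal in the spirit of PERCLOS: the fraction of recent
# frames in which the eyes are nearly closed. Thresholds are illustrative.
def perclos(openness_samples, closed_threshold=0.2):
    """openness_samples: per-frame eye openness values in [0, 1] over a
    rolling window. Returns the fraction of frames with nearly closed eyes."""
    if not openness_samples:
        return 0.0
    closed = sum(1 for o in openness_samples if o < closed_threshold)
    return closed / len(openness_samples)

# Example policy (hypothetical): flag a worker for a break if more than 15%
# of the last minute of samples registers as eyes-closed.
# if perclos(last_minute) > 0.15: alert_supervisor()
```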
In 2017, Tobii performed a study at a metal foundry, where the work requires intense concentration and focus. Workers pouring liquid metal into casting molds cannot afford a break in concentration, so wearable eye trackers were used to study the process up close via the eyes of experienced workers. The attention data helped identify behaviors intuitive to the best workers, which informed new training guidelines at the foundry. You can also imagine how eye tracking might be incorporated into VR training for the foundry’s melt department, indicating proficiency by comparing the user’s attention metrics to those of the best workers from the study. Occasional refreshers on adherence to proper procedures were also prescribed for experienced workers following the study, as familiarity with the process could sometimes reduce concentration.
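To sketch the comparison idea (the scoring method and AOI names are assumptions here, not Tobii's methodology), you could compare how a trainee's dwell time is distributed across areas of interest against the experts' average distribution:

```python
# Score a trainee against an expert attention profile by comparing each
# person's share of dwell time per area of interest. The 0-1 score is an
# illustrative choice (1 minus total variation distance), not a standard.
def attention_similarity(trainee_dwell, expert_dwell):
    """Both args: {aoi_name: dwell_ms}. Returns 1.0 for identical attention
    distributions, approaching 0.0 as they diverge."""
    def normalize(d):
        total = sum(d.values()) or 1
        return {k: v / total for k, v in d.items()}
    a, b = normalize(trainee_dwell), normalize(expert_dwell)
    keys = set(a) | set(b)
    return 1.0 - 0.5 * sum(abs(a.get(k, 0.0) - b.get(k, 0.0)) for k in keys)

# Example with hypothetical foundry AOIs:
trainee = {"ladle": 4000, "mold": 1500, "floor": 2500}
experts = {"ladle": 5000, "mold": 2500, "floor": 500}
print(round(attention_similarity(trainee, experts), 2))
```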
Social VR
Finally, eye tracking enables nonverbal communication in social VR experiences, which in enterprise include virtual collaboration and meeting spaces. An estimated 80% of human communication is not about what we say but rather what we don't say: Facial expressions, body language, gestures and eye contact can speak louder than words, indicating someone's intent, confidence and more. This – the lack of nonverbal cues in VR – is why it's still more meaningful to meet face-to-face as opposed to avatar-to-avatar in a virtual environment. Eye and body tracking change this, enabling more lifelike nonverbal communication and interactions in virtual collaboration scenarios. By mimicking the user's eye movements in real time, avatars become more expressive – more akin to the real people they represent – and users gain the ability to read others' faces and emotions, a key business skill.
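At a basic level, driving an avatar's eyes from tracked gaze comes down to converting the gaze direction into yaw and pitch angles for the eye bones. The coordinate convention and clamping limits below are assumptions of this sketch, not a specific platform's avatar API.

```python
# Sketch of mapping tracked gaze onto an avatar's eyes: convert the gaze
# direction into yaw/pitch and clamp to a plausible eye-rotation range.
# Assumes +z is forward and +y is up in head space; blinks handled separately.
import math

def gaze_to_eye_angles(gaze_dir, max_yaw_deg=35.0, max_pitch_deg=25.0):
    """gaze_dir: (x, y, z) unit vector in head space. Returns (yaw, pitch)
    in degrees to apply to the avatar's eye bones."""
    x, y, z = gaze_dir
    yaw = math.degrees(math.atan2(x, z))                       # left/right
    pitch = math.degrees(math.asin(max(-1.0, min(1.0, y))))    # up/down
    yaw = max(-max_yaw_deg, min(max_yaw_deg, yaw))
    pitch = max(-max_pitch_deg, min(max_pitch_deg, pitch))
    return yaw, pitch
```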
Conclusion
Eye tracking in VR is unlocking capabilities we've never had before in enterprise, taking use cases to the next level and producing the data innovators need to validate their business cases. We're learning that it's not only about where you look but for how long and in what order, the associated behaviors you exhibit, and the additional biometric data streams that uncover even richer insights into customers and employees alike. What additional applications are there and, perhaps more importantly, what are the privacy implications? Just how sensitive is my eye tracking data, and who does it belong to?