XR & Enterprise Privacy & Security: Is the Metaverse a Giant Data Grab?

Written by

Emily Friedman

August 12, 2024

With trust in technology companies at an all-time low and seemingly every Big Tech company working on an XR headset to compete with Apple Vision Pro, it’s a good time to consider the privacy and security ramifications of extended reality in enterprise.  

Introduction 

Every new technology introduces new risks to an organization. Any connected device can become a source of real-time data breaches, phishing scams, or invasions of privacy. What makes XR different, and potentially scarier, are the sophisticated sensors and the very nature of the tech: XR devices track highly personal information in order to alter the user’s perception of reality, either by augmenting (adding to) or essentially replacing one’s field of view.

Privacy and security experts have expressed concerns about the kinds of company and personal data these devices collect, how that data is used, shared and stored, how to protect IP and personal identity in virtual worlds, how to safeguard augmented spaces, and more. 

Recently, their worst fears proved possible when researchers at the University of Chicago managed to “crack” a Meta Quest headset. Not only did they hijack users’ headsets and steal sensitive information, but they were even able to use generative AI to manipulate social interactions.

The Attack

Essentially, the researchers found and exploited a security vulnerability in “developer mode” to clone a user’s home screen and control what the user sees. Once “inside” the headset, they could see and record the user’s voice, gestures, keystrokes, and browsing activity, and manipulate the display. In this way, an attacker could watch a user enter banking login credentials and change the amount transferred in an online payment without the user realizing it.

Moreover, it was possible to change the content of the user’s messages to, for instance, trick others into clicking on suspicious links. And if that doesn’t scare you, the addition of GenAI would allow bad actors to instantly clone your voice and generate deep fakes of you in virtual social spaces.

Sensitive Information

Apple Vision Pro contains a whopping 23 sensors, including an array of cameras (four dedicated to eye tracking alone), microphones, and body/motion trackers. But it’s not the number of sensors that’s concerning; it’s what they can reveal about you.

XR technology doesn’t work without continuously tracking and gathering vast amounts of personal and especially bodily data. That includes head, hand and eye movements, as well as spatial mapping data about your physical surroundings. Where does this data go and what useful insights can it offer to companies and bad actors?

Compiled over time, bodily and location data from VR can reveal behavioral and psychological information about an individual, including one’s physical and mental state, that can be used to micro-target individuals for commercial gain or worse. 

Considering 95% of decision making happens in the subconscious, eye tracking quite literally gives companies a look inside your mind. Eye movement and pupil dilation together can reveal a user’s emotional reaction to specific content, enabling psychographic customer profiling. GPS and spatial data could be used to track a user’s location and daily behaviors, while the cameras and microphones within XR devices can recognize faces, voices, your surroundings, and even your emotions.

Then, there’s the lack of anonymity. According to a 2023 UC Berkeley study, very little physical data is needed to reveal your identity: two seconds of hand and head motion can be enough. Movement is, in fact, a unique identifier, revealing characteristics like your dominant hand, height, gender, and more.
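To make the re-identification risk concrete, here is a toy sketch (the features, numbers, and user profiles are all invented for illustration, not drawn from the study) of how a short motion trace can be matched to an enrolled user with nothing more than summary statistics and a nearest-neighbor comparison:

```python
import math

def motion_features(samples):
    """Reduce a list of (head_height_m, hand_speed_mps) samples to summary stats."""
    heights = [h for h, _ in samples]
    speeds = [s for _, s in samples]
    n = len(samples)
    return (sum(heights) / n, max(speeds), sum(speeds) / n)

def distance(a, b):
    """Euclidean distance between two feature tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(sample, enrolled):
    """Return the enrolled user whose stored motion profile is closest."""
    feats = motion_features(sample)
    return min(enrolled, key=lambda name: distance(feats, enrolled[name]))

# Two "enrolled" users with distinct stature and movement profiles
enrolled = {
    "alice": (1.55, 1.2, 0.6),   # shorter, slower hand motion
    "bob":   (1.85, 2.5, 1.4),   # taller, faster hand motion
}

# A couple of seconds of capture from an "anonymous" wearer
capture = [(1.84, 1.3), (1.86, 2.4), (1.85, 1.5)]
print(identify(capture, enrolled))  # matches "bob"
```

Real attacks use far richer features and machine-learned models, but the principle is the same: motion statistics act as a fingerprint.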

Spatial and Even More Data

While concerns about mass surveillance aren’t new, what happens when the headsets we bring into our homes and businesses can see our physical surroundings? In addition to bodily data, XR devices rely on external-facing cameras and sensors to understand and place virtual content in the user’s environment. It’s called spatial mapping, and it means Meta and third-party apps could know the precise layout of your bedroom or office.

Look around the room; what does your space reveal about you or your work? Is it possible to hack into an XR headset’s camera feed to look around a company’s headquarters? Could spatial data be used to map or recreate secure spaces like a military base? Might companies use students’ spatial data to place virtual ads in school bathrooms? 

And companies plan to collect even more data from XR: In February 2024, Meta notified Quest users of its plans to begin collecting additional data from their headsets, including avatar “lip and face movement” and “abstracted hand and body data.” Apple recently filed patents for Vision Pro sensors that would track hydration, signs of intoxication, breathing rate, and more. Some are worried about the possibility of brain scanning tech in future generations of XR headsets! 

What are the implications of collecting all this “additional data”? How will the bodily and spatial data fundamental to XR technology undermine individual and corporate privacy and security?

In sum, the very nature of XR allows a previously unimaginable degree of monitoring, guaranteeing troves of user data that companies will undoubtedly mine and pay for.

On the other hand, eye tracking in VR improves the user experience and can even be used for secure biometric authentication. Alongside AI, eye trackers and other sensors are helping to make XR more immersive and accurate, and give users greater control over virtual content. If the data must be collected, does it have to be stored? If it must be stored, then for how long? How do we balance the good and bad of XR?

Unanticipated Risks, Avatars & Digital Twins

History has shown that user data is often accessed in ways the user didn’t intend. Inevitably, (tech) companies and hackers will find ways to use data that may not seem significant or enlightening today to influence or manipulate people in the future.

Consider photorealistic avatars and enterprise digital twins. The cameras and sensors that enable your avatar to mimic your real expressions and mannerisms could theoretically provide everything one would need to pose as you in virtual spaces, even steal your biometric details. Deep fakes in the virtual world are a whole different ball game. 

And while digital twins are hailed as critical tools for optimizing operations, improving sustainability, boosting innovation, and more, they also present significant security risks when connected to or sharing data with real-world (physical) assets or products. Of course, the amount of risk depends on the level of detail and the specific application: a mesh model, for instance, is not precise enough to steal a design or reverse engineer a critical part. Digital twins used for sensitive operational decisions require greater security than those used for, say, rapid prototyping. Organizations will have to maintain a delicate balance between access and security over the lifecycle of every digital twin.

But even the leak of the real-time behavior or state of a product or system could put an enterprise at risk, revealing insights about equipment performance and other aspects of your operations to competitors. Bad actors could potentially hack digital twins to remotely cripple parts of your business. That the end user of a digital twin is often not its creator (for example, an operator using the digital twin of a machine built by its manufacturer), and that digital twins may be shared with partners along the value chain, adds another layer of risk.

Current Measures and Best Practices

Apple has detailed efforts to protect Vision Pro user data, including on-device processing of eye tracking data and data minimization (limiting data access to just what’s necessary for an app to function). Gesture control, however, means that any button a user selects within an app while wearing Vision Pro reveals where they are looking anyway. Device makers may say they won’t share user data with third parties, but there are always caveats and loopholes in the fine print.
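Apple’s actual implementation isn’t public, but the general data-minimization pattern can be sketched (the function, button names, and coordinates here are hypothetical): raw gaze samples are reduced on device to a single coarse UI event, and only that event, not the gaze stream, is exposed to the app.

```python
def minimize_gaze(raw_gaze, buttons):
    """On-device reduction: map raw gaze coordinates to one coarse UI event.
    The raw samples are discarded; only the selection leaves the device."""
    last_x, last_y = raw_gaze[-1]  # use only the final fixation point
    for name, (x0, y0, x1, y1) in buttons.items():
        if x0 <= last_x <= x1 and y0 <= last_y <= y1:
            return {"event": "select", "target": name}  # coarse event only
    return {"event": "none"}

# Hypothetical on-screen buttons as (x0, y0, x1, y1) rectangles
buttons = {"pay": (100, 100, 200, 140), "cancel": (220, 100, 320, 140)}

# Raw gaze samples never leave the device; the app sees only the result
raw_gaze = [(50, 50), (130, 90), (150, 120)]
print(minimize_gaze(raw_gaze, buttons))  # {'event': 'select', 'target': 'pay'}
```

Note how this also illustrates the article’s caveat: even a minimized event still reveals where the user was looking at the moment of selection.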

Meta, for its part, is “constantly working” with researchers to expose vulnerabilities, but in general, security research has lagged behind XR product development. The XR industry’s approach to security, as is often the case with new technologies, has largely been reactive. Consider Niantic: The company developed a way for property owners to opt out of participating in Pokémon GO in direct response to lawsuits alleging nuisance and trespass. What other environments or spaces may need to exclude XR in the name of privacy and security? 

Conclusion

We tend to be more willing to compromise on privacy and security in the early stages of new tech, when we’re most excited about what the technology can do. Security, of course, begins with headset manufacturers and security standards built into the devices themselves. XR developers, too, need to employ secure design practices, and organizations need to update their policies to account for the new risks XR poses in the workplace.

That means establishing data governance, implementing multi-factor authentication and granular permission controls, encrypting or obfuscating data, and more. Companies must be transparent about how they’ll use XR-derived user data and, ideally, erase all bodily/spatial data as soon as it’s not needed for a product or app to function. In general, privacy policies should include language to prevent the use of body-based data for user profiling. Governments may pass legislation to improve notice and consent standards and bar companies from collecting, storing, or sharing our intimate data.
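As one illustration of what “obfuscating data” can mean in practice, here is a minimal sketch (the field names and noise scale are illustrative, not a vetted privacy mechanism) that perturbs numeric body-based fields before a record is stored or shared, trading a little accuracy for privacy:

```python
import random

def obfuscate(record, noise_scale=0.05, seed=None):
    """Add bounded random noise to numeric body-based fields so exact
    measurements never leave the device verbatim."""
    rng = random.Random(seed)
    noisy = {}
    for key, value in record.items():
        if isinstance(value, (int, float)) and not isinstance(value, bool):
            # perturb by up to +/- noise_scale (5% by default) of the value
            noisy[key] = round(value + rng.uniform(-noise_scale, noise_scale) * value, 4)
        else:
            noisy[key] = value  # non-numeric fields pass through unchanged
    return noisy

# Hypothetical body-based record an XR app might otherwise upload as-is
record = {"user": "anon-42", "height_m": 1.85, "gaze_dwell_s": 2.4}
print(obfuscate(record, seed=7))
```

Production systems would use a formal mechanism such as differential privacy rather than ad hoc noise, but the design choice is the same: degrade precision at the source so downstream profiling gets less to work with.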

Finally, user choice is critical. Ideally, users should be able to request access to their data, opt out of sharing unnecessary information with tech companies and employers, and control augmented content in their personal/private space. This will require novel features, flexible controls, and expansive policies accounting for worst-case and dystopian uses of XR, the sooner the better. 

Image source: Vecteezy

Further Reading