Turn Up the Volume: Sound AR and 3D Audio

Why enterprises using augmented and virtual reality need to listen closely to noise in the workplace and the role of sound in immersive experiences


Noise. Or, in more technical terms, audio.

I was recently gifted a pair of the new Apple AirPods Pro. While I’ve had noise-cancelling headphones before, the AirPods Pro offer something new: active noise cancelling and a transparency mode that uses built-in microphones to allow outside sounds in when I want or need them. Now, I’m not an audiophile and this isn’t an Apple ad, but I am very sensitive to sound when I’m working, particularly the sounds of other people in the office. I’m at my most productive while listening to music but as soon as a colleague takes a phone call, my concentration breaks. (Hell is other people, am I right?)

Source: Apple

I’ve been using the AirPods’ noise cancelling feature every day at work, which got me thinking about noise in the workplace and the role of sound in immersive experiences. This is a topic that came up at EWTS 2019, with multiple speakers touching on hearables as well as haptics and voice. As virtual reality headsets improve and major tech companies develop tools that make it easier to create 3D content, the absence of our other senses in XR becomes more and more obvious.

Big, bad noise

My research took me in a number of directions, but let’s start with Millennials in the office. The idea that Millennials have short attention spans is a stereotype; in fact, younger generations are getting more selective about what they pay attention to.1 Today, most people – not just Millennials – use headphones for their own ‘attention management.’ According to Bitkom Research, nearly 50% of people wear headphones to mute their surroundings while 20% do so to focus on work. Millennials, however, are especially concerned about rising decibels and more likely to wear headphones to drown everything out.2 While the government caps safe noise levels around 90 decibels, Cornell University reports that office workers are most comfortable between 48 and 52 dB. To put that in perspective, casual chatting is around 60 decibels.

Why does any of this matter? In the age of digital transformation, organizations are seeking to augment and upskill the workforce. There are now four generations in the workplace, but when it comes to recruiting and training the next generation of workers, the focus is naturally on Millennials and Gen Z. Traditional notions about how people best collaborate and what makes a productive work environment are proving false for younger age groups. Workstyles are changing: AR/VR training and collaboration are on the rise in industry, with numerous studies and use cases showing the benefits of immersive technologies over traditional teaching methods and work tools. Millennials want quiet spaces and employees are bringing their own noise-cancelling earbuds to the office because sound can, it turns out, significantly impact employee performance and satisfaction. And then there’s occupational hearing loss, one of the most common work-related illnesses resulting in over $240 million in workers’ compensation every year.3 So, where are the next-gen acoustic solutions for enterprise?

The Sound of Work

In many industries, as you can imagine, employees work in loud environments on a daily basis. OSHA set legal limits on noise exposure in the workplace in the early Eighties, yet the government acknowledges that these aren’t low enough to protect workers from hearing loss. It’s estimated that 22 million workers are exposed to hazardous levels of noise on the job every year. By industry, that’s 46% of manufacturing, 51% of construction, and 61% of mining, oil and gas workers.4 Though employers are required to take steps to protect workers’ hearing, including providing hearing protection and training, investing in low-noise equipment and adjusting work shifts, some employers don’t and 25% of workers just don’t like wearing earplugs or earmuffs.5

The thing about hearing loss is that it’s irreversible, and we now know a lot more about the health effects of noise than we did when exposure limits were first set to 90 dBA over an 8-hour workday. Experts agree that regular exposure to anything over 75 decibels is enough to cause long-term hearing damage. (For reference, most equipment operates above 100 dB and all of the following are above 75: Bulldozer, crane, hydraulic breaker, electric drill, jet takeoff on a tarmac, riveting machine, and newspaper press. Also, keep in mind that a small increase in decibels represents a significant rise in noise and potential damage.) As for health effects, noise has been linked to heart disease, high blood pressure, sleep disturbance, and even adverse birth outcomes.6
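That last point about decibels is worth making concrete, because the scale is logarithmic rather than linear. A quick sketch of the standard conversion (every 10 dB increase means a tenfold jump in sound intensity):

```python
def intensity_ratio(db_increase):
    """Sound intensity multiplies by 10^(dB/10)."""
    return 10 ** (db_increase / 10)

# +3 dB roughly doubles sound intensity; +10 dB is a tenfold jump
print(intensity_ratio(3))    # ~2.0
print(intensity_ratio(10))   # 10.0

# The gap between the 75 dB damage threshold and the 90 dBA
# legal limit is a ~32x difference in sound intensity
print(intensity_ratio(90 - 75))  # ~31.6
```

This is why "only 15 decibels over the limit" understates the risk: the ear is absorbing roughly thirty times the energy.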

The truth is we don’t fully understand the impact of noise on workers. Scientists are realizing that sounds much lower than 75 decibels (and over shorter periods of time) contribute to long-term hearing loss and something called ‘hidden hearing loss’—a permanent reduction in neural response (loud noise kills ear nerve endings). I won’t go into the ecological devastation wreaked by human-created noises, but for work safety purposes, noise-induced hearing loss also reduces one’s ability to hear high frequency sounds like alarms and understand speech, impairing communication. If you’re wondering why a worker would neglect to wear hearing protection, current devices (foam and silicone earplugs, earmuffs, etc.) leave a lot to be desired. One of their major shortcomings is the need to remove them in order to communicate with others. That’s where active noise cancelling comes in.

The ear is where it’s at

Hearables are progressing rapidly thanks to the convergence of low-power components, smaller AI processors, advanced microphone arrays, and more. Touch and gesture controls are improving, and many hearables now have integrated voice assistants, can do real-time language translation, are controlled via smartphone, etc. Sensear, for one, makes intrinsically safe in-ear earplugs and headsets for industrial users. In addition to noise cancelling, the company’s SENS technology isolates and enhances speech, so the wearer remains aware of her surroundings while still being protected from harmful background noise. Every ear is unique and each person hears differently, so it makes sense that hearables are becoming more customizable, but there is plenty of room for innovation, increased connectivity and additional capabilities, bringing this familiar wearable form factor into offices and factories alike. Besides intelligent noise control, earbuds/plugs could be equipped with any number of sensors that track location, movement, biometrics and even emotion. (The ear is actually a good spot for monitoring pulse and electrical brain activity.) Imagine a hearable that cancels out certain sounds to reduce stress or an in-ear device allowing you to interact with artificial colleagues like AGVs and collaborative robots—the hearable category has great potential beyond noise cancelling, active or not.

Audio AR

The “last piece is audio,” as Raytheon’s Kendall Loomis said at EWTS 2019. At this year’s EWTS, Boeing, Con Edison and ExxonMobil all brought up hearing protection, but ‘audio wearables’ can also augment workers to help them work faster and smarter, with sound and/or auditory information changing based on the user’s context. At UPS, for example, audio wearables tell workers what to do, no training required. This is essentially audio AR, overlaying auditory information onto the user’s environment, and it can be as simple as enabling a worker to hear a pre-recorded guide – one of your expert employees talking through a process – while on the job. AR isn’t purely a visual play; audio can be modified in real time based on tracked data, location or task and deliver safety, machine and other job-critical information from sensors or an ERP right into the worker’s ear.

Not surprisingly, Bose is taking a sound-first approach to AR with Bose AR-enabled products and apps that deliver tailored audio content to users’ ears rather than to a small screen. Bose Labs exhibited at EWTS 2019 to see how its sound technology might assist industrial workers and to meet enterprise AR hardware makers. See, immersive technologies are so effective for training because they’re experiential. Sound is a major part of the human experience, and the combination of visual and audio AR could be twice as powerful for the workforce.

3D Audio

Audio AR adds an extra layer of information, but what about sound in the virtual world? If VR is going to completely replace traditional enterprise training (and probably design) programs in the future, then it had better be as close to the real thing as possible. Since sound helps us orient ourselves in space (auditory spatial awareness), truly realistic VR must be able to simulate sound localization.*

*Occupational hearing loss negatively impacts spatial hearing, as well.

In case you’re wondering, surround sound is not the same as 3D audio: in surround sound, the listener and sound sources are, for the most part, in fixed positions. 3D audio, on the other hand, is full-sphere surround sound that moves along with the listener in the physical environment. In a 3D soundscape, the user can understand where he is relative to the noises around him, so he could, for instance, sense something happening behind him. For binaural recording, sound engineers typically use a dummy head, placing microphones in the dummy’s ears, but the listener is again in a fixed position. Another approach adds speakers at different heights.

The complexity lies in the fact that your aural experience changes as you move through the world. Ambisonic mixing for virtual spaces requires sound that adjusts according to where the user and sound sources are in the VR world: both the user’s position and the sound sources themselves (objects) are in motion. Today’s VR audio tech uses specialized recording systems and algorithms to mimic lifelike sound. There are some ambisonic microphones on the market, as well as custom rigs of omni- and bi-directional mics, and a few audio companies have released encoding formats supporting 3D audio, but lifelike audio VR remains a few years away.
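To make sound localization a little more concrete: one of the main cues the brain uses is the interaural time difference (ITD), the tiny delay between a sound reaching the near ear and the far ear. A minimal sketch using Woodworth’s spherical-head approximation (the head radius and the model itself are simplifying assumptions for illustration, not how any particular VR engine renders audio):

```python
import math

HEAD_RADIUS = 0.0875    # meters, average adult head
SPEED_OF_SOUND = 343.0  # m/s in air

def interaural_time_difference(azimuth_deg):
    """Woodworth's spherical-head model: the extra time sound takes
    to reach the far ear for a source at the given azimuth
    (0 = straight ahead, 90 = directly to one side)."""
    theta = math.radians(azimuth_deg)
    return (HEAD_RADIUS / SPEED_OF_SOUND) * (theta + math.sin(theta))

# A source straight ahead arrives at both ears simultaneously;
# a source directly to the side reaches the far ear ~0.66 ms late,
# which is enough for the brain (or a renderer) to place it.
print(interaural_time_difference(0))   # 0.0
print(interaural_time_difference(90))  # ~0.00066 s
```

A real spatial-audio pipeline layers this cue with interaural level differences and per-ear HRTF filtering, but the principle is the same: the renderer recomputes these cues every frame as the listener and sources move.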

Beyond Sound

We’ve been using audio wearables for decades: There was Panasonic’s AM radio designed to be worn as a bracelet in the early Seventies and the Sony Walkman, which turned 40 in July. It’s now a habit to carry sound around with us, on our bodies, and yet sound is a secondary consideration in AR/VR. From augmenting one’s hearing against occupational hearing loss to enabling a technician in training to hear exactly how a healthy engine should sound and simulating all the sounds of assembly before a worker ever hits the plant floor, getting sound “right” in extended reality will only amplify the technology’s effectiveness in enterprise. And who knows, the addition of touch and smell might eliminate traditional job training forevermore.


  1. Prezi
  2. Oxford Economics
  3. CDC
  4. NIOSH
  5. Hearableworld
  6. The New Yorker

Main image source: Auditoryprotection.com


Enterprise Wearable Technology Summit 2020

The 7th Enterprise Wearable Technology Summit (EWTS) is October 20-22, 2020 in San Diego! Join hundreds of Fortune 1000s to try out the latest in wearable tech, including AR/VR/MR, body-worn devices, and even exoskeletons, and to learn how today’s biggest companies are profiting from and scaling the technology. More details, including the program and the largest expo of industrial AR/VR/wearable tech, to come on the EWTS 2020 website.

Using XR to See Underground: Interview with Arcadis’ Allison Yanites

Before EWTS 2019 went down last month, I had the chance to interview one of the event’s thought leaders. Check out our interview with Allison Yanites, Immersive Technology Lead at Arcadis, the global natural and built asset consulting firm.

Emily (BrainXchange): To begin, could you provide our readers with a little background on yourself and what you do at Arcadis? Also, when did you first encounter AR/VR?

Allison: I am the Immersive Technology Lead at Arcadis North America. I am currently working to find different ways that augmented reality, virtual reality and other related technologies can improve customer experience, health and safety, and quality of life. Before this role at Arcadis, I worked as a geologist on environmental remediation projects: understanding subsurface conditions such as layers of soil and rock, whether any groundwater or soil contamination is present, and whether impacts are static or still moving below ground. A big piece of that work was creating 3D visualizations of subsurface data to help our clients and stakeholders better understand the full picture of what is happening below ground and help determine the next steps to clean up any contamination.

A few years ago, our team developed a mixed reality visualization of one of these environmental sites, where our stakeholders could see and interact with a holographic image of the groundwater contamination of the site. That was my first real experience with immersive technology as an industry application, and it was a gamechanger for me. Working with our digital team at Arcadis, I wanted to look beyond just holographic visualizations of environmental models and see how much we can do with AR/VR across all of the types of programs Arcadis is involved with, how we can use immersive and 360 technology for design, engineering, project and management services across all markets.

E: So, you really start at the beginning of a project, with touring a site? 

A: It depends. On some projects, a lot of data has already been collected, such as sites that have been monitored for decades; on other projects we are collecting data in an area for the first time. Either way, we are taking a large collection of data and trying to understand the complex geological and chemical patterns underground, and ultimately, determine the best ways to remove chemical impacts at the site.

E: Can you speak a little more about Arcadis’ business and its customers (who they are)?

A: Arcadis is a natural and built asset consulting firm. We work in partnership with our clients to deliver sustainable outcomes throughout the lifecycle of their natural and built assets. We have 27,000 people in over 70 countries with expertise in design, consulting, engineering, project and management services, and we work with a wide range of markets and industries, including oil and gas, manufacturing, transportation, infrastructure and municipal water.

At Arcadis, our mission is to improve quality of life in the communities we serve. Whether that is by ensuring the environmental health of communities or reducing the amount of time people spend in traffic, we develop our solutions with our client’s end-users in mind. To design the most impactful solutions, Arcadis has committed to digitally transforming our business at every level of our organization. That includes training our staff on new digital capabilities, using cutting-edge technologies and then applying our subject matter expertise. We then use these tools and skills to better understand, and address, our client’s needs.

E: How is Arcadis using augmented and virtual reality? What pain points does AR/VR address?

A: Arcadis is using augmented and virtual reality in different ways across a variety of projects. Our immersive technology practice includes on-site visualization with different types of headsets, 360-degree photos, video and virtual tours, and remote assistance with AR capabilities. Generally, immersive technology is addressing four main pain points. The first is increased safety — for example, we can share access to difficult-to-reach sites with 360-degree imagery or livestream video, and bring additional staff or clients to the site virtually. Ultimately, we must keep people safe while still collecting as much data as possible. The second is speed of decision making — for example, using AR to overlay a 3D design over an active construction site helps quickly identify any differences between the plan and the current project status. The third is cost reduction — for example, we can now virtually connect project teams and clients to remote sites. This reduces travel and helps reduce the costs associated with delayed communication or unplanned rework. And the fourth is enhancing stakeholder communication and collaboration — for example, virtual 360-degree site tours and remote assistance are virtually bringing staff, clients, and stakeholders to the site where they can participate in discussions about site conditions or questions on certain issues. AR/VR visualizations also greatly improve our communication of design plans or subsurface data visualization.

E: I imagine there are a lot of new demands for the built environment, especially with climate change. Do you think that AR/VR are unleashing more creativity, enabling designers to do things they’ve never done before?

A: Absolutely. There is a lot of power in using AR/VR to understand how the environment is changing, and how to prepare communities and businesses accordingly. AR and VR visualizations can communicate designs to stakeholders that address sustainability needs or flood and storm resilience. AR/VR technology also gives designers the flexibility to share their designs with stakeholders more clearly and effectively, with a greater level of detail, than ever before. When you use AR/VR to see first-hand how a flood level impacts homes and businesses, it takes on a greater urgency than it may have before. We are also using AR/VR technology for training situations, and many training scenarios are relevant to our changing environment and being prepared for the future.

E: How have customers received the technology? Was it easy for them to use? Have any clients adopted AR/VR for themselves?

A: We have had success applying immersive technology services, and it’s exciting to see this technology expand and scale in our industry. At the same time, we are continually working to apply the right technologies for the right projects, and find new ways to solve problems for clients. These technologies are a moving target; they evolve so quickly. It seems like every few weeks there is a new product, software/hardware capability, or integration that opens new opportunities for how AR/VR can be applied. In addition to gaining traction and adoption with the services and capabilities we have established, we are constantly evaluating how we can solve emerging client challenges with new and immersive technology.

E: What was piloting like? Was there an actual test period and were there any major challenges to using the technology at Arcadis?

A: Several years ago, we started with a few different pilots and tested different AR glasses, VR headsets, 360-degree cameras and various software programs to develop content. Each of the solutions or services that we have explored has been rigorously tested, and if appropriate is then developed internally or in partnership with our clients. We are still doing pilots because the space is evolving. With one particular workflow there might be an update in either the hardware or the software that offers a new opportunity, so we’ll go in and test that. The pilots are really tied to the problems we can solve and the solutions we can bring to our clients, working with them to customize what we do with these different tools.

E: Where does the content come from?

A: So far, we have developed everything on our own. We use plugins and software to create content, but the content is coming from our own project locations and 3D designs, like wastewater treatment plant designs, environmental remediation sites or highway infrastructure designs. We already work in those spaces so we have the data sets, which we can use to create the AR/VR visualizations. Through our FieldNow™ program, we have also committed to collecting data 100 percent digitally, which means we can now apply this technology to more projects than ever before.

E: How do you measure the ROI of AR/VR at Arcadis?

A: ROI varies from project to project, but does generally come back to the four KPIs: Increased safety, speed of decision making, cost reduction, and enhanced stakeholder communication and collaboration.

E: How has Arcadis handled the security part of adoption?

A: Arcadis takes data security very seriously. Our group works with our IT department to thoroughly vet each technology against industry security standards. Additionally, our use of each of these technologies is also typically evaluated by our clients to make sure it is compliant with their security protocols. Security is always a leading factor in any new technology we adopt.

E: Are there any applications you’re hoping to test in the future at Arcadis?

A: We are constantly evaluating what we can do to exceed our client’s changing expectations. As new applications and technologies become more accessible, we want to make sure we are equipped to address both traditional and emerging client challenges.

Beyond finding new ways to integrate software platforms, we are starting to leverage the internet of things and wearable technologies more frequently. As a large company that is involved in many different industries, Arcadis uses a lot of different software programs. For each software program (3D design visualization, data analytics, program management system, etc.), we develop unique workflows to create AR/VR and 360 visualizations and/or integrate with a program management system. We are always looking for new software products or software updates that make it easier to integrate AR/VR into our daily routines.

E: With sensors in the environment and wearables, I assume you’re gathering new kinds of information for these models?

A: Absolutely. We are using sensor data, which provides real-time results that can be fed into our data analytics platforms and visualized in different ways. We are also excited about platforms that can house data and be updated in a seamless way, so a whole project team across the globe has access to one central data set.

E: What are your greatest hopes for this technology?

A: As immersive technology becomes more mainstream and awareness keeps spreading about its value for industry, it is exciting to see how many ways immersive technology is adopted and applied. This technology is still so new, I am excited to follow its evolution and see what will be possible in five, 10 and even 30 years. My hope is that as the technology starts to deliver more and more value to businesses, we also see increasingly creative ways to improve quality of life in communities around the world.

All of the News Out of EWTS 2019

The 2019 Enterprise Wearable Technology Summit (EWTS) took place last week in Dallas. While it was the largest EWTS yet – with over 1,000 attendees and 60+ exhibitors – the show still managed to retain its characteristic intimacy. 2019 was also the most diverse year on the EWTS expo floor, which showcased a variety of hardware and software including AR/VR devices, exoskeletons, haptic gloves, and training platforms. Past and longtime EWTS attendees caught up, first timers were exposed to the top industrial AR/VR and wearable solutions, and everyone took away something to fuel the next year of innovation at work.

While this event has always been about the end user, the immersive/wearable tech market has also grown here. This year, a number of exhibitors chose to announce new partnerships and launch products at EWTS. Here are all of the announcements that came out during the show:


ThirdEye launched its X2 mixed reality glasses, an enterprise AR headset retailing at $1,950. What sets the X2 apart is its weight of only 9.8 ounces, making it the lightest on the market. Designed for small, mid-sized and large-scale companies, the X2 is strictly for enterprise. (For comparison, HoloLens 2 costs $3,500 and the Magic Leap One $2,295.) Learn more


Iristick announced that its Iristick.Z1 smart safety glasses are now compatible with iOS smartphones (in addition to Android), which is great news for early enterprise adopters with a strict iOS company policy. Learn more


Qualcomm announced a new initiative designed to help XR companies accelerate the development of business solutions. The Qualcomm XR Enterprise Program – part of the company’s broader Qualcomm Advantage Network – connects XR headsets based on the Qualcomm Snapdragon XR Platform together with enterprise solution providers in a broad array of industries. Learn more


Qualcomm, Pico and ZeroLight announced and showcased a new, fully wireless PC concept called Boundless XR, a precursor to Boundless XR over 5G—an untethered walking VR experience that will enable users to configure and explore a range of Cadillac vehicles in high definition thanks to ZeroLight, without the need for external sensors.

Using a Pico prototype headset, Qualcomm replicated the high bandwidth and low latency of 5G at its booth by rendering on a PC and streaming directly via a local 60-GHz wireless connection. The 5G version will move from local hardware to 5G Mobile Edge Compute (MEC). Learn more


The company launched a new look, new site and new product at EWTS 2019. “EWTS is where the seed of the idea was planted by innovative companies saying, ‘This is the solution we need.’ It’s the perfect place to share our new SaaS product with the XR industry.” Learn more


Jujotech introduced and demonstrated Fusion Inspect, “the first market solution for smart headsets with rich media customizable reports and fully integrated with remote assist (Fusion Remote).” Fusion Inspect provides hands-free inspection and reporting to improve AEC and Telco productivity. Learn more


TeamViewer announced the integration of TeamViewer Pilot remote connectivity platform with RealWear, Vuzix and Epson smart glasses. Learn more


If Magic Leap had to choose one event to establish itself in enterprise, EWTS was the right choice. At the same time, Magic Leap announced Concepts, “free apps with limited functionality meant to garner feedback, experimentation and support from the broader [ML] community.” One new concept already released is the “Wall Street Journal Stock Data Concept” from The Dow Jones Innovation Lab. Learn more


The company announced the new BeBop Sensors Forte Data Glove Enterprise Edition, a comfortable, one-size-fits-all, wireless VR/AR haptic glove built for business. The glove provides real-time haptic feedback allowing users to “feel” textures and surfaces and move around digital objects. BeBop also won a U.S. Air Force contract. Learn more


Epson’s see-what-I-see remote assistance solution Moverio Assist is now commercially available at a low cost. Using the Moverio smart glasses, wearers can view high-quality instructions, photos, PDFs and videos while communicating with remote personnel in real time. Learn more


*Image source: J&J Studio

Education, not Automation, is the Problem: 21st-century Job Training with XR

Many people fear the day when drones, robots and self-driving cars will replace human workers. This is understandable, and it’s not only delivery drivers who have reason to worry—computer algorithms (artificial intelligence) could potentially replace entire departments of human employees. Though many industries and professions are experiencing existential crises, the future will not be jobless. It will, however, be quite different, with new jobs and more employee turnover (in pace with advancing technologies) requiring humans to be able to quickly and effectively train and retrain for new roles.

Today’s workforce is aging. Simultaneously, current workers and new members of the workforce (millennials and soon Gen Z) are being forced to compete against cheaper labor around the world, against technology and automation, etc. in a rapidly changing global (and gig) economy. There isn’t a lack of jobs; in fact, as certain jobs are being automated, other positions requiring higher (and often more technological or advanced) skills are being created. Today, millions of jobs requiring a trained human touch are going unfilled because there aren’t enough workers equipped with the skills to fill them. The problem isn’t automation; it’s education. What we have is a training problem and the solution is extended reality. This is why some of the world’s biggest employers are going virtual to build the workforce they need now:


Walmart is the largest company in the world by revenue, with 3,500 Walmart Supercenters in the U.S. alone and 2.2 million employees worldwide. How does a company of Walmart’s size and global presence maintain quality training across its stores? Virtual Reality.

Walmart isn’t just testing VR for training. With the help of STRIVR, the retail giant has been implementing VR training, purchasing 17,000 Oculus Go headsets in 2018 to roll out a nationwide virtual reality (soft skills) training program. 10,000 Walmart employees are using the VR platform already, and it doesn’t seem like adoption is slowing down. By putting trainees into simulations of real-life situations, Walmart has been able to reduce the travel costs associated with traditional training facilities. The company is even applying VR to determine promotions, incorporating the tech into the interview process to help identify employees with managerial potential.

Hilton Hotels

In addition to using virtual reality to allow guests to preview rooms, the hospitality giant is turning to immersive technology to modernize training for its upper-level employees. Last year, Hilton worked with a third party (SweetRush) to film a 360-degree VR experience in a full-service Hilton Hotel. The simulation allowed corporate staff to experience a day in the life of a Hilton employee, the idea being to help them understand the physically challenging and complex tasks of day-to-day hotel operations. Instead of being flown in from across Hilton’s 14 brands (Hilton operates in 106 countries and territories), executives can put on a VR headset and experience what it’s like to clean a hotel room like a real member of the housekeeping staff.

In this case, Hilton wanted executives to get a sense of the complex demands made of the company’s staff at its 5,300 properties and to encourage empathy. Role playing is a key component of hospitality training; relying on a network of trainers to deliver bespoke training around the world, however, is expensive and doesn’t ensure consistent training across the Hilton brand. The company is planning to expand its use of VR training, including piloting a conflict resolution program designed to improve service recovery.

Preparing for danger

JLG Industries describes itself as a manufacturer of “mobile aerial work platforms.” If that doesn’t make your heart race, then I guess you don’t suffer from acrophobia. JLG designs, builds and sells equipment, including electric boom and scissor lifts used on construction sites worldwide. From a quick Google search, it’s evident that poor training on these machines or a mistake in assembly can lead to a lawsuit, so it’s not surprising that JLG is using VR to train operators of its boom lifts.

How does one safely learn to operate vehicles from platforms up to 185 feet off the ground and on giant arms? With JLG’s networked training program built by ForgeFx Simulations, multiple trainees in multiple locations can operate virtual boom lifts in the same virtual construction site without ever leaving the ground. JLG customers could also benefit from the program, which is much safer than training on a real machine and more efficient to boot.

In a similar use case, United Rentals, the world’s largest equipment rental company, said it would begin offering VR simulators this year through its United Academy. United began testing VR for training new hires at the end of 2016. Instead of lectures and pictures of construction sites, VR was able to transport them to the job site. Standing on the edge of the virtual job site, employees were given two minutes to observe the environment and identify any missing equipment. The user then had to make his or her pitch to the construction boss (an avatar). In these early tests, United was able to shorten its typically week-long training program by half.

More recently, it was reported that United Rentals is offering VR training to help its customers teach their own employees how to operate scissor lifts and other machines.

Six Flags

Six Flags, a global leader in the attraction industry, employs nearly 30,000 seasonal workers to move millions of people through its parks during the busiest times of the year. That means every year, Six Flags must train tens of thousands of people to work in admissions, retail, ride operations, and more. In 2015, Six Flags began seeking alternatives to traditional instructor-led training, which wasn’t adequately preparing temporary hires. Fearing that PowerPoint presentations and low-tech audio/visual approaches weren’t adding value, Six Flags injected tablet technology into training at two of its properties. The learning module paced training through discovery, introducing videos, a simulated tour experience, safety quizzes, and more using gamification. In post-pilot surveys, 89% of participants believed the tablets improved their understanding of the training material and 91% agreed that Six Flags needs more tech in its learning and development programs.

Further transitioning from instructor-led to more engaged training, Six Flags has since added AR and VR to the mix, creating a virtual park tour with guest hot spots that trainees can experience without physically leaving the classroom. You can imagine the VR tour is useful at Six Flags properties in colder climates or under expansion. The ultimate goal for the theme park giant is to increase engagement and improve retention by creating a more realistic job preview process in onboarding.

According to a new study by BAE Systems, 47% of young people (aged 16-24) believe their future job doesn’t exist yet. BAE also predicted that the top jobs in 2040 will involve virtual reality, artificial intelligence, and robotics. Are students learning the skills that will be in demand 20 years from now? Only 18% feel confident that they have those skills, while 74% believe they aren’t getting enough information about the kinds of jobs that will be available in the future.

Some of the world’s biggest companies are heavily investing in augmented and virtual reality training solutions. Not all have purchased headsets in the tens of thousands like Walmart, but companies that want to maintain a competitive edge are looking to immersive technologies. AR/VR – the ability to create any number of lifelike simulations without real danger or risk, to simulate any working environment or situation anytime, anywhere – is a game-changer not just for the organizations trying to bridge today’s skills gap but especially for young people anticipating the jobs of tomorrow that don’t yet exist.


Image source: VR Scout

2019: The Year of the Big Pivot Towards Enterprise AR/VR

It’s a shame that AR/VR was overhyped in 2018, because in 2019 the technology is becoming a fixture in the enterprise.
I’ll be blunt: Augmented, mixed and virtual reality were overhyped in 2018, and 2018 turned out not to be the year of AR/VR. So please don’t roll your eyes when I tell you that 2019 is the year, at least for the enterprise. Of that I have no doubt.
Here are a few of the signs:

  • More than half of the announcements made at AWE USA 2019 (a staple on the AR/VR calendar) were enterprise-related
  • Some of the world’s biggest consumer tech companies are now entering the immersive tech space, primarily eyeing enterprise
  • The top names in consumer VR are also heavily courting the enterprise

Why? Why are AR/VR hardware and software companies pivoting to enterprise? The answer is obvious: Enterprise is where the money is. Both AR/VR technology providers and the world’s best-known companies (the end users) are making – and saving – big money.

If you follow enterprise AR/VR, you’re no doubt familiar with Google (Glass), Microsoft, and PTC (Vuforia). Other longtime players include Atheer, Epson, HPE, LogistiVIEW, ScopeAR, RealWear, ThirdEye, Ubimax, Upskill and Vuzix. Qualcomm, Honeywell, and Toshiba (dynabook) have become fixtures on the scene, as well, and by that I mean regular exhibitors at EWTS, the only event dedicated to enterprise use of immersive and wearable technologies. Newer sponsors include Jujotech, Pico and RE’FLEKT, along with Bose, HTC and Lenovo, joining top enterprise wearable device and industrial exoskeleton makers on the EWTS roster.

Doesn’t Bose make headphones?
Yes, it does. Bose is known as a consumer audio vendor, but it also makes Bose Frames, which provided exclusive audio content and set-time notifications to desert-goers at this year’s Coachella music festival. Founded in 1964, Bose is taking an audio-first approach to augmented reality today with Bose AR, not only at concerts or in automobiles but in meeting rooms, too. Audio AR is a natural fit in the Industrial Internet of Things.

The pivot
In April 2019, Oculus introduced the expanded Oculus for Business, an enterprise solution designed to streamline and grow VR in the workplace. The expanded solution adds Oculus Quest to the hardware lineup and provides a suite of tools to help companies reshape the way they do business with VR.
The following month, Lenovo launched an enterprise AR/VR headset, the ThinkReality A6, immediately positioned as a rival to Microsoft’s HoloLens. Articles spoke of Lenovo as “just the latest manufacturer to develop an AR device aimed at enterprise.” On the heels of Lenovo’s first foray into enterprise XR, HTC announced the HTC Vive Focus Plus, a new version of its standalone Vive Focus headset that will only be made available to enterprise customers. Furthermore, HTC’s Vive X accelerator has been “pouring money” into enterprise VR startups.

The proof is in the toolbox
Digital transformation isn’t coming; it’s already underway at hundreds of companies, including household names like Ford, UPS, and Walmart.
Every year, enterprises take the stage at EWTS to share how they’re using wearable and immersive technologies. They share their experiences and best practices, their successes and failures, and then they return the following year. These “veteran speakers” are another sign of AR/VR’s secure position in the present and future of work: AGCO, Boeing, DHL, Lockheed Martin, and Porsche come back year after year to update peers from Bayer, BP, Caterpillar, Coca-Cola, Johnson & Johnson, and other Fortune 500 companies on the latest applications for the technology in their operations. EWTS speakers span industries and sectors: Airbus, BMW, Chevron, Colgate-Palmolive, Con Edison, Duke Energy, General Electric, Gensler, jetBlue, John Deere, Lowe’s, Molson Coors, Southwest Airlines, Thyssenkrupp, Toyota, United Technologies, etc. And new faces join every year; this year’s event will welcome AIG, Amazon, American Airlines, Bridgestone, Exelon, Holiday Inn, Philip Morris, Sanofi, Six Flags, and more to the stage. It’s a cycle: Attendees become users who become speakers, and the technology continues to advance.

Beyond pilots
Lockheed Martin has been a longtime advocate of AR/VR, benefitting so much from mixed reality that it’s now teaming up with Microsoft to sell mixed reality apps to other businesses in the airline and aviation industry. Deployment is scaling at BMW, too: The luxury auto manufacturer is providing all its U.S. dealerships (347 BMW centers and select MINI dealers) with Ubimax Frontline running on RealWear HMT-1 head-mounted devices. Shell is also deploying RealWear’s HMT-1Z1 through Honeywell in 12 countries and 24 operational sites. And last year, Walmart announced it was putting 17,000 VR headsets in its U.S. stores for employee training. These aren’t mere pilots. At AGCO, Boeing, and other large manufacturers, augmented reality is a standard workforce tool for a variety of tasks in multiple areas of operation. In the last three months alone, Fortune 500 companies in the news for using AR/VR included Audi (Volkswagen), ExxonMobil, Nissan, and even Farmers Insurance. Deloitte estimates that over 150 companies in multiple industries, including 52 of the Fortune 500, are testing or have deployed AR/VR solutions. The 6th annual EWTS is the proof.

Reality check
It helps that the tech is steadily improving, of course. This was the first year that I walked around AWE and was truly amazed by the quality of immersive experiences I tried. So, here’s a reality check: AR/VR is having an impact across business and industry and it’s not going away. It’s not future tech; it’s now. And it’s not just AR/VR glasses and headsets but body-worn wearables as well, sometimes used in conjunction with VR. These applications warranted an entire day – the third day of EWTS 2019 – devoted to below-the-neck and safety wearables. We’re talking biometric, environment and location tracking, employee ergonomics, partial and full-body exoskeletons—it’s all here today in the enterprise.


Image source: The Verge

Is Digital Transformation for Men? Female Factors in Wearable Tech Design

In 2015, NASA celebrated over 50 years of spacewalking. Four years later, in March 2019, the agency called off the first all-female spacewalk due to a shortage of smaller-sized spacesuits. The walk-back led to a Twitter storm, with women sharing hundreds of stories of their own ill-fitting work uniforms and oversized ‘standard’ gear; but “It’s not just spacesuits,” one woman tweeted: “It’s public spaces like bathrooms, cars, cockpits, office air conditioning, microwave installation heights, Oculus, military fatigues…an endless list.”

In December, I wrote about the phenomenon of patriarchal coding. A feeling that today’s VR headsets were not designed with women in mind set me on a trail of research that revealed I’m not alone in feeling this way and that the majority of the products and systems we use every day are designed by and for men. This phenomenon affects every aspect of women’s lives – it even endangers our lives – and it’s unintentional for the most part, which makes it all the more frustrating. Sexism is so ingrained in our society that women’s unique needs and biology (like the fact that we have breasts) are excluded from reality, even of the virtual kind.

My main point then was that wearable technologies – the body-worn sensors being integrated into organizations’ EHS efforts, exoskeletons taking a load off workers’ backs, and VR headsets being hailed as the future of job training – exhibit coded patriarchy and risk further alienating the female workforce. Wearables that are replacing or supplementing traditional PPE (personal protective equipment) cannot succumb to the same biased or negligent design as have automobiles, office buildings, etc., for the future economy and growth of the workforce depend upon improving job prospects and working environments for women.

The history of man

Women and the female perspective are largely missing from human and world history (as is often the non-western point of view) and entirely absent in the fundamental research underlying the foundations of modern life, including economics and urban planning. The star of the show is “Reference Man,” a 154-pound Caucasian male aged 25 to 30, who has been taken to represent humanity as a whole when it comes to the design of everything from power tools to the height of a standard shelf. Take medicine: Though women process drugs differently, medications have historically been tested primarily on men. Cars: For decades, car safety testing has focused on the 50th percentile male. The most common crash-test dummy is taller and heavier than the average woman, with male muscle-mass proportions and a male spinal column. This is how “standard seating position” was determined. Women, however, sit further forward in the driver’s seat and thus are 47% more likely to be seriously injured in a car crash. In 2011, the US began using a female crash-test dummy, though not an anthropometrically correct one. Testing with a pregnant dummy? Forget it.

Beyond product ergonomics

It’s not just annoying that so many gadgets we use are one-size-fits-men; it’s dangerous. The world is less safe for women because we haven’t been factored into the design of not only physical products but also the software behind everything. Consider navigation apps, which provide the quickest and shortest routes to a destination, but not the safest; or voice recognition and other AI tech, which is male-biased and also becoming indispensable to how we interact with our devices and how systems make major decisions affecting humanity. Google’s voice recognition software? 70% more likely to accurately recognize male speech. Apple’s Siri? When she launched, she could help a user having a heart attack but didn’t know what “I was raped” means. (Side note: the heart attack symptoms healthcare professionals are taught to identify are actually male symptoms.)

Last year, Amazon had to scrap an experimental recruiting tool that taught itself to prefer male candidates for software development and other technical jobs. How did this happen? Because the computer model was trained to observe patterns in resumes from the previous ten years, most of which were submitted by men since the tech world is notoriously, overwhelmingly male. What’s frightening is that in a 2017 survey by CareerBuilder, over half of U.S. HR managers said they would make artificial intelligence a regular part of HR operations within five years. That means women will have to combat unfair algorithms in addition to unconscious bias in order to advance in the workforce. IBM CEO Ginni Rometty says it’s up to businesses to prepare a new generation of workers for AI-driven changes to the workforce. In a world in which AI will impact – and perhaps determine – hiring for every existing job, the fact that women and minorities are disproportionately left out of the teams behind the AI revolution is tragic.

The data gap at the heart of the workplace 

Occupational research has traditionally focused on male workers in male-dominated industries. Few studies have been done on women’s bodies and job environments, so there is little occupational health and safety data for women. The uniforms in most professions are therefore designed for the average man’s body, and the reasons behind trends like the increasing rate of breast cancer in industry remain unknown. Relying on data from studies done on men may explain why serious injuries in the workplace have gone down for men but are increasing among women workers. This despite the fact that, for the last three years, women have been entering the workforce at more than twice the rate of men. (You do the workers’ comp math, employers.)

When we talk about using wearables for EHS applications, oftentimes we’re speaking about body-worn sensors that can detect biometric and environmental data affecting a worker’s health and safety. The software behind these applications might send an alert to the worker or wearer when a reading reaches a certain threshold, but how is that threshold – the danger zone – determined? Say we’re tracking a worker’s exposure to a particular chemical. Women and men have different immune systems and hormones; women also tend to be smaller, have thinner skin, and have a higher percentage of body fat than men—differences that can influence how chemicals are absorbed in the body. Without female-specific data, the threshold at which a wearable device is set to alert the wearer would likely be higher than the toxin level to which a female worker can be safely exposed, putting women at greater risk of harmful exposure. The problem is two-fold: We don’t have data about exposure in “women’s work” and we’re clueless when it comes to women (increasingly) working in male-dominated industries. At this point, it would take a working generation of women to get any usable data on long-latency work-related diseases like cancer.
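The alerting logic the paragraph describes can be made concrete with a minimal sketch. Everything here is hypothetical – the limit values, the function names, and the idea of a female-adjusted limit are illustrative assumptions, not any real EHS product or regulatory threshold – but it shows how a single limit derived from male-only exposure data can fire too late for workers it was never calibrated for:

```python
# Hypothetical sketch of a wearable chemical-exposure alert.
# STANDARD_LIMIT_PPM stands in for a threshold derived from
# male-only exposure studies; the female-adjusted value below
# is an invented illustration, not a real occupational limit.

STANDARD_LIMIT_PPM = 50.0          # calibrated on male workers
FEMALE_ADJUSTED_LIMIT_PPM = 35.0   # hypothetical lower safe limit

def should_alert(reading_ppm: float, limit_ppm: float = STANDARD_LIMIT_PPM) -> bool:
    """Fire an alert when a sensor reading reaches the configured limit."""
    return reading_ppm >= limit_ppm

# A reading of 40 ppm looks "safe" under the one-size-fits-men default...
print(should_alert(40.0))                              # False
# ...but is already past the (hypothetical) female-adjusted limit.
print(should_alert(40.0, FEMALE_ADJUSTED_LIMIT_PPM))   # True
```

The gap between the two calls is the point: until female-specific exposure data exists, there is no principled way to set the second threshold at all, and devices will default to the first.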

No PPE for you

Construction is one of those male-dominated industries in which standard equipment and PPE have been designed around the male body. Though there is little data on injuries to women in construction, a study of union carpenters did find that women have higher rates of wrist and forearm sprains, strains and nerve conditions than their male counterparts. To comply with legal requirements, many employers just buy smaller sizes for their female employees, but scaled-down PPE doesn’t account for the characteristics (chests, hips and thighs) of a woman’s body. Moreover, it doesn’t seem cost-effective for employers to meet the order minimum for those sizes when women make up less than 10% of the construction workforce. Giant overalls are one thing, but the straps on a safety harness not fitting around your body? How is a woman supposed to perform at the same level as a man if her clothing and equipment are a hindrance? If oversized gloves reduce her dexterity, a standard wrench is too large for her to grip tightly, or her overly long safety vest snags on a piece of equipment? Already a minority in the sector, women don’t usually complain about ill-fitting PPE. Instead, they make their own modifications (with duct tape, staples, etc.). And it’s not just women; dust masks and protective eyewear designed for the Reference Man also put many men of color at a disadvantage.

Of course, it doesn’t have to be this way. A standard-sized bag of cement could be made smaller and lighter so that a woman could easily lift it. Exoskeletons might be a solution, but so is going back to the drawing board: Jane Henry’s SeeHerWork, for example, is an inclusive clothing line for women in fields like construction and engineering, fields with lucrative, equal-pay careers and massive labor shortages—fields that need women.

Designing the workplace

Guess what? Men are the default for office infrastructure, too, from the A/C (women tend to freeze in the workplace, which hurts productivity) to the number of bathrooms and stalls (a single restroom with urinals serves more individuals). According to the Bureau of Labor Statistics, women represent nearly two-thirds of all reported cases of carpal tunnel syndrome, which indicates that workstations are less ergonomic for women. Open office plans are conducive to socializing and breaking down hierarchies, right? No, they actually encourage sexist behavior. A 2018 study documenting the experiences of women in an open office designed by men – lots of glass, identical desks, group spaces – found that the lack of privacy created an environment in which female workers were always watched and judged on their appearance. Designers today are beginning to use virtual reality to design factory layouts and workstations, even assembly processes, but that doesn’t mean they’re factoring in female anatomy or putting headsets on women workers to get their input.

I spoke with Janelle Haines, Human Factors Engineer at John Deere, who uses virtual reality to evaluate the ergonomics of assembly, about her experiences performing evaluations on women workers. Most of the people she gets to put in a VR headset are male; however, there are a few female employees available at times for evaluations. “Fitting the job to the worker hasn’t [always] been a focus. Even in the last fifteen years that I’ve been studying ergonomics, there has been a huge shift in learning to focus on ergonomics. It has become a kind of buzz word…There are some jobs that have been at John Deere for years and years, since we started building combines, that aren’t a great fit for women, but going forward with new designs we’re using VR to make sure the workstations and what we design do work for women.” Ergonomics aren’t a new area of study, but Janelle points out a promising shift in thinking and a deliberateness that’s necessary “going forward.”

The future of work: Uncomfortable = unproductive

Smartphones have become standard work tools in many jobs. Men can use the average smartphone one-handed; women cannot (smaller hands). This kind of oversight cannot be carried into the next wave of mobile: Wearable technology. That women have different muscle mass distribution and vertebrae spacing, lower bone density, shorter legs, smaller wrists, lower centers of mass, etc. matters when it comes to the design and application of wearable devices like partial and full exoskeletons, connected clothing and gear, augmented reality smart glasses, and virtual reality headsets. Early decisions in developing transformative technologies can create a weak foundation for the future of that tech.

Already women are at a disadvantage in VR. As far back as 2012, researchers found that men and women experience virtual reality differently and a growing body of research indicates why. Motion parallax (preferred by men) and shape-from-shading (preferred by women) are two kinds of depth perception. What creates a sense of immersion for men is motion parallax or how objects move relative to you, and this is easier to render or program in VR. For women, it’s shape-from-shading, meaning if a shadow is ‘off’ it will ruin the immersive experience for a woman. As shape-from-shading is more difficult to emulate, most VR tech uses motion parallax. Then there are the poor ergonomics of most VR headsets for women (too heavy, too loose, etc.). Why does this matter? Because VR is being hailed as the future of learning and job training; VR is going to be crucial for filling millions of vacant positions and for upskilling the workforce as automation advances. When one half of the population experiences the technology differently than the other half, that’s an unequalizer, especially when all indications point to people spending more time in VR in coming years.

Stop defaulting to men 

The long legacy of researchers overlooking women – not wanting to pay for double the testing – has looming implications at a time when we’re collecting data from more and more ‘things’ and powerful computers are making important decisions for us. It’s bigger than a spacesuit; we’re making decisions based upon biased, incomplete data, feeding that data into algorithms that can exacerbate gender and other inequalities, create risks among certain populations, and encode prejudices into the future. The answer? First, inject more diversity into the labs and back rooms where the future is being designed and engineered. Second, hire female designers and stop using men as a default for everything!



In writing this article, I drew heavily on the efforts and writings of a number of inspiring women, including Caroline Criado-Perez, author of Invisible Women: Data Bias in a World Designed for Men; Abby Ferri of the American Society of Safety Professionals; and Rachael Tatman, research fellow in linguistics at the University of Washington.


The Enterprise Wearable Technology Summit (EWTS) is an annual conference dedicated to the use of wearable technology for business and industrial applications. As the leading event for enterprise wearables, EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. The 6th annual EWTS will be held September 17-19, 2019 in Dallas, TX. More details, including agenda and speaker lineup, available on the conference website.

All the Enterprise News Out of AWE USA 2019

One of the major takeaways from the 10th annual AWE last week was that enterprise is where the AR/VR market is growing. It was clear that there are serious – and real – enterprise applications providing ROI today to both large and small companies. AWE USA 2019 also saw a number of launches and updates from enterprise AR/VR solution providers. Catch up on all the enterprise news below:


Atheer announced expanded support for devices that can control and provide input to smart glasses via gestures. The enhanced support for gestures – achieved with advanced machine learning tech – makes it easier to control more types of smart glasses outside of the limited group of smart glasses with dedicated depth sensors and enhances other modes of interaction. Learn more


In addition to being on track to have over one million Bose AR-enabled devices in consumer hands by the end of the year, Bose – an unlikely enterprise player – is building an industrial Bose AR wearable for loud, noisy and distracting work environments. Learn more at EWTS 2019 Sept. 17-19 in Dallas, Texas, where Bose’s Ilissa Bruser is speaking. Bose will exhibit at EWTS.


Jujotech’s latest solution, Fusion AR with WorkLogic, provides connected workers on the job with quick access to IoT-enabled machine information and remote expert guidance. WorkLogic, an open API, works within Fusion AR to send digital work instructions and checklists to AR glasses/headsets, tablets and smartphones. Learn more


Lance-AR launched at AWE! The consulting and integration company specializes in AR enablement for the enterprise market. Its Enterprise AR Deployment Services are focused on enabling scaled enterprise deployments that deliver real, near-term value with the AR hardware and software available on the market today. Learn more


LogistiVIEW announced its partnership with Fetch Robotics, which combines the AR company’s Connected Worker Platform with Fetch Robotics’ autonomous robotics solutions. The combo enables robot-assisted processes to achieve a “complexity and scale rivaling traditional fixed automation.” It also costs less and is more flexible than traditional automation. Learn more


Logitech’s VR Ink Pilot Edition – still a 3D-printed prototype – is like an oversized stylus that lets you draw and design in virtual reality. You can trace designs in 3D space or sit at a table and draw on its surface. The harder you press on the button or tip of the stylus, the thicker the line. The tech offers more precision than a game controller and is more natural to use for creators and designers. Logitech says it’s close to a final design. Learn more


Qualcomm’s Snapdragon Smart Viewer reference design debuted last week. Built on the Snapdragon XR1 Platform, Smart Viewer is designed to help speed up product development for AR/VR headsets. It takes advantage of the XR1’s processing power to enhance the content AR/VR headsets can offer to consumers and enterprise, distributing the workload and tapping into the compute power of host devices. Additional features like eye tracking and six degrees of freedom (6DoF) controllers unlock even more immersion. Learn more


Munich-based RE’FLEKT announced that the REFLEKT ONE ecosystem now includes Siemens Teamcenter. Siemens customers and business units can easily source live data from the Siemens PLM system for content creation on the REFLEKT ONE platform. The connection should dramatically increase the speed and accuracy of AR/MR content creation. Learn more


Rokid provided a sneak peek at its next-generation mixed reality glasses called Rokid Vision, which are distinguishable from the Rokid Glass (now ready for mass production) thanks to a dual-screen display and 6DoF technology. The sleek design includes an RGB camera, two depth cameras, and a simultaneous localization and mapping (SLAM) module that offloads complex 6DoF calculations from the mobile CPU. Rokid Vision is tethered, requiring you to connect it to a USB-C device with DisplayPort support. Expect the Rokid Vision SDK to be released in the third quarter of 2019. Learn more

Scope AR

Scope AR made a few announcements at AWE, including a new customer (medical device company Becton Dickinson) and an expansion of its integrated AR platform at Lockheed Martin. The company also launched an upgraded version of its WorkLink platform, including session recording. This addition means users can capture and save live sessions between themselves and an expert (the live remote video support calls and AR annotations) for later reference—a great way to retain and pass on tribal knowledge. Learn more

ThirdEye Gen

The creator of the world’s smallest MR glasses (X2) announced a new Software Partner Generate Program intended to expand its developer community and provide exclusive partnership opportunities to individual developers as well as large AR/MR software companies. Learn more


Ubimax expanded its industry-proven Frontline platform to support HoloLens. The integration of HoloLens 2 into Ubimax Frontline extends the benefits of Ubimax’s software into mixed reality environments, making it easy to enrich existing and new AR workflows with holographic 3D objects. Preview here


Varjo was certainly a crowd favorite at AWE, where the company announced and demoed its new industrial-grade headset. Varjo says XR-1 Developer Edition delivers on its promise of making mixed reality indistinguishable from the real world. The video pass-through headset is capable of producing images with a resolution of more than 4K per eye, which Varjo claims makes the XR-1 the only device that can seamlessly blend the real and the virtual. Varjo will begin shipping XR-1, which connects via wire to a powerful PC, to developers, designers and researchers in the second half of 2019.

Varjo has also teamed up with Volvo, which uses its tech to test-drive virtual car designs on the road. Check out VentureBeat for more specs and examples of industrial applications for XR-1. In addition, hear from Volvo’s Amanda Clarida at EWTS 2019.


Wikitude now supports all leading wearable technologies, not only standalone devices like HoloLens but also a new spectrum of tethered smart glasses starting with the Epson Moverio BT-35E. This means users can engage with AR content wearing head-mounted devices connected to 5G smartphones. Learn more


The smart eyewear maker revealed that the Vuzix M400 Smart Glasses are now available for purchase at a cost of $1,799 as part of an early adopters program. The device, however, won’t actually ship until September. With a larger memory profile, improved voice recognition/noise cancelling, a new touchpad, built-in GPS, OLED display, and Qualcomm Snapdragon XR1 at its core, M400 promises improved interactivity, power consumption and thermal efficiency. Learn more

Catch Atheer, Bose, Lance-AR, LogistiVIEW, Qualcomm, RE’FLEKT, ThirdEye Gen, and other leading enterprise AR/VR solution providers at EWTS 2019.



XR in HR: AR/VR for a Different Kind of Training in the Workplace

A report released last year by the Equal Employment Opportunity Commission (EEOC) contained some shocking findings:

  • 45% of harassment claims made to the EEOC are sex-based.
  • At least one in four women experience sexual harassment in the workplace.
  • Around 90% of employees who experience harassment – whether sexual or on the basis of age, disability, nationality, race or religion – do not file a formal complaint.
  • 75% of victims who do report harassment experience retaliation.

The bottom line

Every year, sexual and other types of harassment cost companies dearly in time and money. According to the Center for American Progress, workplace discrimination costs businesses approximately $64 billion annually. Hostile work environments also negatively impact productivity, contribute to high turnover, and harm a company’s reputation. And it’s not just harassment. According to McKinsey, unconscious bias is a 12 trillion-dollar issue, which means we could add $12 trillion to the global GDP by 2025 by ‘simply’ advancing gender parity and diversity in the workplace. Gartner finds that inclusivity is profitable, especially at the executive level—inclusive companies outperform industry standards by 35%, generate 2.3 times more cash flow per employee, and produce 1.4 times more revenue. Evidently, diversity pays in money, innovation, decision making, and recruitment.

In compliance with federal and state laws, Fortune 500 companies and startups alike spend more than $8 billion on anti-harassment and diversity training each year. Nevertheless, the above stats are not improving; in fact, at current rates, it will take over a century to achieve gender equality in the workplace. Lab studies show that today’s methods for diversity training can change a person’s attitude for only about 30 minutes and can actually activate a person’s bias. Harvard studies of decades’ worth of data back this up, showing that diversity training is largely ineffective and even counterproductive.

Corporate diversity programs are failing. Harassment training at work is not making an impact. Only 3% of Fortune 500 companies today disclose full diversity data, while 24% of employees say their superiors fail to challenge sexist language and behavior in the office. What to do?

Current methods

Most onsite sexual harassment training consists of a speaker, video, and/or awkward roleplaying. There are also classroom-style slide presentations, seminars, written content, and online courses. In other words, traditional corporate harassment prevention training is pretty lackluster and unlikely to end a culture of enabling harassers and dismissing victims’ claims. It’s now standard for employers to offer anti-harassment and discrimination training, but bias training for hiring and performance reviews is less common. This is a serious weakness: employees who don’t understand their own biases can’t recognize when those biases influence critical business decisions.

A better way

Virtual reality is gaining traction in the enterprise for job training, especially in industrial environments. Studies show that people grasp abstract concepts more quickly and retain information longer in immersive environments than with traditional training methods. Used by professional athletes and manufacturing workers alike, VR can build muscle memory (ex. operating heavy machinery) and simulate an infinite number of real-world customer service scenarios (soft skills training). But can the technology change attitudes?

Stanford researchers have been studying the impact of VR on human behavior and the medium’s ability to inspire empathy. In a recent study, they found VR more effective than imagination exercises for combating inter-generational bias. Because VR demands less cognitive load yet feels real, it encouraged study subjects with negative group attitudes to adopt the point of view of the “other.” If VR can affect the cognitive behavior at the heart of real social issues, it could be a profound tool for changing workplace culture.

The first time you can actually walk in someone else’s shoes – real use cases of VR for anti-harassment and unconscious bias training


In 2016, the NFL turned to Stanford’s Virtual Human Interaction Lab in an effort to confront racism and sexism in the league, which struggles to retain women and minorities in leadership positions. The Lab had been developing scenarios designed to unsettle the user and engender empathy. The NFL wanted to use these scenarios with league staffers and players, putting them in the role of the victim. In one virtual simulation tested by the NFL, the user’s avatar was an African American woman being angrily harassed by a white avatar. When users reflexively lifted their arms in self-defense, what they saw was their “own” black skin.

Equal Reality

In 2017, Equal Reality gained attention for its VR unconscious bias training. Unconscious bias is the most universal and stifling barrier to women’s progress in the workplace. Examples of unconscious bias towards women are reflected in findings such as:

  • Female employees negotiate as often as men but face pushback when they do
  • Female employees get less access to senior leaders and mentors
  • Female employees ask for feedback as often as men but are less likely to receive it than their male counterparts

Equal Reality develops virtual simulations, in this case workplace scenarios in which users interact, taking on multiple perspectives in order to learn to identify examples of pervasive bias as well as more subtle discriminatory behaviors. In 2018, realizing that paid actors and ordering a bunch of sailors to sit in a classroom talking about behavior were accomplishing nothing, the Royal Australian Navy adopted Equal Reality’s solution. Wearing a headset and holding two controllers, sailors can experience what it’s like to be in a wheelchair: treated differently and excluded from workplace conversations because of a disability.

Through My Eyes

In April of this year, BCT Partners and Red Fern Consulting announced a VR program called Through My Eyes, which trains employees to recognize unconscious bias through virtual scenarios. In one simulation, the user is a bystander, observing how bias plays out in different situations. In another, the user is one of the characters in the scene. Users’ choices and reactions in the virtual environment generate data, which is fed back to them and used to customize the training to each individual.
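The feedback loop described here – record each choice, then steer the trainee toward the area where they struggle most – can be sketched in a few lines. The scenario names, bias categories, and scoring scheme below are invented for illustration; they are not BCT Partners’ actual design.

```python
from collections import defaultdict

# Hypothetical scenario catalog: each scenario targets one bias category.
SCENARIOS = {
    "hiring_panel": "gender",
    "performance_review": "age",
    "team_meeting": "disability",
}

class AdaptiveTraining:
    """Tracks a trainee's responses and picks the next scenario from
    the category where they have scored worst (or not trained at all)."""

    def __init__(self):
        self.scores = defaultdict(list)  # category -> list of scores in [0, 1]

    def record(self, scenario, score):
        # Log how well the trainee handled a completed scenario.
        self.scores[SCENARIOS[scenario]].append(score)

    def next_scenario(self):
        # Unseen categories sort first (average -1), then the lowest average.
        def avg(category):
            vals = self.scores.get(category)
            return sum(vals) / len(vals) if vals else -1.0

        target = min(set(SCENARIOS.values()), key=avg)
        return next(s for s, c in SCENARIOS.items() if c == target)
```

A real system would draw scores from in-headset behavior (gaze, dialogue choices, reaction time) rather than a single number, but the customization loop reduces to this kind of weakest-area selection.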

Vantage Point

Two-time survivor Morgan Mercer founded Vantage Point, a VR corporate training platform that takes VR beyond simple roleplaying to illustrate the subtleties of sexual misconduct in the workplace. Like a Choose Your Own Adventure book, the user’s response to each situation in Vantage Point changes how the scenario plays out. The scenes involve a lot of grey area and are designed to teach both men and women communal accountability. In one simulation, the user is talking with four coworkers, one female and three male, about an upcoming conference in Las Vegas. Trying to discuss her presentation and noticeably uncomfortable as the men begin to engage in locker room banter, the woman is suddenly grabbed by her boss, who tells her to “pack something fitting.” Depending on how you, a witness, respond, the narrative either escalates or deescalates.

In another simulation, set at a colleague’s going-away party, a male coworker approaches the new female manager taking over the position. The user must grapple with what’s acceptable and what’s not, what’s a joke and what crosses the line, and when charisma becomes chauvinism. In the end, he or she must choose between speaking up and calling HR.

Vantage Point has three training modules: Bystander intervention, identification of sexual harassment, and responding to harassment when it happens to you. Last year, Tala (a fintech startup) and Justworks (the payroll platform) piloted the technology. In addition, Mercer draws on scientific research to develop best practice guidelines for the solution, which she hopes will become the standard for sexual harassment training. Though it’s too soon for any hard statistics, Vantage Point is receiving a lot of interest from investors and Fortune 500 companies alike.

Protecting workers

VR doesn’t tell you how to behave; it places you in the proverbial shoes of another, compelling you to empathize with that person because it feels like whatever is happening is happening to you. Doctors today are using VR to better understand the patient experience and improve their bedside manner. Further proof of the technology’s power is its use in PTSD treatment programs and transition programs for soon-to-be-released prisoners. In enterprise, anti-discrimination and harassment training doesn’t have to be a box checked off by HR; with VR, this training might actually end real-world harassment and boost company performance.


Image source: Equal Reality



Home on the VRange: Immersive Tech in Residential Real Estate

Today, the U.S. housing market is nearing all-time highs following a long recovery from the 2007-08 global financial crash, which was fueled (in part) by the collapse of the housing market itself. Despite this, the traditional real estate market is challenged by a number of contemporary trends. Already enduring digital disruption via websites like Zillow and StreetEasy, the residential real estate sector must adapt, adopting emerging technologies to disrupt itself from within.



In addition to limited space and gentrification – major trends jacking up costs in urban neighborhoods – there is an unprecedented demand for ‘single-dweller’ housing in cities as more and more young professionals choose to postpone family life for professional and social pursuits. Startups like WeLive (WeWork) and Dwell offer innovative real estate models that address the anxieties of urban living and new socioeconomic realities. A young professional who once would have scrambled to find a roommate on Craigslist (an early real estate disruptor) now seeks affordable, flexible co-living solutions like Ollie’s micro-living buildings and co-living microsuites in NYC, which come with a built-in social network and great amenities. The future of cities will be small, smart living spaces.

New Consumer

Whereas baby boomers and Gen Xers desired to settle down, millennials – impacted by high student debt and the high cost of urban living – are largely single dwellers, less inclined to marry and start families. And while older generations saw homeownership as a source of wealth, their younger counterparts are less likely or able to buy housing. Meanwhile, in the suburbs outside major cities, there is a glut of McMansions built in the lead-up to the 2007 housing bust but a shortage of modest ‘starter homes’ for young families. Millennials currently represent the largest market to buy and rent homes, with Gen Z soon to follow. As real estate customers, these younger generations expect on-demand information, flexibility, market and price transparency, and ease of transactions. They also value green living and perceive properties with high-quality visual presentation as higher value. These digital natives are increasingly willing to make significant purchasing and renting decisions online.

Informational Parity

In the past, high fees for traditional agent/broker services could be justified because consumers depended upon qualified real estate professionals for access to residential listings. Now, the digitally engaged consumer has a wide range of resources for market information, including websites like Zillow, Realtor.com and Trulia and other platforms that have made market analysis available to the public. The average buyer or renter today can perform sophisticated searches and compare listings – a service that was once the exclusive domain of realtors – all for free, on his or her own time. Though agents no longer have an informational advantage, their role is not obsolete: their specialized assistance remains desirable for negotiating and dealing with inspections, escrow, insurance, co-op boards, etc. With digital competitors firmly entrenched, traditional realtors need to focus on differentiating their services, capitalizing on the fact that although people are digital-first, real estate transactions will always be emotionally driven, human decisions.


Residential real estate has gone through several waves of digital disruption, including the rise of online portals that have come to dominate the real estate search. Online platforms have also helped streamline many purchasing, rental and leasing processes, with some companies now offering fully integrated, end-to-end solutions for buying and selling homes and even generating mortgages. Evidence of the rise of ‘Proptech’ or ‘REtech’ can be seen in newly created CIO positions at real estate firms, while the technological readiness of homes is becoming a key selling point for consumers desiring smart and connected, energy-saving home solutions.

Despite the rise of the Internet and the importance of a home’s digital listing, staging a property – making it attractive to visiting buyers to boost its perceived valuation – remains key to a listing agent’s success in marketing and selling a home. This may include purchasing furniture for an empty space, repainting and refinishing, and/or rearranging items in an existing space, which takes time and money. Though good, old-fashioned yard signs remain a hallmark marketing tool for listing agents, emerging technologies are steadily creeping into residential realty. Common real estate marketing practices like distributing expensive paper brochures, staging properties, and even the construction of model homes will have to be reconsidered as new technologies emerge, offering potentially cheaper and more effective alternatives.


Real estate professionals are finding creative ways to incorporate advancing technologies. The ability to remotely interact with clients, for example, is reducing the need for physical office space and travel, which in turn reduces overhead while permitting wider outreach. Over the last decade, ExP Real Estate has grown into a billion-dollar real estate brokerage in North America, all without housing its agents in offices. Instead, it has built a sprawling organization that meets for training and strategic planning only in the virtual world. ExP may be an outlier in using tech to eliminate the costs of office space, but it points to the powerful potential of emerging technologies to disrupt the real estate sector.

Robust digital strategies, including the effective application of augmented and virtual reality, will be key to realtors’ success in reaching clients with independent access to market intelligence. A real estate transaction is still a high-stress event; the financial stakes are high for both buyer and seller, but immersive technologies can help facilitate efficient communication among all parties and alleviate what is typically an emotionally-charged process. For smaller real estate organizations and independent brokers, understanding the potential of AR/VR will be just as critical as for larger firms if they are to keep pace with technology and compete.


VR Tours

Today, the Internet is a person’s first stop in the search for housing, but it’s hard to make a listing stand out among hundreds or thousands of similar listings online. Listings with VR tours, however, can effectively showcase a property and help hasten a sale or rental without the need to go to dozens of open houses in the company of a realtor. The immersiveness of VR means users can freely explore a realistic rendering of a property from the comfort of home and make an informed offer. Homeowners anxious to close quickly and fetch the best price can have greater confidence in a listing agent who uses high-quality, interactive VR models to market their property.

Camera companies like Matterport and GeoCV make high-quality virtual mapping fast and accessible, producing virtual scale models of properties that can be toured wearing a VR headset or examined from an overhead ‘dollhouse’ view. Lower-quality walkthrough VR models can even be created from photos taken on a smartphone. Of course, for consumers who don’t own a VR headset, VR tours can be enabled for mobile or desktop, and real estate agents are also equipping their offices with VR devices. Out-of-state or simply very busy homebuyers unable to visit a property in person can use VR to tour and re-tour a home from wherever they are, getting answers to the questions normally fielded by an agent on site. In this way, VR can accelerate real estate transactions.

Augmented Agent

An agent’s commission is a predetermined percentage that doesn’t account for the time it takes to close a sale, which means technological solutions that reduce routine informational queries and travel are worth exploring. In some cases, the agent hosting a property is there only to unlock the door and entertain browsing visitors who may not be serious buyers. Augmented and virtual reality are excellent technological stand-ins for a human agent seeking to maximize productivity.

Innovative rental companies like Tour24 use facial recognition technology to grant prospective renters access to apartments – via mobile app – without an agent or tenant present. A beacon-activated informational tour unfolds via smartphone as the potential renter moves through the property. Taken a step further, open houses might come with AR smart glasses used to scan QR codes and view heads-up commentary at various points of interest. In this way, an agent could accommodate a prospective client’s schedule, giving them secure access to the home and customizing the tour without having to personally attend. Even classic marketing practices like planting a realtor’s sign can be taken to the next level with AR: Compass Real Estate, for example, has rolled out beacon-enabled signs that flash at passersby. Similar AR-enabled signs scanned via smartphone could provide property information and statistics accompanied by a prompt to contact the agent.
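At its core, a self-guided beacon tour like this is a lookup: the app detects a beacon identifier and surfaces the content registered for that point of interest. A minimal sketch of that flow follows; the beacon IDs, room names, and narration text are all invented for illustration and do not reflect Tour24’s actual implementation.

```python
# Hypothetical beacon registry for one listing:
# beacon ID -> point-of-interest content shown to the visitor.
TOUR_STOPS = {
    "beacon-01": {"room": "Kitchen", "note": "Renovated 2018; quartz countertops."},
    "beacon-02": {"room": "Master bedroom", "note": "South-facing; walk-in closet."},
}

def on_beacon_detected(beacon_id, visited):
    """Return the tour card for a detected beacon, or None.

    `visited` is the set of beacon IDs already shown this session,
    so lingering near a beacon doesn't re-trigger the same card."""
    stop = TOUR_STOPS.get(beacon_id)
    if stop is None or beacon_id in visited:
        return None  # unknown beacon, or the visitor already saw this stop
    visited.add(beacon_id)
    return f"{stop['room']}: {stop['note']}"
```

A production app would hook this handler into the platform’s beacon-ranging callbacks (e.g. iOS Core Location or Android’s Bluetooth LE scanner) rather than calling it directly.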

Virtual Model Homes and Virtual Staging

It’s challenging to describe and sell a property that hasn’t yet been built. In most cases, a homebuyer or renter is in the market to purchase a vision for a property, so how that vision is presented is key. AR/VR technologies are powerful tools for bringing a future property to life, enabling tours of properties still under construction and making it possible for potential buyers to visualize spaces that do not yet exist. When marketing a home in progress, the immersiveness and detailed accuracy of a virtual reality model can supplement or entirely replace the usual promotional pamphlets and 2D or physical scale models. Brochures might be AR-enabled, while mixed reality could enable effective on-site tours, helping visitors see the potential of an unfinished, undecorated property and come to a decision before seeing the finished product.

When looking for a home, you have to imagine what it would be like to live in an unfamiliar space. While a good agent is able to anticipate what the client is looking for in a property, much of the decision-making process comes down to the initial impression of an open house. Realtors often hire staging companies to bring in furniture and decorate homes before going to market. These companies generally stick to neutral decor, aiming to appeal to the greatest number of interested buyers. A couple with four children, however, seeks very different features in a home than a young bachelor or an older couple with grown children. But what if you could provide a personally compelling visual narrative of the same space to individuals with varying tastes and requirements? You cannot physically rearrange a staged home for every potential buyer, but with AR/VR you can help onlookers transcend a property’s current physical state, which might push them to make an offer. Staging can become a personal experience offering an array of design configurations depending on the client.

Imaging solutions from virtual staging startups like RoOomy, which counts Sotheby’s among its clients, overlay furnishings and interior designs onto virtual models of empty properties captured via Matterport’s technology. An imaginative agent could put in a virtual jungle gym or swimming pool, a pool table, a built-in bar, a home office, etc., customizing the virtual presentation down to the most minute details. The benefits of virtual staging are manifold: money, time, and resources saved on temporary furnishings and on-site meetings; the ability to stage multiple interior design schemes; and the opportunity to cross-market the services of partner businesses like interior designers and furniture manufacturers, who might share the development costs of the virtual staging.


Digital platforms have produced an expectation of ease and access that has disrupted most corners of the real estate industry. A trend towards vertical integration of these platforms threatens to further encroach on the markets of traditional realtors. Real estate professionals must evaluate how emerging technologies like AR/VR can help them compete and create an irreplaceable role for themselves.



AR/VR Innovation at Nissan, Adidas, ADT and More

Emerging technologies are taking root across industries. Learn how a wide variety of enterprises are applying new technologies in this summary of the most recent use cases of AR/VR and wearables:

Fast and Secure Customer Service via AR

Customer support is a key consideration for companies purchasing expensive, mission-critical equipment. When an urgent repair is needed, inefficient customer support practices can unnecessarily prolong costly disruptions to operations. Swiss machinery manufacturer Bobst understands that continuous improvement of its customer service practices is important to guarantee the integrity of its products and earn customers’ loyalty, which is why the company recently deployed the Helpline Plus AR system. The system was intended to boost Bobst’s capacity to respond to customer requests quickly and efficiently, and indeed it has improved the performance of Bobst’s help desk technicians.

Augmented reality (AR) gives Bobst’s technical experts the ability to remotely diagnose and remedy a customer’s problem from anywhere in the world. Heads-up AR headsets deliver a two-way video and audio connection over a secure WiFi connection for real-time, visual remote guidance. With the customer wearing an AR headset, a support center-based technician can inspect the machine in question and give easy-to-follow troubleshooting and repair advice and instructions. The ability to observe remotely and instantly prevents mistakes and confusion in issue resolution and limits the amount of downtime for the customer, generating savings for both vendor and customer and multiplying the value of Bobst’s well-trained techs. Already boasting a strong customer support system, Bobst now sees AR-enabled, see-what-I-see communication as a powerful tool for service support that merits a worldwide rollout.

Continual Innovation on the Assembly Line at Boeing

Building a plane is a massive project. Production efficiency is a top priority, and the scale and complexity of the plane manufacturing process amplify the consequences of a tiny mistake. Boeing has teams that evaluate every minutia of the production process for possible optimization. For example, the company is set for a company-wide deployment of a Bluetooth-enabled smart wrench that measures the torque applied to a nut. The introduction of self-driving work platforms on the assembly line is another significant innovation, promising to cut lost time and improve monthly production of 787 Dreamliners from 12 to 14. That one piece of technology could produce such a boost in output is remarkable, but the impact is only possible in combination with the other innovations Boeing has regularly introduced. Workers on the platforms can now work seamlessly without the interruption of a forklift moving a workstation’s scaffolding, which saves time and reduces the risk of accidents. Many of Boeing’s assembly line workers wear industrial exoskeletons to greatly reduce the strain of repetitive movements, in addition to using connected tools like the ‘smart’ wrench and AR glasses for workflow support.

Boeing’s innovative solutions are created by multidisciplinary teams of Boeing engineers who operate in small ‘innovation cells’ within factories where they use virtual reality to test their ideas. A recent breakthrough in one cell led to the implementation of a 3D-printed, curved ruler that reduced the time needed to execute specialized inspection tasks within a plane’s cabin by over five hours. The greater precision achieved by leveraging emerging technologies to transform existing processes can also reduce the need for some inspections overall. Industry leaders like Boeing continue to astound with their almost continuous development of innovative and effective applications for emerging technologies on some of the most sophisticated production lines in the world.

Hear more about Boeing’s use of emerging tech from Christopher Reid, Brian Laughlin, and Connie Miller at EWTS 2019 this September in Dallas.

VR Helps Adidas Corporate Teams Find Their Stride

In today’s corporate world, departmental silos create gaps in communication, leaving key decision makers to operate with limited information. Visibility and accessibility across departments and disciplines are critical to effective communication and collaboration in an organization – a gap Adidas identified in its own process for bringing new shoes to market.

Adidas’ answer for getting teams on the same page to deliver a shoe from design stage to a retail environment? Virtual reality. The retailer uses software supplied by The Wild to model products, build virtual marketing campaigns, and showcase new shoe designs. Holding meetings wearing HTC VIVE VR headsets allows cross-departmental decision makers to better communicate ideas and demonstrate designs. VR makes inherently spatial design concepts clearer and provides greater transparency into a project overall. It puts stakeholders with varying expertise, from offices that usually have little contact with one another, on the same page, reducing the back-and-forth that can stifle global collaboration efforts. Having VR models of new designs readily available for scrutiny means that flaws can be identified and remediated before a product enters the costly production phase, ultimately speeding the product’s delivery to market. In addition, other areas of Adidas’ business can use the shared 3D library to visualize and iterate products and marketing strategies in virtual retail spaces based upon the company’s real stores.

Hear more about Adidas’ use of emerging tech from Brooks Clemens at EWTS 2019.

VR Marketing: ADT’s Alarming Simulation Gets in Customers’ Heads

Safe at home on your own? ADT’s latest marketing campaign, developed in collaboration with Harte Hanks, brings the danger right into your bedroom. For the campaign, ADT shipped makeshift VR headsets to select households. With these, consumers were able to view an immersive YouTube video simulation of a house fire and ADT’s coordinated response with the local fire department. The virtual experience drops you in the middle of a crisis in motion, simulating the disorientation of waking up in a dark, smoky room as a fire rages within the home. ADT’s campaign proved accessible, educational, and engaging – a powerful emotional trigger to build brand awareness.

Marketing is an excellent space for experimentation and innovation with AR/VR, and campaigns similar to ADT’s can be conducted on a wider scale and at a lower cost in the future once VR headsets become a common household item. Enterprise applications make practicality a priority, but in marketing the incentive is to creatively connect with consumers and make a strong impression by whatever means is most effective. Innovative marketing teams will continue to toy with VR to produce novel, visceral experiences that enable brands to connect with customers.

Haptics for Better Handling

New car designs usually begin with 2D paper models; when a design is selected to advance to production, a 3D clay model is created to convey the design at scale and refine it. Expensive, inflexible, and labor-intensive, clay modeling has been a standard auto industry practice for more than half a century. Now, VR is being widely adopted in car design, enabling designers to review the interior and exterior details of a 3D vehicle model and identify any necessary changes to the CAD model before a physical prototype is created.

VR, however, can fall short of the interactivity of sculpting a clay model, which is why Nissan recently deployed HaptX’s VR gloves. Merging the virtual world with sensory reality, the gloves deliver haptic feedback to the wearer, creating the sensation of physically shaping a car model with one’s hands. Designers wearing the gloves can feel the contours of the vehicle surface, manipulate console buttons and dials, and even grip the virtual steering wheel and drive the car. Though HaptX’s tech is currently limited (ex. you cannot distinguish textures or feel the subtle actions of gears and switches), Nissan’s use of it marks an important step towards more practical applications of VR.

Compressing the Sales Process

Swedish machinery manufacturer Atlas Copco’s AIRNET line is a range of high-quality piping and compressor equipment sold as complete, integrated solutions for compressed air infrastructure. Atlas’ global distribution sales team markets the company’s integrated compressed air systems using components from the range of AIRNET products, but selling such a complex system can be slow and ineffective if the client cannot clearly visualize the system’s functional layout or its installation, operation, and maintenance.

In order to improve the overall sales process and experience, Atlas Copco adopted Eon Reality’s 3D modeling and VR technology solutions. Atlas’ salespeople have been given access to a full virtual range of AIRNET SKUs to present clients with tailored compressed air infrastructure solutions. Using Eon’s tools, salespeople can create and adjust plans according to a client’s wishes without any particular technical expertise. The ability to demonstrate and swap AIRNET components in a virtual model eliminates the need to carry samples (there are over 1,000 AIRNET SKUs!), and complete quotes can be quickly calculated, accompanied by a functional simulation of the system and a bill of materials adjusted with each design iteration. Installers and technicians also get access to cloud-based VR installation guidance. Using VR, Atlas Copco’s sales team is able to shorten the sales cycle and better engage clients while assuring superior VR-enhanced follow-up support.
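Repricing a configuration after every component swap is the mechanical heart of this workflow: each design iteration maps part numbers and quantities to a bill of materials and a total. The part numbers and prices below are invented for illustration, not actual AIRNET SKUs or Eon Reality functionality.

```python
# Hypothetical price list (part number -> unit price in USD).
CATALOG = {
    "pipe-25mm": 18.50,
    "pipe-40mm": 27.00,
    "elbow-25mm": 6.75,
    "compressor-small": 4200.00,
}

def quote(config):
    """Compute a bill of materials and total price for a configuration.

    `config` maps part numbers to quantities; call again after every
    component swap so the quote tracks the current design iteration."""
    bom = [(part, qty, CATALOG[part] * qty) for part, qty in config.items()]
    total = sum(line_total for _, _, line_total in bom)
    return bom, total
```

Swapping a component is then just editing the `config` dict and re-quoting, which is why a virtual model with instant repricing can compress a sales cycle that once depended on physical samples and manual estimates.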

VR for Public Outreach: Clearing the Air About Petrochemical Operations

The towering smokestacks of a chemical plant or refinery can be an ominous sight. Public misconceptions and mistrust pose a serious challenge to companies whose operations often only reach the public consciousness via news of industrial accidents and disasters. This has resulted in an unsympathetic industry image – one of pollution as opposed to cutting-edge tech and critical production – which, in turn, affects recruitment of new generations of talent.

Industry association American Fuel and Petrochemical Manufacturers (AFPM) has pushed companies like Marathon Petroleum and Ineos to improve public relations and boost recruiting efforts by using VR to ‘open’ their plant operations to the general public. VR tour experiences of Marathon’s Galveston Bay refinery and of Ineos’ La Porte chemical plant, made for Oculus as well as the more accessible (and cheaper) smartphone-enabled Google Cardboard, aim at demystifying the industry for consumers. Viewers virtually meet Marathon and Ineos employees, the idea being to dispel doubts about oil and gas operations and inspire students to pursue careers in the field. VR permits a higher level of engagement with and outreach to the public than was previously possible for the likes of Marathon and Ineos.

