Turn Up the Volume: Sound AR and 3D Audio

Why enterprises using augmented and virtual reality need to pay close attention to workplace noise and to the role of sound in immersive experiences


Noise. Or, in more technical terms, audio.

I was recently gifted a pair of the new Apple AirPods Pro. While I’ve had noise-cancelling headphones before, the AirPods Pro offer something new: active noise cancelling plus a transparency mode that uses built-in microphones to let outside sounds in when I want or need them. Now, I’m not an audiophile and this isn’t an Apple ad, but I am very sensitive to sound when I’m working, particularly the sounds of other people in the office. I’m at my most productive while listening to music, but as soon as a colleague takes a phone call, my concentration breaks. (Hell is other people, am I right?)


I’ve been using the AirPods’ noise cancelling feature every day at work, which got me thinking about noise in the workplace and the role of sound in immersive experiences. This is a topic that came up at EWTS 2019, with multiple speakers touching on hearables as well as haptics and voice. As virtual reality headsets improve and major tech companies develop tools that make it easier to create 3D content, the absence of our other senses in XR becomes more and more obvious.

Big, bad noise

My research took me in a number of directions, but let’s start with Millennials in the office. The idea that Millennials have short attention spans is a stereotype; in fact, younger generations are getting more selective about what they pay attention to.1 Today, most people – not just Millennials – use headphones for their own ‘attention management.’ According to Bitkom Research, nearly 50% of people wear headphones to mute their surroundings while 20% do so to focus on work. Millennials, however, are especially concerned about rising decibels and more likely to wear headphones to drown everything out.2 While the government caps safe noise levels around 90 decibels, Cornell University reports that office workers are most comfortable between 48 and 52 dB. To put that in perspective, casual chatting is around 60 decibels.

Why does any of this matter? In the age of digital transformation, organizations are seeking to augment and upskill the workforce. There are now four generations in the workplace, but when it comes to recruiting and training the next generation of workers, the focus is naturally on Millennials and Gen Z. Traditional notions about how people best collaborate and what makes a productive work environment are proving false for younger age groups. Workstyles are changing: AR/VR training and collaboration are on the rise in industry, with numerous studies and use cases showing the benefits of immersive technologies over traditional teaching methods and work tools. Millennials want quiet spaces, and employees are bringing their own noise-cancelling earbuds to the office because sound can, it turns out, significantly impact employee performance and satisfaction. And then there’s occupational hearing loss, one of the most common work-related illnesses, resulting in over $240 million in workers’ compensation every year.3 So, where are the next-gen acoustic solutions for enterprise?

The Sound of Work

In many industries, as you can imagine, employees work in loud environments on a daily basis. OSHA set legal limits on noise exposure in the workplace in the early Eighties, yet the government acknowledges that these aren’t low enough to protect workers from hearing loss. It’s estimated that 22 million workers are exposed to hazardous levels of noise on the job every year. By industry, that’s 46% of manufacturing, 51% of construction, and 61% of mining, oil and gas workers.4 Though employers are required to take steps to protect workers’ hearing, including providing hearing protection and training, investing in low-noise equipment and adjusting work shifts, some employers don’t and 25% of workers just don’t like wearing earplugs or earmuffs.5 The thing about hearing loss is that it’s irreversible, and we now know a lot more about the health effects of noise than we did when exposure limits were first set to 90 dBA over an 8-hour workday. Experts agree that regular exposure to anything over 75 decibels is enough to cause long-term hearing damage. (For reference, most equipment operates above 100 dB and all of the following are above 75: Bulldozer, crane, hydraulic breaker, electric drill, jet takeoff on a tarmac, riveting machine, and newspaper press. Also, keep in mind that a small increase in decibels represents a significant rise in noise and potential damage.) As for health effects, noise has been linked to heart disease, high blood pressure, sleep disturbance, and even adverse birth outcomes.6
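That last parenthetical is worth making concrete: because the decibel scale is logarithmic, seemingly small increases conceal large jumps in acoustic energy. As a rough illustration (a sketch based on the standard definition of sound intensity level, not taken from any source cited here):

```python
def intensity_ratio(db_a: float, db_b: float) -> float:
    """Ratio of acoustic intensities between two decibel levels.

    Decibels are logarithmic: every +10 dB is a 10x increase in
    acoustic intensity (and is perceived as roughly twice as loud).
    """
    return 10 ** ((db_b - db_a) / 10)

# The 90 dB regulatory cap vs. the 75 dB damage threshold above:
print(intensity_ratio(75, 90))   # ~31.6x the acoustic intensity

# Casual chatting (60 dB) vs. a comfortable office (~50 dB):
print(intensity_ratio(50, 60))   # 10x
```

So a jump from 75 to 90 dB, which reads like a modest change on paper, is more than a thirtyfold increase in the sound energy reaching the ear.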

The truth is, we don’t fully understand the impact of noise on workers. Scientists are realizing that sounds well below 75 dB (and over shorter periods of time) contribute to long-term hearing loss and something called ‘hidden hearing loss’—a permanent reduction in neural response (loud noise kills ear nerve endings). I won’t go into the ecological devastation wreaked by human-created noise, but for work safety purposes, noise-induced hearing loss also reduces one’s ability to hear high-frequency sounds like alarms and to understand speech, impairing communication. If you’re wondering why a worker would neglect to wear hearing protection, current devices (foam and silicone earplugs, earmuffs, etc.) leave a lot to be desired. One major shortcoming is the need to remove them in order to communicate with others. That’s where active noise cancelling comes in.

The ear is where it’s at

Hearables are progressing rapidly thanks to the convergence of low-power components, smaller AI processors, advanced microphone arrays, and more. Touch and gesture controls are improving, and many hearables now have integrated voice assistants, perform real-time language translation, and can be controlled via smartphone. Sensear, for one, makes intrinsically safe in-ear earplugs and headsets for industrial users. In addition to noise cancelling, the company’s SENS technology isolates and enhances speech, so the wearer remains aware of her surroundings while still being protected from harmful background noise. Every ear is unique and each person hears differently, so it makes sense that hearables are becoming more customizable. Still, there is plenty of room for innovation, increased connectivity and additional capabilities to bring this familiar wearable form factor into offices and factories alike. Besides intelligent noise control, earbuds and earplugs could be equipped with any number of sensors that track location, movement, biometrics and even emotion. (The ear is actually a good spot for monitoring pulse and electrical brain activity.) Imagine a hearable that cancels out certain sounds to reduce stress, or an in-ear device that lets you interact with artificial colleagues like AGVs and collaborative robots—the hearable category has great potential beyond noise cancelling, active or not.

Audio AR

The “last piece is audio,” as Raytheon’s Kendall Loomis said at EWTS 2019. At this year’s EWTS, Boeing, Con Edison and ExxonMobil all brought up hearing protection, but ‘audio wearables’ can also augment workers to help them work faster and smarter, with sound and/or auditory information changing based on the user’s context. At UPS, for example, audio wearables tell workers what to do, no training required. This is essentially audio AR, overlaying auditory information onto the user’s environment, and it can be as simple as enabling a worker to hear a pre-recorded guide – one of your expert employees talking through a process – while on the job. AR isn’t purely a visual play; audio can be modified in real time based on tracked data, location or task, delivering safety, machine and other job-critical information from sensors or an ERP system right into the worker’s ear.

Not surprisingly, Bose is taking a sound-first approach to AR with Bose AR-enabled products and apps that deliver tailored audio content to users’ ears rather than to a small screen. Bose Labs exhibited at EWTS 2019 to see how its sound technology might assist industrial workers and to meet enterprise AR hardware makers. See, immersive technologies are so effective for training because they’re experiential. Sound is a major part of the human experience, and the combination of visual and audio AR could be twice as powerful for the workforce.

3D Audio

Audio AR adds an extra layer of information, but what about sound in the virtual world? If VR is going to completely replace traditional enterprise training (and probably design) programs in the future, then it had better be as close to the real thing as possible. Since sound helps us orient ourselves in space (auditory spatial awareness), truly realistic VR must be able to simulate sound localization.*

*Occupational hearing loss negatively impacts spatial hearing, as well.

In case you’re wondering, surround sound is not the same as 3D audio because the listener and sound sources are, for the most part, in fixed positions. 3D audio, on the other hand, is full-sphere surround sound that moves along with the listener in the physical environment. In a 3D soundscape, the user can tell where he is relative to the noises around him, so he could, for instance, sense something happening behind him. For binaural recording, sound engineers typically use a dummy head, placing microphones in the dummy’s ears, but the listener is again in a fixed position. Another approach adds speakers at different heights. The complexity lies in the fact that your aural experience changes as you move through the world.

Ambisonic mixing for virtual spaces requires sound to adjust according to where the user and sound sources are in the VR world: the user’s position changes relative to the sound sources, and the sources themselves (objects) are also in motion. Today’s VR audio tech uses specialized recording systems and algorithms to mimic lifelike sound. There are some ambisonic microphones on the market, as well as custom rigs of omni- and bi-directional mics, and a few audio companies have released encoding formats supporting 3D audio, but lifelike audio VR remains a few years away.
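To make the moving-listener problem concrete, here is a deliberately simplified sketch in Python. It is not real ambisonics or HRTF processing (the function name and conventions are my own illustration); it derives per-ear gains for a single source from just the two simplest cues: inverse-distance attenuation, and left/right panning based on where the source sits relative to the direction the listener is facing.

```python
import math

def stereo_gains(listener_pos, listener_facing_deg, source_pos):
    """Toy spatial audio: (left, right) gains for one sound source.

    Positions are 2D (x, y); facing is in degrees counterclockwise
    from the +x axis. Real 3D audio engines use HRTFs and ambisonic
    encoding; this only models distance falloff and panning.
    """
    dx = source_pos[0] - listener_pos[0]
    dy = source_pos[1] - listener_pos[1]
    distance = max(math.hypot(dx, dy), 1.0)  # clamp to avoid blow-up
    attenuation = 1.0 / distance             # 1/r amplitude falloff

    # Source angle relative to the listener's facing direction:
    # 0 = straight ahead, +90 = directly to the listener's left.
    angle = math.degrees(math.atan2(dy, dx)) - listener_facing_deg
    pan = math.sin(math.radians(angle))       # +1 left ... -1 right

    # Equal-power panning keeps total energy constant as pan changes.
    left = attenuation * math.sqrt((1 + pan) / 2)
    right = attenuation * math.sqrt((1 - pan) / 2)
    return left, right
```

A VR engine would recompute gains like these every frame from head-tracking data, which is exactly the “user and sources in motion” problem described above: as the listener turns or walks, the same source must smoothly migrate between the ears and fade with distance.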

Beyond Sound

We’ve been using audio wearables for decades: There was Panasonic’s AM radio designed to be worn as a bracelet in the early Seventies, and the Sony Walkman, which turned 40 in July. It’s now a habit to carry sound around with us, on our bodies, and yet sound is a secondary consideration in AR/VR. From protecting workers’ hearing against occupational hearing loss to letting a technician in training hear exactly how a healthy engine should sound or simulating all the sounds of assembly before a worker ever hits the plant floor, getting sound “right” in extended reality will only amplify the technology’s effectiveness in enterprise. And who knows, the addition of touch and smell might eliminate traditional job training forevermore.


  1. Prezi
  2. Oxford Economics
  3. CDC
  4. NIOSH
  5. Hearableworld
  6. The New Yorker



Enterprise Wearable Technology Summit 2020

The 7th Enterprise Wearable Technology Summit (EWTS) is October 20-22, 2020 in San Diego! Join hundreds of Fortune 1000s to try out the latest in wearable tech, including AR/VR/MR, body-worn devices, and even exoskeletons, and to learn how today’s biggest companies are profiting from and scaling the technology. More details, including the program and the largest expo of industrial AR/VR/wearable tech, to come on the EWTS 2020 website.

Using XR to See Underground: Interview with Arcadis’ Allison Yanites

Before EWTS 2019 went down last month, I had the chance to interview one of the event’s thought leaders. Check out our interview with Allison Yanites, Immersive Technology Lead at Arcadis, the global natural and built asset consulting firm.

Emily (BrainXchange): To begin, could you provide our readers with a little background on yourself and what you do at Arcadis? Also, when did you first encounter AR/VR?

Allison: I am the Immersive Technology Lead at Arcadis North America. I am currently working to find different ways that augmented reality, virtual reality and other related technologies can improve customer experience, health and safety, and quality of life. Before this role at Arcadis, I worked as a geologist on environmental remediation projects: understanding subsurface conditions such as layers of soil and rock, whether any groundwater or soil contamination is present, and whether impacts are static or still moving below ground. A big piece of that work was creating 3D visualizations of subsurface data to help our clients and stakeholders better understand the full picture of what is happening below ground and help determine the next steps to clean up any contamination.

A few years ago, our team developed a mixed reality visualization of one of these environmental sites, where our stakeholders could see and interact with a holographic image of the groundwater contamination of the site. That was my first real experience with immersive technology as an industry application, and it was a gamechanger for me. Working with our digital team at Arcadis, I wanted to look beyond just holographic visualizations of environmental models and see how much we can do with AR/VR across all of the types of programs Arcadis is involved with, how we can use immersive and 360 technology for design, engineering, project and management services across all markets.

E: So, you really start at the beginning of a project, with touring a site? 

A: It depends. On some projects, a lot of data has already been collected, such as sites that have been monitored for decades; on other projects we are collecting data in an area for the first time. Either way, we are taking a large collection of data and trying to understand the complex geological and chemical patterns underground, and ultimately, determine the best ways to remove chemical impacts at the site.

E: Can you speak a little more about Arcadis’ business and its customers (who they are)?

A: Arcadis is a natural and built asset consulting firm. We work in partnership with our clients to deliver sustainable outcomes throughout the lifecycle of their natural and built assets. We have 27,000 people in over 70 countries with expertise in design, consulting, engineering, project and management services, and we work with a wide range of markets and industries, including oil and gas, manufacturing, transportation, infrastructure and municipal water.

At Arcadis, our mission is to improve quality of life in the communities we serve. Whether that is by ensuring the environmental health of communities or reducing the amount of time people spend in traffic, we develop our solutions with our client’s end-users in mind. To design the most impactful solutions, Arcadis has committed to digitally transforming our business at every level of our organization. That includes training our staff on new digital capabilities, using cutting-edge technologies and then applying our subject matter expertise. We then use these tools and skills to better understand, and address, our client’s needs.

E: How is Arcadis using augmented and virtual reality? What pain points does AR/VR address?

A: Arcadis is using augmented and virtual reality in different ways across a variety of projects. Our immersive technology practice includes on-site visualization with different types of headsets, 360-degree photos, video and virtual tours, and remote assistance with AR capabilities. Generally, immersive technology is addressing four main pain points. The first is increased safety — for example, we can share access to difficult-to-reach sites with 360-degree imagery or livestream video, and bring additional staff or clients to the site virtually. Ultimately, we must keep people safe while still collecting as much data as possible. The second is speed of decision making — for example, using AR to overlay a 3D design over an active construction site helps quickly identify any differences between the plan and the current project status. The third is cost reduction — for example, we can now virtually connect project teams and clients to remote sites. This reduces travel and helps reduce the costs associated with delayed communication or unplanned rework. And the fourth is enhancing stakeholder communication and collaboration — for example, virtual 360-degree site tours and remote assistance are virtually bringing staff, clients, and stakeholders to the site where they can participate in discussions about site conditions or questions on certain issues. AR/VR visualizations also greatly improve our communication of design plans or subsurface data visualization.

E: I imagine there are a lot of new demands for the built environment, especially with climate change. Do you think that AR/VR are unleashing more creativity, enabling designers to do things they’ve never done before?

A: Absolutely. There is a lot of power in using AR/VR to understand how the environment is changing, and how to prepare communities and businesses accordingly. AR and VR visualizations can communicate designs to stakeholders that address sustainability needs or flood and storm resilience. AR/VR technology also gives designers the flexibility to share their designs with stakeholders more clearly and effectively, with a greater level of detail, than ever before. When you use AR/VR to see first-hand how a flood level impacts homes and businesses, it takes on a greater urgency than it may have before. We are also using AR/VR technology for training situations, and many training scenarios are relevant to our changing environment and being prepared for the future.

E: How have customers received the technology? Was it easy for them to use? Have any clients adopted AR/VR for themselves?

A: We have had success applying immersive technology services, and it’s exciting to see this technology expand and scale in our industry. At the same time, we are continually working to apply the right technologies for the right projects, and find new ways to solve problems for clients. These technologies are a moving target; they evolve so quickly. It seems like every few weeks there is a new product, software/hardware capability, or integration that opens new opportunities for how AR/VR can be applied. In addition to gaining traction and adoption with the services and capabilities we have established, we are constantly evaluating how we can solve emerging client challenges with new and immersive technology.

E: What was piloting like? Was there an actual test period and were there any major challenges to using the technology at Arcadis?

A: Several years ago, we started with a few different pilots and tested different AR glasses, VR headsets, 360-degree cameras and various software programs to develop content. Each of the solutions or services that we have explored has been rigorously tested, and if appropriate is then developed internally or in partnership with our clients. We are still doing pilots because the space is evolving. With one particular workflow there might be an update in either the hardware or the software that offers a new opportunity, so we’ll go in and test that. The pilots are really tied to the problems we can solve and the solutions we can bring to our clients, working with them to customize what we do with these different tools.

E: Where does the content come from?

A: So far, we have developed everything on our own. We use plugins and software to create content, but the content is coming from our own project locations and 3D designs, like wastewater treatment plant designs, environmental remediation sites or highway infrastructure designs. We already work in those spaces so we have the data sets, which we can use to create the AR/VR visualizations. Through our FieldNow™ program, we have also committed to collecting data 100 percent digitally, which means we can now apply this technology to more projects than ever before.

E: How do you measure the ROI of AR/VR at Arcadis?

A: ROI varies from project to project, but does generally come back to the four KPIs: Increased safety, speed of decision making, cost reduction, and enhanced stakeholder communication and collaboration.

E: How has Arcadis handled the security part of adoption?

A: Arcadis takes data security very seriously. Our group works with our IT department to thoroughly vet each technology against industry security standards. Additionally, our use of each of these technologies is also typically evaluated by our clients to make sure it is compliant with their security protocols. Security is always a leading factor in any new technology we adopt.

E: Are there any applications you’re hoping to test in the future at Arcadis?

A: We are constantly evaluating what we can do to exceed our client’s changing expectations. As new applications and technologies become more accessible, we want to make sure we are equipped to address both traditional and emerging client challenges.

Beyond finding new ways to integrate software platforms, we are starting to leverage the internet of things and wearable technologies more frequently. As a large company that is involved in many different industries, Arcadis uses a lot of different software programs. For each software program (3D design visualization, data analytics, program management system, etc.), we develop unique workflows to create AR/VR and 360 visualizations and/or integrate with a program management system. We are always looking for new software products or software updates that make it easier to integrate AR/VR into our daily routines.

E: With sensors in the environment and wearables, I assume you’re gathering new kinds of information for these models?

A: Absolutely. We are using sensor data, which provides real-time results that can be fed into our data analytics platforms and visualized in different ways. We are also excited about platforms that can house data and be updated in a seamless way, so a whole project team across the globe has access to one central data set.

E: What are your greatest hopes for this technology?

A: As immersive technology becomes more mainstream and awareness keeps spreading about its value for industry, it is exciting to see how many ways immersive technology is adopted and applied. This technology is still so new, I am excited to follow its evolution and see what will be possible in five, 10 and even 30 years. My hope is that as the technology starts to deliver more and more value to businesses, we also see increasingly creative ways to improve quality of life in communities around the world.

All of the News Out of EWTS 2019

The 2019 Enterprise Wearable Technology Summit (EWTS) took place last week in Dallas. While it was the largest EWTS yet – with over 1,000 attendees and 60+ exhibitors – the show still managed to retain its characteristic intimacy. 2019 was also the most diverse year on the EWTS expo floor, which showcased a variety of hardware and software including AR/VR devices, exoskeletons, haptic gloves, and training platforms. Past and longtime EWTS attendees caught up, first timers were exposed to the top industrial AR/VR and wearable solutions, and everyone took away something to fuel the next year of innovation at work.

While this event has always been about the end user, the immersive/wearable tech market has also grown here. This year, a number of exhibitors chose to announce new partnerships and launch products at EWTS. Here are all of the announcements that came out during the show:


ThirdEye launched its X2 mixed reality glasses, a lightweight enterprise AR headset retailing at $1950. What sets the X2 apart is its weight of only 9.8 ounces (the lightest on the market). Designed for small, mid-sized and large-scale companies, the X2 is strictly for enterprise. (For comparison, HoloLens 2 costs $3500 and the Magic Leap One $2295.) Learn more


Iristick announced that its Iristick.Z1 smart safety glasses are now compatible with iOS smartphones (in addition to Android), which is great news for early enterprise adopters with a strict iOS company policy. Learn more


Qualcomm announced a new initiative designed to help XR companies accelerate the development of business solutions. The Qualcomm XR Enterprise Program – part of the company’s broader Qualcomm Advantage Network – connects XR headsets based on the Qualcomm Snapdragon XR Platform together with enterprise solution providers in a broad array of industries. Learn more


The three companies announced and showcased a new, fully wireless PC concept called Boundless XR, a precursor to Boundless XR over 5G—an untethered walking VR experience that will enable users to configure and explore a range of Cadillac vehicles in high definition thanks to ZeroLight, without the need for external sensors.

Using a Pico prototype headset, Qualcomm replicated the high bandwidth and low latency of 5G at its booth by rendering on a PC and streaming directly via a local 60-GHz wireless connection. The 5G version will move from local hardware to 5G Mobile Edge Compute (MEC). Learn more


The company launched a new look, new site and new product at EWTS 2019. “EWTS is where the seed of the idea was planted by innovative companies saying, ‘This is the solution we need’. It’s the perfect place to share our new SaaS product with the XR industry.” Learn more


Jujotech introduced and demonstrated Fusion Inspect, “the first market solution for smart headsets with rich media customizable reports and fully integrated with remote assist (Fusion Remote).” Fusion Inspect provides hands-free inspection and reporting to improve AEC and Telco productivity. Learn more


TeamViewer announced the integration of TeamViewer Pilot remote connectivity platform with RealWear, Vuzix and Epson smart glasses. Learn more


If Magic Leap had to choose one event to establish itself in enterprise, EWTS was the right choice. At the same time, Magic Leap announced Concepts, “free apps with limited functionality meant to garner feedback, experimentation and support from the broader [ML] community.” One new concept already released is the “Wall Street Journal Stock Data Concept” from The Dow Jones Innovation Lab. Learn more


The company announced the new BeBop Sensors Forte Data Glove Enterprise Edition, a comfortable, one-size-fits-all, wireless VR/AR haptic glove built for business. The glove provides real-time haptic feedback allowing users to “feel” textures and surfaces and move around digital objects. BeBop also won a U.S. Air Force contract. Learn more


Epson’s see-what-I-see remote assistance solution Moverio Assist is now commercially available at a low cost. Using the Moverio smart glasses, wearers can view high-quality instructions, photos, PDFs and videos while communicating with remote personnel in real time. Learn more



Education, not Automation, is the Problem: 21st-century Job Training with XR

Many people fear the day when drones, robots and self-driving cars will replace human workers. This is understandable, and it’s not only delivery drivers who have reason to fear—computer algorithms (artificial intelligence) could potentially replace entire departments of human employees. Though many industries and professions are experiencing existential crises, the future will not be jobless. It will, however, be quite different, with new jobs and more employee turnover (in pace with advancing technologies) requiring humans to quickly and effectively train and retrain for new roles.

Today’s workforce is aging. Simultaneously, current workers and new members of the workforce (Millennials and soon Gen Z) are being forced to compete against cheaper labor around the world and against technology and automation in a rapidly changing global (and gig) economy. There isn’t a lack of jobs; in fact, as certain jobs are being automated, other positions requiring higher (and often more technological or advanced) skills are being created. Today, millions of jobs requiring a trained human touch are going unfilled because there aren’t enough workers equipped with the skills to fill them. The problem isn’t automation; it’s education. What we have is a training problem, and the solution is extended reality. This is why some of the world’s biggest employers are going virtual to build the workforce they need now:


Walmart

Walmart is the largest company in the world by revenue, with 3,500 Walmart Supercenters in the U.S. alone and 2.2 million employees worldwide. How does a company of Walmart’s size and global presence maintain quality training across its stores? Virtual Reality.

Walmart isn’t just testing VR for training. With the help of STRIVR, the retail giant has been implementing VR training, purchasing 17,000 Oculus Go headsets in 2018 to roll out a nationwide virtual reality (soft skills) training program. 10,000 Walmart employees are using the VR platform already, and it doesn’t seem like adoption is slowing down. By putting trainees into simulations of real-life situations, Walmart has been able to reduce the travel costs associated with traditional training facilities. The company is even applying VR to determine promotions, incorporating the tech into the interview process to help identify employees with managerial potential.

Hilton Hotels

In addition to using virtual reality to allow guests to preview rooms, the hospitality giant is turning to immersive technology to modernize training for its upper-level employees. Last year, Hilton worked with a third party (SweetRush) to film a 360-degree VR experience in a full-service Hilton hotel. The simulation allowed corporate staff to experience a day in the life of a Hilton employee, the idea being to help them understand the physically challenging and complex tasks of day-to-day hotel operations. Instead of flying executives in from across Hilton’s 14 brands (Hilton operates in 106 countries and territories), the company can have them put on a VR headset and experience what it’s like to clean a hotel room like a real member of the housekeeping staff.

In this case, Hilton wanted executives to get a sense of the complex demands made of the company’s staff at its 5,300 properties and to encourage empathy. Role playing is a key component of hospitality training; relying on a network of trainers to deliver bespoke training around the world, however, is expensive and doesn’t ensure consistent training across the Hilton brand. The company is planning to expand its use of VR training, including piloting a conflict resolution program designed to improve service recovery.

Preparing for danger

JLG Industries describes itself as a manufacturer of “mobile aerial work platforms.” If that doesn’t make your heart race, then I guess you don’t suffer from acrophobia. JLG designs, builds and sells equipment, including electric boom and scissor lifts used on construction sites worldwide. From a quick Google search, it’s evident that poor training on these machines or a mistake in assembly can lead to a lawsuit, so it’s not surprising that JLG is using VR to train operators of its boom lifts.

How does one safely learn to operate a work platform perched up to 185 feet in the air on a giant boom arm? With JLG’s networked training program, built by ForgeFx Simulations, multiple trainees in multiple locations can operate virtual boom lifts in the same virtual construction site without ever leaving the ground. JLG customers could also benefit from the program, which is much safer than training on a real machine and more efficient to boot.

In a similar use case, United Rentals, the world’s largest equipment rental company, said it would begin offering VR simulators this year through its United Academy. United began testing VR for training new hires at the end of 2016. Instead of lectures and pictures of construction sites, VR was able to transport them to the job site. Standing on the edge of the virtual job site, employees were given two minutes to observe the environment and identify any missing equipment. The user then had to make his or her pitch to the construction boss (an avatar). In these early tests, United was able to shorten its typically week-long training program by half.

More recently, it was reported that United Rentals is offering VR training to help its customers teach their own employees how to operate scissor lifts and other machines.

Six Flags

Six Flags, a global leader in the attraction industry, employs nearly 30,000 seasonal workers to move millions of people through its parks during the busiest times of the year. That means every year, Six Flags must train tens of thousands of people to work in admissions, retail, ride operations, and more. In 2015, Six Flags began seeking alternatives to traditional instructor-led training, which wasn’t adequately preparing temporary hires. Fearing that PowerPoint presentations and low-tech audio/visual approaches weren’t adding value, Six Flags injected tablet technology into training at two of its properties. The learning module paced training through discovery, introducing videos, a simulated tour experience, safety quizzes, and more using gamification. In post-pilot surveys, 89% of participants believed the tablets improved their understanding of the training material and 91% agreed that Six Flags needs more tech in its learning and development programs.

Further transitioning from instructor-led to more engaged training, Six Flags has since added AR and VR to the mix, creating a virtual park tour with guest hot spots that trainees can experience without physically leaving the classroom. You can imagine the VR tour is useful at Six Flags properties in colder climates or under expansion. The ultimate goal for the theme park giant is to increase engagement and improve retention by creating a more realistic job preview process in onboarding.

According to a new study by BAE Systems, 47% of young people (aged 16-24) believe their future job doesn’t exist yet. BAE also predicted that the top jobs in 2040 will involve virtual reality, artificial intelligence, and robotics. Are students learning the skills that will be in demand 20 years from now? Only 18% feel confident that they have those skills, while 74% believe they aren’t getting enough information about the kinds of jobs that will be available in the future.

Some of the world’s biggest companies are heavily investing in augmented and virtual reality training solutions. Not all have purchased headsets in the tens of thousands like Walmart, but companies that want to maintain a competitive edge are looking to immersive technologies. AR/VR – the ability to create any number of lifelike simulations without real danger or risk, to simulate any working environment or situation anytime, anywhere – is a gamechanger not just for the organizations trying to bridge today’s skills gap but especially for young people anticipating the jobs of tomorrow that don’t yet exist.


Image source: VR Scout

Building the Future of Exoskeletons: Meet Dr. William G. Billotte

He’s working with BMW, Boeing and others to introduce standards and raise the bar in the exoskeleton market: Meet Dr. William G. Billotte, Physical Scientist at the National Institute of Standards and Technology (NIST) and Vice Chairman of the ASTM F48 Exoskeleton and Exosuit committee. I got to interview Dr. Billotte on the importance of standards and the fundamental work of NIST. Read our conversation below (full bio at the end):

Emily: To begin, could you provide a little background on yourself and NIST? When did you first start working on exoskeleton tech?

W: My background is in engineering and biology, with bachelor’s and master’s degrees in engineering and a PhD in biology, and I’ve been working in the biology/engineering area for probably 17 or so years, providing scientific and technical guidance to different federal agencies, first responders and other organizations. I’ve worked in a number of different areas: biological detection, first responder equipment, critical infrastructure protection, etc. I’ve been in the exoskeleton area since around 2014. I work for a federal organization, the National Institute of Standards and Technology (NIST), part of the Department of Commerce (see here for some history). I’ve been here since 2009 as a physical scientist.

I worked for the Department of Defense before I came to NIST, and before that, starting in 2002, I was a consultant here in the D.C. area as a bioscience advisor.

E: What is ASTM? How did it form and who is involved?

W: ASTM is an international standards development organization and there are a bunch of standards development organizations. It’s a non-profit. NIST works with a number of similar organizations across the world. ASTM is where we set up the F48 Committee on Exoskeletons and Exosuits around 2017. We talked to a number of different standards development organizations and it seemed like the best fit was with ASTM. I’m the Vice-chairman on the F48 Committee but I’m not an employee for ASTM; it’s a volunteer-type thing. Everybody has their day job and does standards also.

Here is a link to a recent paper describing the development of ASTM F48.

For your reference, there is legislation that encourages federal agencies to use and participate in voluntary consensus standards [National Technology Transfer and Advancement Act (NTTAA), Public Law 104‐113.]

E: Are the companies actually building and using exoskeletons a part of F48?

W: We’ve got around 130 members. Anyone can join. We have meetings about twice a year face-to-face and then meetings all year long sort of how we are now. We’re trying to get standards out there that meet the needs of industry. That’s how standards work in the U.S.; they come from the ground up. If you look on your computer, the USB port is just one example of the many standards that people use every day and rely upon. Similarly, we want standards so that exoskeletons can be tested and manufacturers can easily demonstrate to their users that they’re safe and reliable. We want some guidance out there, like we just passed one standard for labeling exoskeletons. How do you put labels on these and give the user or buyer some information? That was a standard to help the manufacturers label their products and provide the right info for the user—very basic stuff right now. We’re still at the beginning stages of getting standards out there for exoskeletons. It’s an exciting time because there’s a lot to do.

E: What is the exoskeleton market like today? 

W: Bobby Marinov, who is also on the F48 committee and runs a website called the Exoskeleton Report, has written a number of articles about this on his site, in Forbes and other places. He has a good snapshot of the market, which is this: In the past two years, you’ve gone from 20 or so exoskeletons being used in the automotive industry in a few places in the U.S. to almost 1,000 worldwide, and that’s just the auto industry and mainly on the assembly line. Chris Reid’s team at Boeing has done a tremendous amount of work in this area, too; Chris is actually the leader of one of our subcommittees and he’s very involved in the ergonomics community. We’re having a face-to-face meeting at the Human Factors and Ergonomics Society in Seattle in the Fall.

E: What counts as an exoskeleton?

W: You’ve gotten to the hard question here. We struggled with this for at least two years. Even before the committee was set up, we were working on how to define the term exoskeleton and how an exoskeleton is any different from a smartwatch or smart clothing. Why is a smartwatch not an exoskeleton? It augments you, gives you different capabilities, you wear it… We had lots of discussions like that. When I see one, I know what it is, but how do we define it? That’s how we got to a definition: a wearable device that augments you physically through mechanical interaction with the body. (The ASTM standard definition of an exoskeleton is “wearable device that augments, enables, assists, and/or enhances physical activity through mechanical interaction with the body.”)

We’re not trying to exclude anything. For example, there is an exoskeleton in the consumer market that helps you to ski, but there aren’t a lot of products in the consumer space (that’s the only one that I’m aware of). And we don’t use the word partial; we just say it’s an exoskeleton that just happens to be for the upper body like those that help for overhead work. Because we’re not thinking about it as a giant Iron Man suit. There’s another one, a glove that assists you in grabbing—that’s an exoskeleton.

E: What are the top 3 industrial sectors where exoskeletons stand to have the greatest impact?

W: The big three groups are industrial, medical and military. I think these are three areas where exoskeletons are going to move forward the fastest. From what I’ve seen so far, there has been a big drive in the manufacturing sector like automotive, airplane manufacturing, those types of environments. There are some possibilities in the construction industry, but it hasn’t gone as far as we’ve seen in automotive. Another great possibility is the agricultural sector. Think of anything that involves hard physical labor, a task where you have to lift something, or where there is an awkward static posture; those give you a lot of opportunities. Really, the value of exoskeletons comes down to economics: Work-related injuries, musculoskeletal disorders, overexertion—these cost billions of dollars every year. It’s really easy to justify, which is why big and small companies are looking at this. It keeps workers safe and on the job, reduces the risk of injuries, and workers can do higher quality work for a longer period of time. That’s the potential. Do we have concrete evidence for every exoskeleton? No, we need to do a lot more studies, especially longitudinal studies, but there are enough studies out there that you can see the potential.

E: Where are you right now with exoskeleton standards and why are standards so important?

W: Standards are so important to organizations and countries because they help shape a marketplace so that you can have reliable products, safe products, and the ability to sell in a fair-trade type situation on a worldwide scale.

E: Are there any studies to back up the value of exoskeletons in industrial workplaces? How do you test the devices? 

W: NIST is a metrology institute. We do research on how to measure things and help set the measures used by everyone in the U.S. We compare those measures to other institutes around the world. NIST is developing test methods; and so, yes, we are doing some testing but we’re not testing the exoskeletons to test the exoskeletons per se; we’re doing testing with the exoskeletons to figure out how we can test all of them. We’ve gotten a few exoskeletons and developed some load-handling tasks and run a number of test subjects through to test the test method, and that is being documented. That test method will then go into our ASTM F48 committee to get massaged some and at some point it will get voted on and hopefully become a standard.

E: The exoskeletons that companies can buy today haven’t gone through this testing. Is it kind of like the Wild West right now?

W: It’s not exactly the Wild West. There has been a lot of testing, but everyone has done their own testing. That is the power of developing a standard test method, because then you can compare devices. Chris Reid at Boeing has tested a lot of exoskeletons, but I can’t take his data and compare it to the data from Ford. I don’t know what tools and metrics they used. That’s why we need a repeatable standard method, so any lab can use that test method and everyone can trust the results. This will lead to a standards-based certification process, which helps manufacturers show the basic performance and safety of their systems and spares end users the burden of assessing a system beyond company-specific applicability.

E: So, the market is kind of regulating itself right now?

W: Well, it’s like any nascent market. The only place that you have regulations right now is the medical exoskeleton market because the FDA in the U.S. regulates all medical products and there are a number of medical exoskeletons certified by the FDA that are used mainly in clinics. But it’s different than what you see on an automotive line in that usually the operator of the exoskeleton isn’t the person wearing the device; it’s the nurse or therapist. Think about someone learning how to walk again after a stroke. With exoskeletons, you can give patients “higher doses” of walking in a session with a therapist, speeding up the recovery process.

E: Would the ASTM work directly with the regulatory bodies in different industries?

W: We’re hoping the standards that we develop through F48 will be referenced by regulatory bodies, even the FDA. There may not be any regulation in the industrial market.

Here is a link to the NIOSH Center for Occupational Robotics Research. They look at exoskeletons also, and their research would feed into any industrial-focused standards or regulations on exoskeletons.

E: One of my “pet areas of research” is women in the workforce. Do exoskeletons have the potential to enable more women to work in industrial sectors and is there any testing being done on the female body, which is very different from the male body (height, breasts/hips, even spinal cords)?

W: In the testing we’re doing right now at NIST, we’re using men and women. But we don’t see any exoskeletons out there that can make someone stronger than they are right now. If the job requirement is to lift 100 pounds and you can only lift 25 pounds, the devices I am familiar with won’t lift the weight for you. An exoskeleton would help the person to lift the weight more safely and with more repetition. Some may advertise about giving you additional weightlifting capability, but as far as the testing I have seen, there isn’t anything that can augment your strength like that. But that’s not really the issue. The real issue is fit. We’ve been dealing with this issue for a long time, especially in law enforcement. Body armor was developed for a male physique and slightly modified for females and it doesn’t work very well. We’ve been working for years trying to fix that. I hope the exoskeleton community designs for females from the beginning; we’re not going to design a male-fitting exoskeleton and then slightly modify it for females. There will be exoskeletons that fit better for men and ones designed for the female body and even ones that can be easily modified for any wearer.

E: What do you hope to accomplish in 2020 and when do you think exoskeletons will become standard in industrial environments?

W: I think exoskeletons are well on their way to becoming common in the workplace. Seeing how the technology is rolling out in manufacturing, aerospace, automotive, etc., I think they will be even more common in 2020. I’m sort of biased, but I want to see more standards so that everyone can have an increased sense of reliability and safety with these exoskeletons. Standards will also help stimulate the market.



Dr. William Billotte currently serves as a physical scientist at the National Institute of Standards and Technology (NIST).  In that position, he helps industrial, military, medical, and public safety communities with their national and homeland security standards and technology needs.  Current activities include serving as a principal scientific advisor to Army for exoskeleton standards and technology issues and serving as the vice chairman of the ASTM F48 Exoskeleton and Exosuit committee.  Prior to joining NIST, Dr. Billotte was a CBRNE scientist for the Naval Information Warfare Systems Command (NAVWAR).  For NAVWAR, he managed programs to test, evaluate, acquire and share information on CBRNE detection and responder technologies. This included supporting the National Geospatial-Intelligence Agency’s New Campus East construction, the FEMA CEDAP (Commercial Equipment Direct Assistance Program), the FEMA Responder Knowledge Base (RKB) and the DHS SAVER program.  Prior to joining NAVWAR, Dr. Billotte served as a bioscience advisor for Booz Allen Hamilton where he supported DoD, DARPA, the Intelligence Community, and DHS in the biotechnology, chemical/biological defense and responder technology areas.

Dr. Billotte holds a Ph.D. in Biology from the University of Dayton, a Master of Science in Engineering from Wright State University, and a Bachelor of Mechanical Engineering from The Georgia Institute of Technology.


*Image source: NBC News

Workplace implantables–Yes, we’re going there.

Are workplace implantables a future reality? Implantables are sometimes mentioned as a category of wearable technology, but is a future in which technology becomes more integrated with our biology, in which we voluntarily have technology embedded beneath our skin at work, possible? Some think widespread human microchipping is inevitable; others believe it would mark the end of personal freedom, and still others refer to it as “the eugenics of our time.” If it does happen, Europe will already be ahead of the game.

Today, more than 4,000 people in Sweden have consented to having microchips injected under their skin, and millions more are expected to follow suit as the country trends towards going cashless. In addition to enabling Swedes to pay for things with the swipe of a hand, the technology can be used to ride the train (Sweden’s national railways are entirely biochip-friendly), unlock doors, monitor one’s health, and even enter many Nordic Wellness gyms. At the forefront of the microchip movement are two European firms: BioTeq in the United Kingdom and the Swedish Biohax International founded in 2013. Both firms make a pretty basic chip that’s inserted into the flesh between the thumb and forefinger. The chips don’t contain batteries or tiny advanced computers; they’re powered only when an RFID reader pings them for data.

So, what exactly are microchip implants? They’re mainly passive, read-only devices with a small amount of stored information that communicate with readers over a magnetic field using RFID (radio frequency identification). This is the same technology used to track pets and packages, and you probably carry it in your pocket—most mobile phones and credit cards today are equipped with RFID and U.S. passports have been embedded with RFID chips to deter fraud and improve security since 2007. A simple microchip implant, about the size of a grain of rice, might store an ID code that’s processed by a reader to permit or deny access to a secured area.
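To make that model concrete, here is a minimal sketch (in Python, with invented chip IDs and function names, not any vendor’s actual API) of the reader-side logic. The passive implant only surrenders a short ID when pinged; all the intelligence, including the decision to permit or deny access, lives in the reader and the system behind it:

```python
# Hypothetical reader-side access check for a passive RFID implant.
# The chip stores only a short ID; the reader looks it up and decides.

AUTHORIZED_IDS = {"04A3B2C1", "04F9E8D7"}  # IDs enrolled by the building admin

def handle_scan(chip_id: str) -> str:
    """Decide whether to unlock the door for a scanned chip ID."""
    if chip_id in AUTHORIZED_IDS:
        return "unlock"
    return "deny"

print(handle_scan("04A3B2C1"))  # an enrolled ID opens the door
print(handle_scan("DEADBEEF"))  # an unknown ID is refused
```

Note that in this design, revoking access means deleting an ID from the system, not touching the implant itself, which is part of why the chips can stay so simple.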

Chip implants have survived years of science fiction but they’re not brand new. The first implantable RFID chips for humans were patented in the late 90s. Technological advancements have led to the miniaturization of both monitoring devices and power sources, but so far implantables have only been widely discussed in terms of medical applications. Devices like pacemakers, insulin pumps, etc. are well-known, and doctors are exploring connected implantables capable of capturing vital health data from patients and in some cases administering (drug) treatment. This is changing, especially now that Elon Musk has entered the picture with his plan to implant microchips into human brains!

Much of the fear surrounding human chip implants arises from misinformation, pop culture, and paranoia. The biological risks are no worse than those of body piercings and tattoos. In addition, the chips are compatible with MRI machines, undetectable by airport metal detectors, and not difficult to remove. People have been augmenting their bodies since ancient times and wearing pacemakers for decades now. It’s not a huge leap from having this technology on our bodies via phones and contactless cards to putting it under our skin for easier access and greater convenience. Security and privacy concerns are natural. You hear the words “microchip implant” and visions of a dystopian future in which all your movements are traced and bodies can be hacked immediately come to mind. Though such concerns will likely grow as microchips become more sophisticated, today’s smartphones send far more information about you to Google, Apple, and Facebook than current microchips can. Your browser history is a greater threat to your privacy, I assure you.

That’s not to say that microchip implants are 100% secure (at least one researcher has shown they’re vulnerable to malware) or that there aren’t ethical implications and risks we won’t be able to foresee. Security concerns include eavesdropping, disabling and unauthorized tag modification, not to mention employee rights and religious concerns. Though the chips don’t store much information or have their own power source, it would be possible to use the data to learn about a person’s behavior. Depending on what the implants are used for (and if they have GPS tracking), employers could see how often you show up to work, the number (and length) of your breaks, what you buy, etc. On the upside, it’s not possible to lose a microchip implant like you might another form of ID; but on the downside, you can’t claim that the data didn’t come from you. Thankfully, a number of U.S. states have already introduced laws to prevent employers from forcing staff to be chipped.
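To see how little data it takes to profile someone, here is a hypothetical sketch (the chip IDs, reader names, and log format are all invented for illustration) of how an employer could tally break-room visits from nothing more than routine door-scan events:

```python
from datetime import datetime

# Hypothetical scan log: (chip_id, reader, timestamp) events a door system records.
events = [
    ("04A3B2C1", "entrance",   "2019-07-22 08:58"),
    ("04A3B2C1", "break_room", "2019-07-22 10:15"),
    ("04A3B2C1", "break_room", "2019-07-22 14:40"),
    ("04A3B2C1", "entrance",   "2019-07-23 09:21"),
]

def daily_break_counts(log, chip_id):
    """Count break-room visits per day for one chip, from plain access events."""
    counts = {}
    for cid, reader, ts in log:
        if cid == chip_id and reader == "break_room":
            day = datetime.strptime(ts, "%Y-%m-%d %H:%M").date().isoformat()
            counts[day] = counts.get(day, 0) + 1
    return counts
```

A few lines of aggregation turn innocuous unlock events into a behavioral record, which is exactly why the data protection questions matter more than the chip hardware itself.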

A brief, recent history of microchip implants in the workplace

A number of human microchip experiments and pilot projects have received media coverage in recent years. In 2015, for example, digital startup workspace Epicenter began making Biohax chip implants available to employees in Stockholm. The main benefit seems to be convenience: In addition to unlocking doors, the chips allow Epicenter workers to share digital business cards, buy vending machine snacks, and operate printers with a wave of the hand. Outside the company, the implants can be used at any business with NFC readers, which are becoming more and more common in Sweden.

The first American company to try Biohax’s technology was Three Square Market (32M). At a “chip party” hosted by the Wisconsin company in 2018, over 50 employees volunteered to be implanted. 32M has vending kiosks in over 2,000 break rooms and other locations around the world. Ultimately, the company sees the technology as a future payment and identification option in its markets; and it could enable self-service at convenience stores and fitness centers. Today, the tech firm is using the microchips as a perk for employees—a multipurpose key, ID and credit card allowing them to open doors, buy snacks, log into devices, use office equipment, and store health information. Apparently, the company’s also working on a more advanced microchip that would be powered by body heat, equipped with voice recognition, and more.

According to its founder, Biohax is in talks with legal and financial companies that want to use its technology and has been approached by investors from all over the world, while some financial and engineering firms have reportedly had BioTeq’s chips implanted in staff. There are also isolated cases of tech enthusiasts and self-professed biohackers who have adopted chip implants for convenience or just to embrace new tech. The appeal of RFID and NFC implants comes down to convenience and minimal risk of loss. While the most popular application seems to be replacing physical keys, access cards and passwords for easy entry and increased security, other uses include identification and payment. Chips can also be programmed to suit a business’ unique needs:

Unlock your smartphone, start your car, arrive at your office building and enter the parking garage, pay for your morning coffee, log into the computer at your desk, use the copy machine, share your business card with a potential partner or customer, store your certifications and qualifications, access a high-security area, turn on a forklift, even store emergency health information: all seamlessly, without friction, by having one tiny device implanted between your thumb and index finger.

Would you volunteer for that level of convenience, for an easier and more secure way of opening doors and logging into devices?

Are microchip implants the future, another node in the connected workplace that happens to be beneath the skin? The number of people experimenting with the technology is growing. (You can buy a self-injection RFID chip kit online from Dangerous Things. Warning: It’s not government-approved.) Artist Anthony Antonellis implanted a chip in his hand to store and transfer artworks to his smartphone; and Grindhouse Wetware, a Pittsburgh-based open-source biohacking startup, was at one point pursuing powered implants, or “subdermal devices in the body for non-medical purposes.” (Think about a body temperature monitor that controls a Bluetooth thermostat.) And then there’s Elon Musk: Musk co-founded Neuralink in 2016 to create a brain-computer interface. This week, he announced plans to use implanted microchips to connect the human brain to computers. Neuralink sees its technology being used to cure medical conditions like Parkinson’s, to let an amputee control a prosthetic, or to help someone hear, speak or see again. Having already tested its technology on monkeys, Neuralink hopes to begin human testing by 2020. Musk, however, sees a high-bandwidth brain interface as a way for humans to merge with Artificial Intelligence (or be left behind).

So, to chip or not to chip?

For enterprises who do want to experiment or ultimately adopt, here are some suggested precautions:

  • Make it optional: Implants should not be a part of any human resources policy or employment contract. Getting chipped should be a choice, with the option to remove the chip and destroy its data history at any time.
  • Make sure it really feels optional: Ensure there is no pressure to adopt and that those who decline a chip implant don’t experience any disadvantage. Offer the same functionality in another form, such as the wearable wristband option 32M has provided.
  • Make sure none of the information stored or collected is more than could be found on a smartphone.
  • Focus on controlled environments: Ex. an employee cafeteria. This makes everyday transactions in the workplace easier while reducing the chip’s usefulness to a hacker.
  • Use a second security factor: Ex. Combine a cryptographic proof with a biometric option like a fingerprint or retinal scan. Add another layer of security with a Personal Identification Number (PIN) or facial recognition.
  • If the technology ever becomes standard or even required in enterprise, there need to be appropriate exemptions for religious, moral and other beliefs, individual health issues, etc.
  • Keep data protection laws in mind. Consider any information that might be collected or inferred from the data such as access info, patterns of use, etc.
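As a rough illustration of the second-factor suggestion above, here is a hypothetical sketch (the IDs, salts, and function names are invented, not a real product’s scheme) that pairs a chip ID with a salted PIN hash, so a cloned or stolen chip alone isn’t enough to get in:

```python
import hashlib

# Hypothetical two-factor check: the chip supplies an ID, the person supplies a PIN.
# Only a salted hash of each PIN is stored on the server, never the PIN itself.

SALTS = {"04A3B2C1": b"salt1"}
PIN_HASHES = {
    "04A3B2C1": hashlib.sha256(b"salt1" + b"4921").hexdigest(),
}

def verify(chip_id: str, pin: str) -> bool:
    """Grant access only if both the chip ID and its PIN match."""
    if chip_id not in PIN_HASHES:
        return False
    digest = hashlib.sha256(SALTS[chip_id] + pin.encode()).hexdigest()
    return digest == PIN_HASHES[chip_id]
```

The same pattern works with a fingerprint template or facial match in place of the PIN; the point is simply that the implant proves possession, while the second factor proves the person.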

Microchip implants remain a cool experiment on both sides of the Atlantic, but there is no overwhelming need or demand for the technology in the workplace right now. That doesn’t mean implantable technology won’t become socially accepted or shake up a few industries (and the human brain) in the future.


Photo credit: https://www.paymentssource.com/news/chip-and-skin-implantable-rfid-gives-payments-its-matrix-moment

Wearables in Risk Management: Interview with AIG’s Ron Bellows

I got to sit down and talk with Ron Bellows, Risk Strategist at AIG. What resulted is a fascinating, but long (it’s worth it), read and a wealth of information. Ron will be speaking at EWTS 2019.

E: To begin, could you provide our readers with a little background on yourself and what you did at AIG? Also, when did you first encounter wearable technology?

R: I’ve been a risk management consultant with multiple insurance companies since 1980. I started playing with wearables probably as early as 1985/86. You may remember Cybermatics: Surface EMG measuring technology was connected to cyber gym equipment for rehab and prevention, so that when someone was working out – in the sports or medical world – you could actually see what the muscles were doing with the surface electromyography. It’s kind of like an EKG. Your heart is a muscle; surface EMG looks at other muscles.

Around ’86, I was working with a physical therapist doing studies on sports medicine and rehabilitation that might have application in the industrial environment. Many workers’ compensation injuries are expensive strain and sprain injuries impacting the musculoskeletal system. Biosensors, from a rehab standpoint, help us manage treatment for someone who has had a musculoskeletal injury. It began in the sports world and medicine, and around 2000 it started to become more pervasive in the industrial environment.

If you think about an athlete who over-trains, the same thing can happen in the industrial world. Biosensors can measure posture, force, repetition, etc., and be used to look at someone in the workplace from a preventative standpoint as well as on a pre-hiring/screening basis (i.e., can you handle the physical requirements of the job?). If you took a traditional physical, you might pass, but could you go in and work on a factory floor or warehouse for 8-10 hours a day, 6 days a week? The first value of biosensors is to better evaluate somebody before they go on the job, to help assess their ability. The second value would be to evaluate somebody in the job to document the exposure they face due to fatigue, endurance, force, repetition and posture—the things that generally lead to ergonomic/biomechanical injuries. If you can detail that exposure prior to injury you can do a better job with prevention, training and hiring. However, if somebody does get hurt you can use those same biosensors to help assess exactly where and how badly they are injured, the best treatment options, and if they truly are ok to go back to work again. Those are the three main areas where wearables fit into the industrial arena and workers’ compensation (WC).

E: What exactly do you do at AIG?

R: I’ve consulted with large multinational customers to help them find solutions to their risk management issues. Often, they were most interested in workers’ comp risk because it tends to drive loss frequency and severity, impacts the workforce and absenteeism, and reduces efficiency and profitability. Workers tend to be 30-50% of a company’s operating expense, so if you can reduce injuries you can increase efficiency, profitability, etc. Today with the shortage of workers that we see, a lot of companies are working at a 20% absenteeism rate. Imagine what happens when you can’t find enough people to man the tasks in a factory. If you also have extensive injuries that put people out of work or on restrictive duty, it’s even more difficult to run the business. Making sure people can work safely and come back to the job every day is very important to risk managers. I also help risk managers with issues like fleet, liability, supply chain, business continuity, and disaster recovery—anything that keeps them up at night.

E: You just mentioned a bunch of pain points like the shortage of workers. What are the challenges and pain points for AIG’s clients that are driving interest in wearable technologies?

R: There are really two things: One is traditional safety, making sure we document exposure properly so that we can prevent injuries and do better training. It’s not just job hazard analysis but also the workers’ comp system itself, which is very difficult to manage as the venues are all different and every state has different rules. If we can document exposure, we can better manage an individual pre- and post-loss. Many times, we see that older, high-tenure workers are driving losses. We’re seeing the average age of workers going up, especially in manufacturing, warehousing, trucking, etc. where you have extensive injuries to the shoulder and back. Those injuries are the most difficult to diagnose, treat, and return to work. If you’re older and you get hurt, it may take you weeks to get back to where you were pre-loss. Our average workforce is in the 40- to 50-year range, so when they have an injury it’s impacted by comorbidity – hypertension, diabetes, obesity – making it more difficult for them to get back to pre-injury status.

Second, many companies today are looking at exoskeletons or other interventions to reduce exposure. When you put an intervention in place, you don't know for sure how much of an impact it's having on the individual, because everyone is different. With biosensors, we can measure the impact of different interventions and see which ones are having the greatest impact on the worker based on their exposure. For example, I would use different exoskeletons for the upper extremities versus the back versus the legs; it depends on what kind of difficulties workers are having in the workplace. If I have to do static standing all day on a conveyor line, exoskeletons may not be valuable, but the biosensors can tell me what's going on with the static stress on the lower extremities, which impacts the entire body. I can then look for alternatives such as automatic massage therapy, continuous stretching, or compression sleeves to improve endurance and reduce fatigue where exoskeletons don't work.

E: What kinds of wearable technologies are you looking at for risk mitigation? Have any solutions made it past the pilot phase to rollout?

R: There are a lot. The biosensor marketplace has exploded in the last several years. We can use biosensors like we’ve talked about from a musculoskeletal standpoint and that’s where most of the impact is seen. But you can also use biosensors to look at an environment: A construction worker going into a pit that may be lacking oxygen can use a biosensor attached to an iPhone that sends a safety signal. You can use a posture monitor for the back like I did with the Visiting Nurse Association. Nurses visiting patients by themselves can be attacked, chased by dogs, fall down stairs, etc. Having an inclinometer or GPS monitor can send an automatic ‘man down’ signal if they’re horizontal. If they can’t push a panic button, their supervisor and local authorities can be alerted to the fact that something is wrong. That’s just one example. Biosensors in chemical plants can look at oxygen-deficient environments and exposure to chemicals and send an alert right away to the individual or supervisor. So, if you’re working remotely in a plant and there’s an ammonia tank with a small leak, the biosensor can alert you to very low levels before you’re overcome. There are so many different ways to use biosensors to alert you to exposure before it creates injury.
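The 'man down' pattern Ron describes (an inclinometer reading near horizontal, held long enough to rule out a false alarm, triggering an alert) can be sketched in a few lines. The tilt threshold, grace period, and class name below are illustrative assumptions, not any vendor's actual logic.

```python
# A minimal sketch of 'man down' detection, assuming the wearable reports
# tilt from vertical in degrees. All thresholds here are made up for
# illustration; a real device would be tuned per job and form factor.

HORIZONTAL_DEG = 70.0   # assumed tilt that counts as "down"
GRACE_SECONDS = 30.0    # time horizontal before alerting, to avoid false alarms

class ManDownMonitor:
    def __init__(self):
        self.down_since = None  # timestamp when the wearer went horizontal

    def update(self, tilt_deg: float, now: float) -> bool:
        """Feed one sensor sample; return True when an alert should fire."""
        if tilt_deg >= HORIZONTAL_DEG:
            if self.down_since is None:
                self.down_since = now
            return (now - self.down_since) >= GRACE_SECONDS
        self.down_since = None  # wearer is upright again; reset the timer
        return False

monitor = ManDownMonitor()
monitor.update(80.0, 0.0)    # just went horizontal: no alert yet
monitor.update(85.0, 31.0)   # still horizontal past the grace period: alert
```

In a real deployment the alert would be paired with the wearer's last GPS fix and a panic-button override, along the lines Ron describes for the visiting nurses.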

E: In most cases are you looking for over-the-counter, out-of-the-box solutions or bespoke devices? Where is the software being made?

R: I review what's in the market and what's in development. I try to stay abreast of what's available so that I can help clients make the best and most informed decisions about how to reduce exposure. There are always several intervention options that could have an impact, so I usually demo and then pilot test the options that fit the particular exposures as well as the organization's structure and culture. So, I'm always looking to kick the tires on everything in the market.

E: I imagine biosensors come in every form factor at this point. Is it one sensor per device or are you testing multiple metrics?

R: Let’s take posture monitoring as an example, which is huge in workers’ comp because 30-50% of a company’s losses are from strains. Everyone wants to work on musculoskeletal disorders, which also happen to be the most expensive loss type. Inclinometers, which measure posture, are great because force, repetition and posture are the lead drivers of strain and sprain injuries. You can do heavier work in the power zone between your shoulders and hips, but outside of neutral posture a task becomes more egregious to your body.

Many companies are doing posture monitoring; some are focusing on the upper extremities, some on the low back. Several biosensor companies have produced very good software programs to go along with the inclinometers, showing not only when someone is out of neutral posture but also how many times a day that person is out of neutral posture, for how long, at which tasks, or what times of day, etc. Some biosensors give an automatic response to the employee (like a buzz). That can be good or bad. If I can’t change my posture because the task is set up so that I have to bend a certain way, the buzzer is going to be continuous and become really annoying. That’s where I would take the data to management and operations and say: Here’s Joe and Mike doing the same job but Mike can’t handle the postures. Why? Because he’s a little older and can’t bend as well at the knees. So, posture monitoring without the dashboard is not as effective. The better the dashboard, the better data we have and the more opportunity we have to provide valuable interventions to the physical task.
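The dashboard metrics Ron lists (how many times a day someone is out of neutral posture, and for how long, per task) amount to a simple aggregation over inclinometer samples. A minimal sketch follows; the 20-degree neutral zone, the sample format, and the task name are illustrative assumptions.

```python
# Hedged sketch of the posture dashboard described above: count out-of-neutral
# excursions and total time out of neutral, grouped by task. The neutral-zone
# boundary and data shapes are assumptions, not any vendor's actual model.
from dataclasses import dataclass
from collections import defaultdict

NEUTRAL_LIMIT_DEG = 20.0  # assumed boundary of neutral trunk posture

@dataclass
class Sample:
    task: str
    seconds: float          # duration this sample represents
    trunk_angle_deg: float  # inclinometer reading; 0 = upright

def posture_summary(samples):
    """Per task: [excursion count, total seconds out of neutral]."""
    summary = defaultdict(lambda: [0, 0.0])
    prev_out = {}  # was the previous sample for this task out of neutral?
    for s in samples:
        out = abs(s.trunk_angle_deg) > NEUTRAL_LIMIT_DEG
        if out:
            if not prev_out.get(s.task, False):
                summary[s.task][0] += 1   # a new excursion starts
            summary[s.task][1] += s.seconds
        prev_out[s.task] = out
    return dict(summary)

samples = [
    Sample("palletizing", 1.0, 5), Sample("palletizing", 1.0, 35),
    Sample("palletizing", 1.0, 40), Sample("palletizing", 1.0, 10),
    Sample("palletizing", 1.0, 30),
]
# Two excursions, three seconds out of neutral in total for this task.
```

This per-task rollup is exactly the kind of data Ron says he takes to management: not one buzz per bad bend, but which tasks and which workers are repeatedly out of neutral.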

E: Can the intervention involve changing the way the task is done?

R: Yes. In fact, we can even link biosensors through a software program to a camera, so that as a person moves, we can see both the physical video and an overlay of what’s going on with his or her posture and force over time. While seeing the person do their task in space and time, we’re capturing their force and posture. That becomes really powerful. We can do that over time, creating a dashboard for different tasks, and then give an employer a prioritized list of the most egregious tasks, where high force and high repetition are most likely to generate a musculoskeletal disorder. So, biosensors with a dashboard and video overlay are very powerful in exposure documentation.

E: Can you talk about some of your recent biometrics and exoskeleton projects?

R: Well, anybody familiar with meat processing knows that it’s a very high endurance, high repetition task impacting the upper extremities and back. It’s static stress on the legs, leaning, twisting and bending at the waist, and moving your arms to process meat. Every part of the body is impacted; the repetition is so intense that you’re moving a carcass every 2 seconds. You’re constantly moving, standing in one place doing the same motion over and over, and you’re usually working a 10-hour shift, 6 days a week. Operations, safety and HR know it’s a difficult task but to change the process is very expensive. You might have to move the conveyor circling the entire plant or slow it down, which operations won’t like. Or, you’re going to have to build adjustable stanchions for people to stand up on. Oftentimes in fixed manufacturing plants, it’s very difficult to change the physical process, so we look at other interventions. The biosensors give us data on where the most difficult task/positions are and where management can spend their nickels to make the best impact. You can give them engineering solutions but if they don’t have the money for re-engineering there are alternative solutions like endurance and fatigue management or job rotation, or even just ongoing stretching throughout the day. You mitigate the exposure if you can’t eliminate it. We look for engineering solutions first, but older plants especially have a hard time putting those automation or engineering changes in place.

E: How are you measuring the ROI of the different solutions you’re implementing? What are the KPIs you’re looking for?

R: Primarily, I look at two things when it’s workers’ comp-related. The first is loss frequency rate: the number of injury accidents per hundred employees (for example, how many strains and sprains we have per task before and after a solution is implemented). The second is average cost of claim: how does that cost change after the solution is implemented? We try to reduce both frequency and severity of loss.
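Both KPIs are simple ratios, so a worked example makes the before/after comparison concrete. The figures below (claim counts, headcount, claim costs) are made-up illustrations, not AIG data.

```python
# Hedged sketch of the two workers' comp KPIs described above, computed
# before and after an intervention. All sample figures are invented.

def loss_frequency_rate(injury_count: int, employee_count: int) -> float:
    """Injury accidents per 100 employees."""
    return 100 * injury_count / employee_count

def average_cost_of_claim(total_claim_cost: float, claim_count: int) -> float:
    """Average dollars per claim; zero if there were no claims."""
    return total_claim_cost / claim_count if claim_count else 0.0

# Before the intervention: 18 strain/sprain claims among 400 employees.
before_rate = loss_frequency_rate(18, 400)   # 4.5 per 100 employees
after_rate = loss_frequency_rate(11, 400)    # 2.75 per 100 employees

before_cost = average_cost_of_claim(540_000, 18)  # 30,000.0 per claim
after_cost = average_cost_of_claim(275_000, 11)   # 25,000.0 per claim
```

Tracking both numbers matters because, as Ron notes, an intervention can cut frequency without touching severity, or vice versa.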

Here’s a good example: One 24-hour plant of 400 employees had 50 visits to the nurse every day looking for splints, gauze wraps, and other assistance. You know that the more times people are going to the nurse, the greater the likelihood you’ll eventually have a claim. We implemented endurance/fatigue solutions and then looked at the number of individuals visiting the nurse, and in some tasks the number dropped by 80%. That’s telling, because it takes a while for the claims numbers to mature enough to show statistically significant results. If I see a change over a short period, is it just that introducing the solution made everyone more aware? Eighteen months is about where you have to be to really see a material change in losses. So, we look at other metrics like online perception and symptom surveys. I’ve used therapy to reduce endurance and fatigue injuries, and after each session we give a quick survey asking how the person felt before and after the fatigue management program. We can then see if we’re going down the right road and match up the results to the loss analysis in the future.

E: RFID devices, body-worn (biometric tracking) wearables, and exoskeletons—which category is most mature and deployable today?

R: Posture monitors. The inclinometers and GPS are the most robust and have some of the best software. RFID is good, but you have to understand what the exposure is and what end result you’re trying to get to. RFID chips are really good in environments like construction, where it’s noisy, dark and dusty and vision and hearing are impaired. RFID chips give me another sense to avoid exposure. They can also be used for equipment, or where people are working very remotely, to see where somebody is working in a plant and where they’ve been. But posture monitors are probably the most robust in terms of software because, again, everyone’s trying to mitigate the strain and sprain injuries. Industrial hygiene (IH) exposure doesn’t have the same frequency of loss as strains and sprains and has been controlled very well over the last 20 years; it’s gotten a lot of attention and there are so many good programs in place.

E: Is ergonomics slightly newer?

R: Ergonomics has been developing since the mid-80s, but it’s interesting that we haven’t found a silver bullet solution, so we’ve done a lot of training. Office ergonomics got a lot of attention. ‘Ergonomic’ became a buzz word and a marketing ploy, and now a lot of equipment is considered ‘ergonomic.’ For example, you can buy a snow shovel that’s “ergonomic”, but the actual exposure to the individual hasn’t really changed. Carpal tunnel syndrome was huge in the late 90s and early 2000s, then the Mayo Clinic and other studies said that the aging workforce is driving CTS more than typing. Today in the workers’ comp arena, an individual’s physical condition can be as much a factor in injury development as the workplace exposure. The comorbidity or illness can make a simple injury so much more difficult to diagnose and treat and this is why wellness and ergonomics need to be considered together. Wearables are helping us communicate exposure to the operations managers who usually hold the intervention purse strings. Ergonomists haven’t done a great job of this in the past, but the biosensors give us data on an individual or task basis that is very telling for operations, human resources and safety teams.

E: How have employees received wearables? What has been the feedback? Are there privacy concerns and how are you dealing with that?

R: A lot of the biosensors are applied directly to the skin, and managers are very skeptical or skittish about that. So, in looking at which wearable is going to work for a company, you have to consider the unions, female workers, people who don’t speak English, etc. You have to think about having an interpreter, or whether someone will have an allergy to the spray or adhesive used to attach the biosensor… What if you have a lot of hair on your back? Part of my focus is always communicating these considerations to the risk manager: given the exposure model they face and the loss profile they have, which tasks are driving the losses, what’s the exposure model for the people doing those tasks, and what are the right biosensors to fit their organization’s culture?

E: Are they more receptive if the sensor is in a bracelet?

R: You get better, deeper data, especially from a force standpoint, if you can attach something to the skin. If you can’t, you have to use a halter monitor around the chest or a belt-worn device, something on the biceps if the upper extremities are the issue, a bracelet on the arm, etc. That’s why it’s important to know the loss profile and exposure model for the risk before adopting a wearable product: what tasks are driving loss and what options the company is willing to consider for solutions.

E: What is your hope for the future as for how wearables are developing? What’s a big development you’d like to see?

R: Right now, biosensors are really good at looking at exposure, giving us a dashboard and helping us come up with solution options. Of course, you need to know what’s available and understand the organization’s culture, but we’re not using biosensors to their full effectiveness in the hiring and screening process or in post-loss injury management. In workers’ comp, early objective medical diagnosis is critical to managing loss costs, especially with strain and sprain injuries, and biosensors can be a substantial benefit in that area, including developing telemedicine programs. We’re also not always closing the loop between risk management, safety, HR and operations in terms of exposure understanding and the implementation of interventions. Consider how many workplace tasks are developed with the idea that there will be one, two or three injuries per year in that task. The answer is none, but we accept those types of metrics as part of the cost of production. We’re collecting good loss and exposure data but not integrating safety intervention into the process the way we do with quality. Biosensors give me the detailed exposure information I need to express business and human cost and help qualify the rationale for the interventions needed to reduce exposure. If I can provide detailed documentation of exposure, I can communicate better to risk management so they can do a better job of funding exposure reduction solutions and provide the insight for stronger diagnosis, treatment and return-to-work practices if a loss occurs. You’d be amazed how many loss profiles show repeat injuries, which get exponentially more expensive. Biosensors can therefore have a significant impact in all three areas of the WC exposure control model: hiring, screening and deployment; prevention and training; and post-loss injury management.


Photo credit: Lara de Oliveira Corselet dB via photopin (license)

Ron will present a case study at the upcoming 6th Enterprise Wearable Technology Summit on September 19.

2019: The Year of the Big Pivot Towards Enterprise AR/VR

It’s a shame that AR/VR was overhyped in 2018 because in 2019 the technology is a fixture in enterprise.
I’ll be blunt: Augmented, mixed and virtual reality were overhyped in 2018. While 2018 turned out not to be the year of AR/VR, please don’t roll your eyes when I tell you that 2019 is, at least for the enterprise, and of that I have no doubt.
Here are a few of the signs:

  • More than half of the announcements made at AWE USA 2019 (a staple on the AR/VR calendar) were enterprise-related
  • Some of the world’s biggest consumer tech companies are now entering the immersive tech space, primarily eyeing enterprise
  • The top names in consumer VR are also heavily courting the enterprise

Why? Why are AR/VR hardware and software companies pivoting to enterprise? The answer is obvious: Because enterprise is where the money is. Both AR/VR technology providers and the world’s best-known companies (end users) are making/saving big.

If you follow enterprise AR/VR, you’re no doubt familiar with Google (Glass), Microsoft, and PTC (Vuforia). Other longtime players include Atheer, Epson, HPE, LogistiVIEW, ScopeAR, RealWear, ThirdEye, Ubimax, Upskill and Vuzix. Qualcomm, Honeywell, and Toshiba (dynabook) have become fixtures on the scene, as well, and by that I mean regular exhibitors at EWTS, the only event dedicated to enterprise use of immersive and wearable technologies. Newer sponsors include Jujotech, Pico and RE’FLEKT, along with Bose, HTC and Lenovo, joining top enterprise wearable device and industrial exoskeleton makers on the EWTS roster.

Doesn’t Bose make headphones?
Yes, they do. Bose is known as a consumer audio vendor, but it also makes Bose Frames, which provided exclusive audio content and set-time notifications to desert-goers at this year’s Coachella music festival. Founded in 1964, Bose is now taking an audio-first approach to augmented reality with Bose AR, not only at concerts or in automobiles but in meeting rooms, too. Audio AR is a natural fit in the Industrial Internet of Things.

The pivot
In April 2019, Oculus introduced the expanded Oculus for Business, an enterprise solution designed to streamline and grow VR in the workplace. The expanded solution adds Oculus Quest to the hardware lineup and provides a suite of tools to help companies reshape the way they do business with VR.
The following month, Lenovo launched an enterprise AR/VR headset, the ThinkReality A6, immediately positioned as a rival to Microsoft’s HoloLens. Articles spoke of Lenovo as “just the latest manufacturer to develop an AR device aimed at enterprise.” On the heels of Lenovo’s first foray into enterprise XR, HTC announced the HTC Vive Focus Plus, a new version of its standalone Vive Focus that will only be made available to enterprise customers. Furthermore, HTC’s Vive X accelerator has been “pouring money” into enterprise VR startups.

The proof is in the toolbox
The digital transformation isn’t here; it’s underway at hundreds of companies, including household names like Ford, UPS, and Walmart.
Every year, enterprises take the stage at EWTS to share how they’re using wearable and immersive technologies. They share their experiences and best practices, their successes and failures, and then they return the following year. These “veteran speakers” are another sign of AR/VR’s secure position in the present and future of work: AGCO, Boeing, DHL, Lockheed Martin, and Porsche come back year after year to update peers from Bayer, BP, Caterpillar, Coca-Cola, Johnson & Johnson, and other Fortune 500 companies on the latest applications for the technology in their operations. EWTS speakers span industries and sectors: Airbus, BMW, Chevron, Colgate-Palmolive, Con Edison, Duke Energy, General Electric, Gensler, jetBlue, John Deere, Lowe’s, Molson Coors, Southwest Airlines, Thyssenkrupp, Toyota, United Technologies, etc. And new faces join every year: this year’s event will welcome AIG, Amazon, American Airlines, Bridgestone, Exelon, Holiday Inn, Philip Morris, Sanofi, Six Flags, and more to the stage. It’s a cycle: Attendees become users who become speakers, and the technology continues to advance.

Beyond pilots
Lockheed Martin has been a longtime advocate of AR/VR, benefitting so much from mixed reality that it’s now teaming up with Microsoft to sell mixed reality apps to other businesses in the airline and aviation industry. Rollout is growing at BMW, too: The luxury auto manufacturer is providing all its U.S. dealerships (347 BMW centers and select MINI dealers) with Ubimax Frontline running on RealWear HMT-1 head-mounted devices. Shell is also deploying RealWear’s HMT-1Z1 through Honeywell in 12 countries and 24 operational sites. And last year, Walmart announced it was putting 17,000 VR headsets in its U.S. stores for employee training. These aren’t mere pilots. At AGCO, Boeing, and other large manufacturers, augmented reality is a standard workforce tool for a variety of tasks in multiple areas of operation. In the last three months alone, Fortune 500 companies in the news for using AR/VR included Audi (Volkswagen), ExxonMobil, Nissan, and even Farmers Insurance. Deloitte estimates that over 150 companies in multiple industries, including 52 of the Fortune 500, are testing or have deployed AR/VR solutions. The 6th annual EWTS is the proof.

Reality check
It helps that the tech is steadily improving, of course. This was the first year that I walked around AWE and was truly amazed by the quality of the immersive experiences I tried. So, here’s a reality check: AR/VR is having an impact across business and industry and it’s not going away. It’s not future tech; it’s now. And it’s not just AR/VR glasses and headsets but body-worn wearables as well, sometimes used in conjunction with VR, and in enough applications to warrant an entire day (the third day of EWTS 2019) devoted to below-the-neck and safety wearables. We’re talking biometric, environment and location tracking, employee ergonomics, partial and full-body exoskeletons. It’s all here today in the enterprise.


Image source: The Verge

Is Digital Transformation for Men? Female Factors in Wearable Tech Design

In 2015, NASA celebrated over 50 years of spacewalking. Four years later, in March 2019, the agency called off the first all-female spacewalk due to a shortage of smaller-sized spacesuits. The walk-back led to a Twitter storm, with women sharing hundreds of stories of their own ill-fitting work uniforms and oversized ‘standard’ gear; but “It’s not just spacesuits,” one woman tweeted: “It’s public spaces like bathrooms, cars, cockpits, office air conditioning, microwave installation heights, Oculus, military fatigues…an endless list.”

In December, I wrote about the phenomenon of patriarchal coding. A feeling that today’s VR headsets were not designed with women in mind set me on a trail of research that revealed I’m not alone in feeling this way and that the majority of the products and systems we use every day are designed by and for men. This phenomenon affects every aspect of women’s lives – it even endangers our lives – and it’s unintentional for the most part, which makes it all the more frustrating. Sexism is so ingrained in our society that women’s unique needs and biology (like the fact that we have breasts) are excluded from reality, even of the virtual kind.

My main point then was that wearable technologies – the body-worn sensors being integrated into organizations’ EHS efforts, exoskeletons taking a load off workers’ backs, and VR headsets being hailed as the future of job training – exhibit coded patriarchy and risk further alienating the female workforce. Wearables that are replacing or supplementing traditional PPE (personal protective equipment) cannot succumb to the same biased or negligent design as automobiles, office buildings, etc., for the future economy and growth of the workforce depend upon improving job prospects and working environments for women.

The history of man

Women and the female perspective are largely missing from human and world history (as is often the non-western point of view) and entirely absent in the fundamental research underlying the foundations of modern life, including economics and urban planning. The star of the show is “Reference Man,” a 154-pound Caucasian male aged 25 to 30, who has been taken to represent humanity as a whole when it comes to the design of everything from power tools to the height of a standard shelf. Take medicine: Though women process drugs differently, medications have historically been tested primarily on men. Cars: For decades, car safety testing has focused on the 50th percentile male. The most common crash-test dummy is taller and heavier than the average woman, with male muscle-mass proportions and a male spinal column. This is how “standard seating position” was determined. Women, however, sit further forward in the driver’s seat and thus are 47% more likely to be seriously injured in a car crash. In 2011, the US began using a female crash-test dummy, though not an anthropometrically correct one. Testing with a pregnant dummy? Forget it.

Beyond product ergonomics

It’s annoying that so many gadgets we use are one-size-fits-men, and it’s dangerous. The world is less safe for women because we haven’t been factored into the design of not only physical products but also the software behind everything. Consider navigation apps, which provide the quickest and shortest routes to a destination, but not the safest; or voice recognition and other AI tech, which is male-biased and also becoming indispensable to how we interact with our devices and how systems make major decisions affecting humanity. Google’s voice recognition software? 70% more likely to accurately recognize male speech. Apple’s Siri? When she launched, she could help a user having a heart attack but didn’t know what “I was raped” means. (Side note: the heart attack symptoms healthcare professionals are taught to identify are actually male symptoms.)

Last year, Amazon had to scrap an experimental recruiting tool that taught itself to prefer male candidates for software development and other technical jobs. How did this happen? Because the computer model was trained to observe patterns in resumes from the previous ten years, most of which were submitted by men since the tech world is notoriously, overwhelmingly male. What’s frightening is that in a 2017 survey by CareerBuilder, over half of U.S. HR managers said they would make artificial intelligence a regular part of HR operations within five years. That means women will have to combat unfair algorithms in addition to unconscious bias in order to advance in the workforce. IBM CEO Ginni Rometty says it’s up to businesses to prepare a new generation of workers for AI-driven changes to the workforce. In a world in which AI will impact – and perhaps determine – hiring for every existing job, the fact that women and minorities are disproportionately left out of the teams behind the AI revolution is tragic.

The data gap at the heart of the workplace 

Occupational research has traditionally focused on male workers in male-dominated industries. Few studies have been done on women’s bodies and job environments, so there is little occupational health and safety data for women. The uniforms in most professions are therefore designed for the average man’s body, and the reasons behind trends like the rising rate of breast cancer in industry remain unknown. Relying on data from studies done on men may explain why serious injuries in the workplace have gone down for men but are increasing among women workers. This despite the fact that, for the last three years, women have been entering the workforce at more than twice the rate of men. (You do the workers’ comp math, employers.)

When we talk about using wearables for EHS applications, oftentimes we’re speaking about body-worn sensors that can detect biometric and environmental data affecting a worker’s health and safety. The software behind these applications might send an alert to the worker or wearer when a reading reaches a certain threshold, but how is that threshold – the danger zone – determined? Say we’re tracking a worker’s exposure to a particular chemical. Women and men have different immune systems and hormones; women also tend to be smaller, have thinner skin, and have a higher percentage of body fat than men—differences that can influence how chemicals are absorbed in the body. Without female-specific data, the threshold at which a wearable device is set to alert the wearer would likely be higher than the toxin level to which a female worker can be safely exposed, putting women at greater risk of harmful exposure. The problem is two-fold: We don’t have data about exposure in “women’s work” and we’re clueless when it comes to women (increasingly) working in male-dominated industries. At this point, it would take a working generation of women to get any usable data on long-latency work-related diseases like cancer.
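The alert-threshold problem described above is easy to make concrete. Here is a minimal sketch, assuming a hypothetical solvent with invented limits (real occupational exposure limits come from bodies like OSHA and NIOSH); it shows how falling back to a single 'Reference Man' default produces exactly the gap the article describes.

```python
# Hedged sketch of wearable exposure alerting with population-specific
# thresholds. The chemical, limit values, and profile names are invented
# for illustration; they are not real occupational exposure limits.

EXPOSURE_LIMITS_PPM = {
    # hypothetical safe-exposure limits for an unnamed solvent
    "reference_man": 10.0,
    "female": 7.0,   # assumed lower limit (smaller body mass, thinner skin, etc.)
}

def should_alert(reading_ppm: float, worker_profile: str) -> bool:
    """Alert when a reading exceeds the limit for this worker's profile.

    Without female-specific data, the lookup falls back to the
    Reference Man value, which is the data gap described above."""
    limit = EXPOSURE_LIMITS_PPM.get(worker_profile,
                                    EXPOSURE_LIMITS_PPM["reference_man"])
    return reading_ppm > limit

# A reading of 8 ppm alerts under the female limit but not under the default:
# should_alert(8.0, "female") -> True
# should_alert(8.0, "unknown") -> False
```

The point of the sketch is the fallback line: a device configured only with Reference Man data would silently treat 8 ppm as safe for everyone.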

No PPE for you

Construction is one of those male-dominated industries in which standard equipment and PPE have been designed around the male body. Though there is little data on injuries to women in construction, a study of union carpenters did find that women have higher rates of wrist and forearm sprains, strains and nerve conditions than their male counterparts. To comply with legal requirements, many employers just buy smaller sizes for their female employees, but scaled-down PPE doesn’t account for the characteristics (chests, hips and thighs) of a woman’s body. Moreover, it doesn’t seem cost-effective for employers to meet the order minimum for those sizes when women make up less than 10% of the construction workforce. Giant overalls are one thing, but the straps on a safety harness not fitting around your body? How is a woman supposed to perform at the same level as a man if her clothing and equipment are a hindrance? If oversized gloves reduce her dexterity, a standard wrench is too large for her to grip tightly, or her overly long safety vest snags on a piece of equipment? Already a minority in the sector, women don’t usually complain about ill-fitting PPE. Instead, they make their own modifications (with duct tape, staples, etc.). And it’s not just women; dust masks and protective eyewear designed for the Reference Man also put many men of color at a disadvantage.

Of course, it doesn’t have to be this way. A standard-sized bag of cement could be made smaller and lighter so that a woman could easily lift it. Exoskeletons might be a solution, but so is going back to the drawing board: Jane Henry’s SeeHerWork, for example, is an inclusive clothing line for women in fields like construction and engineering, fields with lucrative, equal-pay careers and massive labor shortages—fields that need women.

Designing the workplace

Guess what? Men are the default for office infrastructure, too, from the A/C (women tend to freeze in the workplace, which hurts productivity) to the number of bathrooms and stalls (a single restroom with urinals serves more individuals). According to the Bureau of Labor Statistics, women represent nearly two-thirds of all reported cases of carpal tunnel syndrome, which indicates that workstations are less ergonomic for women. Open office plans are conducive to socializing and breaking down hierarchies, right? No, they actually encourage sexist behavior. A 2018 study documenting the experiences of women in an open office designed by men – lots of glass, identical desks, group spaces – found that the lack of privacy created an environment in which female workers were always watched and judged on their appearance. Designers today are beginning to use virtual reality to design factory layouts and workstations, even assembly processes, but that doesn’t mean they’re factoring in female anatomy or putting headsets on women workers to get their input.

I spoke with Janelle Haines, Human Factors Engineer at John Deere, who uses virtual reality to evaluate the ergonomics of assembly, about her experiences performing evaluations on women workers. Most of the people she gets to put in a VR headset are male; however, there are a few female employees available at times for evaluations. “Fitting the job to the worker hasn’t [always] been a focus. Even in the last fifteen years that I’ve been studying ergonomics, there has been a huge shift in learning to focus on ergonomics. It has become a kind of buzz word…There are some jobs that have been at John Deere for years and years, since we started building combines, that aren’t a great fit for women, but going forward with new designs we’re using VR to make sure the workstations and what we design do work for women.” Ergonomics aren’t a new area of study, but Janelle points out a promising shift in thinking and a deliberateness that’s necessary “going forward.”

The future of work: Uncomfortable = unproductive

Smartphones have become standard work tools in many jobs. Men can use the average smartphone one-handed; women cannot (smaller hands). This kind of oversight cannot be carried into the next wave of mobile: Wearable technology. That women have different muscle mass distribution and vertebrae spacing, lower bone density, shorter legs, smaller wrists, lower centers of mass, etc. matters when it comes to the design and application of wearable devices like partial and full exoskeletons, connected clothing and gear, augmented reality smart glasses, and virtual reality headsets. Early decisions in developing transformative technologies can create a weak foundation for the future of that tech.

Already women are at a disadvantage in VR. As far back as 2012, researchers found that men and women experience virtual reality differently, and a growing body of research indicates why. Motion parallax and shape-from-shading are two kinds of depth cue. Men tend to rely on motion parallax, or how objects move relative to you as you move, which is relatively easy to render in VR. Women tend to rely on shape-from-shading, so a shadow that’s ‘off’ can ruin the immersive experience for a woman. Because shape-from-shading is more difficult to emulate, most VR tech prioritizes motion parallax. Then there are the poor ergonomics of most VR headsets for women (too heavy, too loose, etc.). Why does this matter? Because VR is being hailed as the future of learning and job training; VR is going to be crucial for filling millions of vacant positions and for upskilling the workforce as automation advances. When one half of the population experiences the technology differently than the other half, that’s an unequalizer, especially when all indications point to people spending more time in VR in coming years.
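For readers curious what these two depth cues actually mean, here’s a minimal, illustrative Python sketch (not taken from any VR SDK; the function names and numbers are my own). Motion parallax is the angular shift of an object as your head translates, which falls off with distance; shape-from-shading is the brightness cue a renderer produces, classically modeled with Lambertian shading:

```python
import math

def parallax_shift(depth_m, head_move_m):
    """Angular shift (degrees) of a point at depth_m meters when the
    viewer translates laterally by head_move_m meters."""
    return math.degrees(math.atan2(head_move_m, depth_m))

def lambert_shading(normal, light):
    """Lambertian intensity: cosine of the angle between the surface
    normal and the light direction (both unit vectors), clamped at 0."""
    dot = sum(n * l for n, l in zip(normal, light))
    return max(0.0, dot)

# Motion parallax: a 5 cm head movement shifts a nearby object far more
# than a distant one, which the brain reads as depth.
near = parallax_shift(0.5, 0.05)   # object at 0.5 m, roughly 5.7 degrees
far = parallax_shift(5.0, 0.05)    # object at 5 m, roughly 0.57 degrees

# Shape-from-shading: a surface tilted away from the light renders dimmer,
# so an incorrect shading value reads as the wrong shape or depth.
facing = lambert_shading((0, 0, 1), (0, 0, 1))      # 1.0, fully lit
tilted = lambert_shading((0, 0.6, 0.8), (0, 0, 1))  # 0.8, dimmer
```

The asymmetry the researchers describe falls out of this: the parallax cue is just camera geometry, which every headset’s head tracking already computes, while a convincing shading cue requires physically plausible lights, materials, and shadows in every scene.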

Stop defaulting to men 

The long legacy of researchers overlooking women – not wanting to pay for double the testing – has looming implications at a time when we’re collecting data from more and more ‘things’ and powerful computers are making important decisions for us. It’s bigger than a spacesuit; we’re making decisions based upon biased, incomplete data, feeding that data into algorithms that can exacerbate gender and other inequalities, create risks among certain populations, and encode prejudices into the future. The answer? First, inject more diversity into the labs and back rooms where the future is being designed and engineered. Second, hire female designers and stop using men as a default for everything!



In writing this article, I drew heavily on the efforts and writings of a number of inspiring women, including Caroline Criado-Perez, author of “Invisible Women: Data Bias in a World Designed for Men”; Abby Ferri of the American Society of Safety Professionals; and Rachel Tatman, research fellow in linguistics at the University of Washington.


The Enterprise Wearable Technology Summit (EWTS) is an annual conference dedicated to the use of wearable technology for business and industrial applications. As the leading event for enterprise wearables, EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. The 6th annual EWTS will be held September 17-19, 2019 in Dallas, TX. More details, including agenda and speaker lineup, available on the conference website.

All the Enterprise News Out of AWE USA 2019

One of the major takeaways from the 10th annual AWE last week was that enterprise is where the AR/VR market is growing. It was clear that there are serious – and real – enterprise applications providing ROI today to both large and small companies. AWE USA 2019 also saw a number of launches and updates from enterprise AR/VR solution providers. Catch up on all the enterprise news below:


Atheer announced expanded support for gesture-based control and input on smart glasses. The enhanced gesture support – achieved with advanced machine learning tech – makes it possible to control more types of smart glasses beyond the limited group with dedicated depth sensors, and enhances other modes of interaction. Learn more


In addition to being on track to have over one million BoseAR-enabled devices in consumer hands by the end of the year, Bose – an unlikely enterprise player – is building an industrial BoseAR wearable for loud, distracting work environments. Learn more at EWTS 2019 Sept. 17-19 in Dallas, Texas, where Bose’s Ilissa Bruser is speaking. Bose will exhibit at EWTS.


Jujotech’s latest solution, Fusion AR with WorkLogic, provides connected workers on the job with quick access to IoT-enabled machine information and remote expert guidance. WorkLogic, an open API, works within Fusion AR to send digital work instructions and checklists to AR glasses/headsets, tablets and smartphones. Learn more


Lance-AR launched at AWE! The consulting and integration company specializes in AR enablement for the enterprise market. Its Enterprise AR Deployment Services are focused on enabling scaled enterprise deployments that deliver real, near-term value with the AR hardware and software available on the market today. Learn more


LogistiVIEW announced its partnership with Fetch Robotics, which combines the AR company’s Connected Worker Platform with Fetch Robotics’ autonomous robotics solutions. The combo enables robot-assisted processes to achieve a “complexity and scale rivaling traditional fixed automation.” It also costs less and is more flexible than traditional automation. Learn more


Logitech’s VR Ink Pilot Edition – still a 3D-printed prototype – is like an oversized stylus that lets you draw and design in virtual reality. You can trace designs in 3D space or sit at a table and draw on its surface. The harder you press on the button or tip of the stylus, the thicker the line. The tech offers more precision than a game controller and is more natural to use for creators and designers. Logitech says it’s close to a final design. Learn more


Qualcomm’s Snapdragon Smart Viewer reference design debuted last week. Built on the Snapdragon XR1 Platform, Smart Viewer is designed to help speed up product development for AR/VR headsets. It takes advantage of the XR1’s processing power to enhance the content AR/VR headsets can offer to consumers and enterprise, distributing the workload and tapping into the compute power of host devices. Additional features like eye tracking and six degrees of freedom (6DoF) controllers unlock even more immersion. Learn more


Munich-based RE’FLEKT announced that the REFLEKT ONE ecosystem now includes Siemens Teamcenter. Siemens customers and business units can easily source live data from the Siemens PLM system for content creation on the REFLEKT ONE platform. The connection should dramatically increase the speed and accuracy of AR/MR content creation. Learn more


Rokid provided a sneak peek at its next-generation mixed reality glasses called Rokid Vision, which are distinguishable from the Rokid Glass (now ready for mass production) thanks to a dual-screen display and 6DoF technology. The sleek design includes an RGB camera, two depth cameras, and a simultaneous localization and mapping (SLAM) module that offloads complex 6DoF calculations from the mobile CPU. Rokid Vision is tethered, requiring a connection to a USB-C device with DisplayPort support. Expect the Rokid Vision SDK to be released in the third quarter of 2019. Learn more

Scope AR

Scope AR made a few announcements at AWE, including a new customer (medical device company Becton Dickinson) and an expansion of its integrated AR platform at Lockheed Martin. The company also launched an upgraded version of its WorkLink platform, including session recording. This addition means users can capture and save live sessions between themselves and an expert (the live remote video support calls and AR annotations) for later reference—a great way to retain and pass on tribal knowledge. Learn more

ThirdEye Gen

The creator of the world’s smallest MR glasses (X2) announced a new Software Partner Generate Program intended to expand its developer community and provide exclusive partnership opportunities to individual developers as well as large AR/MR software companies. Learn more


Ubimax expanded its industry-proven Frontline platform to support Microsoft HoloLens. The integration of HoloLens 2 into Ubimax Frontline extends the benefits of Ubimax’s software into mixed reality environments, making it easy to enrich existing and new AR workflows with holographic 3D objects. Preview here


Varjo was certainly a crowd favorite at AWE, where the company announced and demoed its new industrial-grade headset. Varjo says the XR-1 Developer Edition delivers on its promise of making mixed reality indistinguishable from the real world: the video pass-through headset produces images with a resolution of more than 4K per eye, which Varjo claims makes the XR-1 the only device that can seamlessly blend the real and the virtual. Varjo will begin shipping XR-1, which connects via wire to a powerful PC, to developers, designers and researchers in the second half of 2019.

Varjo has also teamed up with Volvo, which uses its tech to test-drive virtual car designs on the road. Check out VentureBeat for more specs and examples of industrial applications for XR-1. In addition, hear from Volvo’s Amanda Clarida at EWTS 2019.


Wikitude now supports all leading wearable technologies, not only standalone devices like HoloLens but also a new spectrum of tethered smart glasses starting with the Epson Moverio BT-35E. This means users can engage with AR content wearing head-mounted devices connected to 5G smartphones. Learn more


Vuzix revealed that its M400 Smart Glasses are now available for purchase at a cost of $1,799 as part of an early adopters program. The device, however, won’t actually ship until September. With a larger memory profile, improved voice recognition/noise cancelling, a new touchpad, built-in GPS, an OLED display, and the Qualcomm Snapdragon XR1 at its core, the M400 promises improved interactivity, power consumption and thermal efficiency. Learn more

Catch Atheer, Bose, Lance-AR, LogistiVIEW, Qualcomm, RE’FLEKT, ThirdEye Gen, and other leading enterprise AR/VR solution providers at EWTS 2019.
