Using XR to See Underground: Interview with Arcadis’ Allison Yanites

Before EWTS 2019 went down last month, I had the chance to interview one of the event’s thought leaders. Check out our interview with Allison Yanites, Immersive Technology Lead at Arcadis, the global natural and built asset consulting firm.

Emily (BrainXchange): To begin, could you provide our readers with a little background on yourself and what you do at Arcadis? Also, when did you first encounter AR/VR?

Allison: I am the Immersive Technology Lead at Arcadis North America. I am currently working to find different ways that augmented reality, virtual reality, and other related technologies can improve customer experience, health and safety, and quality of life. Before this role at Arcadis, I worked as a geologist on environmental remediation projects: understanding subsurface conditions such as layers of soil and rock, whether any groundwater or soil contamination is present, and whether impacts are static or still moving below ground. A big piece of that work was creating 3D visualizations of subsurface data to help our clients and stakeholders better understand the full picture of what is happening below ground and help determine the next steps to clean up any contamination.

A few years ago, our team developed a mixed reality visualization of one of these environmental sites, where our stakeholders could see and interact with a holographic image of the groundwater contamination of the site. That was my first real experience with immersive technology as an industry application, and it was a game-changer for me. Working with our digital team at Arcadis, I wanted to look beyond just holographic visualizations of environmental models and see how much we can do with AR/VR across all of the types of programs Arcadis is involved with: how we can use immersive and 360 technology for design, engineering, project and management services across all markets.


E: So, you really start at the beginning of a project, with touring a site? 

A: It depends. On some projects, a lot of data has already been collected, such as sites that have been monitored for decades; on other projects we are collecting data in an area for the first time. Either way, we are taking a large collection of data and trying to understand the complex geological and chemical patterns underground, and ultimately, determine the best ways to remove chemical impacts at the site.


E: Can you speak a little more about Arcadis’ business and its customers (who they are)?

A: Arcadis is a natural and built asset consulting firm. We work in partnership with our clients to deliver sustainable outcomes throughout the lifecycle of their natural and built assets. We have 27,000 people in over 70 countries with expertise in design, consulting, engineering, project and management services, and we work with a wide range of markets and industries, including oil and gas, manufacturing, transportation, infrastructure and municipal water.

At Arcadis, our mission is to improve quality of life in the communities we serve. Whether that is by ensuring the environmental health of communities or reducing the amount of time people spend in traffic, we develop our solutions with our client’s end-users in mind. To design the most impactful solutions, Arcadis has committed to digitally transforming our business at every level of our organization. That includes training our staff on new digital capabilities, using cutting-edge technologies and then applying our subject matter expertise. We then use these tools and skills to better understand, and address, our client’s needs.


E: How is Arcadis using augmented and virtual reality? What pain points does AR/VR address?

A: Arcadis is using augmented and virtual reality in different ways across a variety of projects. Our immersive technology practice includes on-site visualization with different types of headsets, 360-degree photos, video and virtual tours, and remote assistance with AR capabilities. Generally, immersive technology is addressing four main pain points. The first is increased safety — for example, we can share access to difficult-to-reach sites with 360-degree imagery or livestream video, and bring additional staff or clients to the site virtually. Ultimately, we must keep people safe while still collecting as much data as possible. The second is speed of decision making — for example, using AR to overlay a 3D design over an active construction site helps quickly identify any differences between the plan and the current project status. The third is cost reduction — for example, we can now virtually connect project teams and clients to remote sites. This reduces travel and helps reduce the costs associated with delayed communication or unplanned rework. And the fourth is enhancing stakeholder communication and collaboration — for example, virtual 360-degree site tours and remote assistance are virtually bringing staff, clients, and stakeholders to the site, where they can participate in discussions about site conditions or questions on certain issues. AR/VR visualizations also greatly improve our communication of design plans or subsurface data visualization.


E: I imagine there are a lot of new demands for the built environment, especially with climate change. Do you think that AR/VR are unleashing more creativity, enabling designers to do things they’ve never done before?

A: Absolutely. There is a lot of power in using AR/VR to understand how the environment is changing, and how to prepare communities and businesses accordingly. AR and VR visualizations can communicate designs to stakeholders that address sustainability needs or flood and storm resilience. AR/VR technology also gives designers the flexibility to share their designs with stakeholders more clearly and effectively, with a greater level of detail, than ever before. When you use AR/VR to see first-hand how a flood level impacts homes and businesses, it takes on a greater urgency than it may have before. We are also using AR/VR technology for training situations, and many training scenarios are relevant to our changing environment and being prepared for the future.


E: How have customers received the technology? Was it easy for them to use? Have any clients adopted AR/VR for themselves?

A: We have had success applying immersive technology services, and it’s exciting to see this technology expand and scale in our industry. At the same time, we are continually working to apply the right technologies for the right projects, and find new ways to solve problems for clients. These technologies are a moving target; they evolve so quickly. It seems like every few weeks there is a new product, software/hardware capability, or integration that opens new opportunities for how AR/VR can be applied. In addition to gaining traction and adoption with the services and capabilities we have established, we are constantly evaluating how we can solve emerging client challenges with new and immersive technology.


E: What was piloting like? Was there an actual test period and were there any major challenges to using the technology at Arcadis?

A: Several years ago, we started with a few different pilots and tested different AR glasses, VR headsets, 360-degree cameras and various software programs to develop content. Each of the solutions or services that we have explored has been rigorously tested, and if appropriate is then developed internally or in partnership with our clients. We are still doing pilots because the space is evolving. With one particular workflow there might be an update in either the hardware or the software that offers a new opportunity, so we’ll go in and test that. The pilots are really tied to the problems we can solve and the solutions we can bring to our clients, working with them to customize what we do with these different tools.


E: Where does the content come from?

A: So far, we have developed everything on our own. We use plugins and software to create content, but the content is coming from our own project locations and 3D designs, like wastewater treatment plant designs, environmental remediation sites or highway infrastructure designs. We already work in those spaces so we have the data sets, which we can use to create the AR/VR visualizations. Through our FieldNow™ program, we have also committed to collecting data 100 percent digitally, which means we can now apply this technology to more projects than ever before.


E: How do you measure the ROI of AR/VR at Arcadis?

A: ROI varies from project to project, but it generally comes back to the four KPIs: increased safety, speed of decision making, cost reduction, and enhanced stakeholder communication and collaboration.


E: How has Arcadis handled the security part of adoption?

A: Arcadis takes data security very seriously. Our group works with our IT department to thoroughly vet each technology against industry security standards. Additionally, our use of each of these technologies is also typically evaluated by our clients to make sure it is compliant with their security protocols. Security is always a leading factor in any new technology we adopt.


E: Are there any applications you’re hoping to test in the future at Arcadis?

A: We are constantly evaluating what we can do to exceed our client’s changing expectations. As new applications and technologies become more accessible, we want to make sure we are equipped to address both traditional and emerging client challenges.

Beyond finding new ways to integrate software platforms, we are starting to leverage the internet of things and wearable technologies more frequently. As a large company that is involved in many different industries, Arcadis uses a lot of different software programs. For each software program (3D design visualization, data analytics, program management system, etc.), we develop unique workflows to create AR/VR and 360 visualizations and/or integrate with a program management system. We are always looking for new software products or software updates that make it easier to integrate AR/VR into our daily routines.


E: With sensors in the environment and wearables, I assume you’re gathering new kinds of information for these models?

A: Absolutely. We are using sensor data, which provides real-time results that can be fed into our data analytics platforms and visualized in different ways. We are also excited about platforms that can house data and be updated in a seamless way, so a whole project team across the globe has access to one central data set.


E: What are your greatest hopes for this technology?

A: As immersive technology becomes more mainstream and awareness keeps spreading about its value for industry, it is exciting to see how many ways immersive technology is adopted and applied. This technology is still so new, I am excited to follow its evolution and see what will be possible in five, 10 and even 30 years. My hope is that as the technology starts to deliver more and more value to businesses, we also see increasingly creative ways to improve quality of life in communities around the world.

Education, not Automation, is the Problem: 21st-century Job Training with XR

Many people fear the day when drones, robots and self-driving cars will replace human workers. This is understandable, and it’s not only delivery drivers who have reason to fear—computer algorithms (artificial intelligence) could potentially replace entire departments of human employees. Though many industries and professions are experiencing existential crises, the future will not be jobless. It will, however, be quite different, with new jobs and more employee turnover (in pace with advancing technologies) requiring humans to quickly and effectively train and retrain for new roles.

Today’s workforce is aging. Simultaneously, current workers and new members of the workforce (millennials and soon Gen Z) are being forced to compete against cheaper labor around the world, against technology and automation, etc. in a rapidly changing global (and gig) economy. There isn’t a lack of jobs; in fact, as certain jobs are being automated, other positions requiring higher (and often more technological or advanced) skills are being created. Today, millions of jobs requiring a trained human touch are going unfilled because there aren’t enough workers equipped with the skills to fill them. The problem isn’t automation; it’s education. What we have is a training problem and the solution is extended reality. This is why some of the world’s biggest employers are going virtual to build the workforce they need now:


Walmart

Walmart is the largest company in the world by revenue, with 3,500 Walmart Supercenters in the U.S. alone and 2.2 million employees worldwide. How does a company of Walmart’s size and global presence maintain quality training across its stores? Virtual Reality.

Walmart isn’t just testing VR for training. With the help of STRIVR, the retail giant has been implementing VR training at scale, purchasing 17,000 Oculus Go headsets in 2018 to roll out a nationwide virtual reality (soft skills) training program. Some 10,000 Walmart employees are already using the VR platform, and adoption doesn’t seem to be slowing down. By putting trainees into simulations of real-life situations, Walmart has been able to reduce the travel costs associated with traditional training facilities. The company is even applying VR to determine promotions, incorporating the tech into the interview process to help identify employees with managerial potential.


Hilton Hotels

In addition to using virtual reality to allow guests to preview rooms, the hospitality giant is turning to immersive technology to modernize training for its upper-level employees. Last year, Hilton worked with a third party (SweetRush) to film a 360-degree VR experience in a full-service Hilton Hotel. The simulation allowed corporate staff to experience a day in the life of a Hilton employee, the idea being to help them understand the physically challenging and complex tasks of day-to-day hotel operations. Instead of flying executives from across Hilton’s 14 brands (Hilton operates in 106 countries and territories), executives can put on a VR headset and experience what it’s like to clean a hotel room like a real member of the housekeeping staff.

In this case, Hilton wanted executives to get a sense of the complex demands made of the company’s staff at its 5,300 properties and to encourage empathy. Role playing is a key component of hospitality training; relying on a network of trainers to deliver bespoke training around the world, however, is expensive and doesn’t ensure consistent training across the Hilton brand. The company is planning to expand its use of VR training, including piloting a conflict resolution program designed to improve service recovery.


Preparing for danger

JLG Industries describes itself as a manufacturer of “mobile aerial work platforms.” If that doesn’t make your heart race, then I guess you don’t suffer from acrophobia. JLG designs, builds and sells equipment, including electric boom and scissor lifts used on construction sites worldwide. From a quick Google search, it’s evident that poor training on these machines or a mistake in assembly can lead to a lawsuit, so it’s not surprising that JLG is using VR to train operators of its boom lifts.

How does one safely learn to operate vehicles whose platforms sit on giant arms up to 185 feet off the ground? With JLG’s networked training program built by ForgeFx Simulations, multiple trainees in multiple locations can operate virtual boom lifts in the same virtual construction site without ever leaving the ground. JLG customers could also benefit from the program, which is much safer than training on a real machine and more efficient to boot.

In a similar use case, United Rentals, the world’s largest equipment rental company, said it would begin offering VR simulators this year through its United Academy. United began testing VR for training new hires at the end of 2016. Instead of lectures and pictures of construction sites, VR was able to transport them to the job site. Standing on the edge of the virtual job site, employees were given two minutes to observe the environment and identify any missing equipment. The user then had to make his or her pitch to the construction boss (an avatar). In these early tests, United was able to shorten its typically week-long training program by half.

More recently, it was reported that United Rentals is offering VR training to help its customers teach their own employees how to operate scissor lifts and other machines.


Six Flags

Six Flags, a global leader in the attraction industry, employs nearly 30,000 seasonal workers to move millions of people through its parks during the busiest times of the year. That means every year, Six Flags must train tens of thousands of people to work in admissions, retail, ride operations, and more. In 2015, Six Flags began seeking alternatives to traditional instructor-led training, which wasn’t adequately preparing temporary hires. Fearing that PowerPoint presentations and low-tech audio/visual approaches weren’t adding value to the organization, Six Flags injected tablet technology into training at two of its properties. The learning module moderated the flow of training through discovery, introducing videos, a simulated tour experience, safety quizzes, and more using gamification. In post-pilot surveys, 89% of participants believed the tablets improved their understanding of the training material and 91% agreed that Six Flags needs more tech in its learning and development programs.

Further transitioning from instructor-led to more engaged training, Six Flags has since added AR and VR to the mix, creating a virtual park tour with guest hot spots that trainees can experience without physically leaving the classroom. You can imagine the VR tour is useful at Six Flags properties in colder climates or under expansion. The ultimate goal for the theme park giant is to increase engagement and improve retention by creating a more realistic job preview process in onboarding.


According to a new study by BAE Systems, 47% of young people (aged 16-24) believe their future job doesn’t exist yet. BAE also predicted that the top jobs in 2040 will involve virtual reality, artificial intelligence, and robotics. Are students learning the skills that will be in demand 20 years from now? Only 18% feel confident that they have those skills, while 74% believe they aren’t getting enough information about the kinds of jobs that will be available in the future.

Some of the world’s biggest companies are heavily investing in augmented and virtual reality training solutions. Not all have purchased headsets in the tens of thousands like Walmart, but companies that want to maintain a competitive edge are looking to immersive technologies. AR/VR – the ability to create any number of lifelike simulations without real danger or risk, to simulate any working environment or situation anytime, anywhere – is a game-changer not just for the organizations trying to bridge today’s skills gap but especially for young people anticipating the jobs of tomorrow that don’t yet exist.

 

Image source: VR Scout

Workplace Implantables: Yes, We’re Going There

Are workplace implantables a future reality? Implantables are sometimes mentioned as a category of wearable technology, but is a future in which technology becomes more integrated with our biology, in which we voluntarily have technology embedded beneath our skin at work, possible? Some think widespread human microchipping is inevitable; others believe it would mark the end of personal freedom; and still others refer to it as “the eugenics of our time.” If it does happen, Europe will already be ahead of the game.

Today, more than 4,000 people in Sweden have consented to having microchips injected under their skin, and millions more are expected to follow suit as the country trends towards going cashless. In addition to enabling Swedes to pay for things with the swipe of a hand, the technology can be used to ride the train (Sweden’s national railways are entirely biochip-friendly), unlock doors, monitor one’s health, and even enter many Nordic Wellness gyms. At the forefront of the microchip movement are two European firms: BioTeq in the United Kingdom and the Swedish Biohax International founded in 2013. Both firms make a pretty basic chip that’s inserted into the flesh between the thumb and forefinger. The chips don’t contain batteries or tiny advanced computers; they’re powered only when an RFID reader pings them for data.

So, what exactly are microchip implants? They’re mainly passive, read-only devices with a small amount of stored information that communicate with readers over a magnetic field using RFID (radio frequency identification). This is the same technology used to track pets and packages, and you probably carry it in your pocket—most mobile phones and credit cards today are equipped with RFID and U.S. passports have been embedded with RFID chips to deter fraud and improve security since 2007. A simple microchip implant, about the size of a grain of rice, might store an ID code that’s processed by a reader to permit or deny access to a secured area.
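The access pattern described above — a passive tag that stores only an ID code, with all the decision logic living in the reader's backend — can be sketched in a few lines. This is an illustrative sketch, not a real RFID library API; the tag UIDs, the `ALLOWED_TAGS` table, and the `handle_scan` function are all hypothetical names invented for the example.

```python
# Minimal sketch of a passive-RFID access check: the implanted tag
# stores nothing but an ID; the reader pings it, gets the ID back,
# and a backend decides whether that ID may open this door.
# All identifiers below are illustrative, not a real RFID API.

ALLOWED_TAGS = {
    "04A224B1C95E80": "lab-door",      # hypothetical tag UID -> permitted door
    "04F71D2A6B3C11": "server-room",
}

def handle_scan(tag_uid: str, door: str) -> bool:
    """Grant access only if this tag UID is registered for this door."""
    return ALLOWED_TAGS.get(tag_uid) == door

print(handle_scan("04A224B1C95E80", "lab-door"))     # True
print(handle_scan("04A224B1C95E80", "server-room"))  # False
```

The key point the sketch makes is that the chip itself is dumb: revoking access or changing permissions happens entirely in the backend table, never on the implant.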

Chip implants have lived in science fiction for years, but they’re not brand new in reality. The first implantable RFID chips for humans were patented in the late ’90s. Technological advancements have led to the miniaturization of both monitoring devices and power sources, but so far implantables have only been widely discussed in terms of medical applications. Devices like pacemakers, insulin pumps, etc. are well-known; and doctors are exploring connected implantables capable of capturing vital health data from patients and, in some cases, administering (drug) treatment. This is changing, especially now that Elon Musk has entered the picture with his plan to implant microchips into human brains!

Much of the fear surrounding human chip implants arises from misinformation, pop culture, and paranoia. The biological risks are no worse than those of body piercings and tattoos. In addition, the chips are compatible with MRI machines, undetectable by airport metal detectors, and not difficult to remove. People have been augmenting their bodies since ancient times and wearing pacemakers for decades now. It’s not a huge leap from having this technology on our bodies via phones and contactless cards to putting it under our skin for easier access and greater convenience. Security and privacy concerns are natural. You hear the words “microchip implant” and visions of a dystopian future in which all your movements are traced and bodies can be hacked immediately come to mind. Though such concerns will likely grow as microchips become more sophisticated, today’s smartphones send far more information about you to Google, Apple, and Facebook than current microchips can. Your browser history is a greater threat to your privacy, I assure you.

That’s not to say that microchip implants are 100% secure (at least one researcher has shown they’re vulnerable to malware) or that there aren’t ethical implications and risks we won’t be able to foresee. Security concerns include eavesdropping, disabling and unauthorized tag modification, not to mention employee rights and religious concerns. Though the chips don’t store much information or have their own power source, it would be possible to use the data to learn about a person’s behavior. Depending on what the implants are used for (and if they have GPS tracking), employers could see how often you show up to work, the number (and length) of your breaks, what you buy, etc. On the upside, it’s not possible to lose a microchip implant like you might another form of ID; but on the downside, you can’t claim that the data didn’t come from you. Thankfully, a number of U.S. states have already introduced laws to prevent employers from forcing staff to be chipped.


A brief, recent history of microchip implants in the workplace

A number of human microchip experiments and pilot projects have received media coverage in recent years. In 2015, for example, digital startup workspace Epicenter began making Biohax chip implants available to employees in Stockholm. The main benefit seems to be convenience: In addition to unlocking doors, the chips allow Epicenter workers to share digital business cards, buy vending machine snacks, and operate printers with a wave of the hand. Outside the company, the implants can be used at any business with NFC readers, which are becoming more and more common in Sweden.

The first American company to try Biohax’s technology was Three Square Market (32M). At a “chip party” hosted by the Wisconsin company in 2018, over 50 employees volunteered to be implanted. 32M has vending kiosks in over 2,000 break rooms and other locations around the world. Ultimately, the company sees the technology as a future payment and identification option in its markets; and it could enable self-service at convenience stores and fitness centers. Today, the tech firm is using the microchips as a perk for employees—a multipurpose key, ID and credit card allowing them to open doors, buy snacks, log into devices, use office equipment, and store health information. Apparently, the company’s also working on a more advanced microchip that would be powered by body heat, equipped with voice recognition, and more.

According to its founder, Biohax is in talks with legal and financial companies who want to use its technology and has been approached by investors from all over the world, while some financial and engineering firms have reportedly had BioTeq’s chips implanted in staff. There are also isolated cases of tech enthusiasts and self-professed biohackers who have adopted chip implants for convenience or just to embrace new tech. The appeal of RFID and NFC implants comes down to convenience and minimal risk of loss. While the most popular application seems to be replacing physical keys, access cards and passwords for easy entry and increased security, other uses include identification and payment. Chips can also be programmed to suit a business’ unique needs:

Unlock your smartphone, start your car, arrive at your office building and enter the parking garage, pay for your morning coffee, log into the computer at your desk, use the copy machine, share your business card with a potential partner or customer, store your certifications and qualifications, access a high-security area, turn on a forklift, even store emergency health information — all seamlessly, without friction, by having one tiny device implanted between your thumb and index finger.

Would you volunteer for that level of convenience, for an easier and more secure way of opening doors and logging into devices?


Are microchip implants the future, another node in the connected workplace that happens to be beneath the skin? The number of people experimenting with the technology is growing. (You can buy a self-inject RFID chip kit online from Dangerous Things. Warning: It’s not government-approved.) Artist Anthony Antonellis implanted a chip in his hand to store and transfer artworks to his smartphone; and Grindhouse Wetware, a Pittsburgh-based open-source biohacking startup, was at one point pursuing powered implants, or “subdermal devices in the body for non-medical purposes.” (Think about a body temperature monitor that controls a Bluetooth thermostat.) And then there’s Elon Musk: Musk co-founded Neuralink in 2016 to create a brain-computer interface. This week, he announced plans to use implanted microchips to connect the human brain to computers. Neuralink sees its technology being used to cure medical conditions like Parkinson’s, to let an amputee control a prosthetic, or to help someone hear, speak or see again. Having already tested its technology on monkeys, Neuralink hopes to begin human testing by 2020. Musk, however, sees a high-bandwidth brain interface as a way for humans to merge with Artificial Intelligence (or be left behind).

So, to chip or not to chip?

For enterprises who do want to experiment or ultimately adopt, here are some suggested precautions:

  • Make it optional: Implants should not be a part of any human resources policy or employment contract. It should be a choice, with the option to remove the chip and destroy its data history at any time.
  • Make sure it really feels optional: Ensure there is no pressure to adopt and that those who decline a chip implant don’t experience any disadvantage. Offer the same functionality in another form, such as the wearable wristband option 32M has provided.
  • Make sure none of the information stored or collected is more than could be found on a smartphone.
  • Focus on controlled environments: For example, an employee cafeteria. This makes everyday transactions in the workplace easier while reducing the chip’s usefulness to a hacker.
  • Use a second security factor: For example, combine a cryptographic proof with a biometric option like a fingerprint or retinal scan, or add another layer of security with a Personal Identification Number (PIN) or facial recognition.
  • If the technology ever becomes standard or even required in enterprise, there need to be appropriate exemptions for religious, moral and other beliefs, individual health issues, etc.
  • Keep data protection laws in mind. Consider any information that might be collected or inferred from the data such as access info, patterns of use, etc.
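The "second security factor" suggestion above can be made concrete with a small sketch: the tag's ID alone grants nothing, and the holder must also present a PIN that the backend stores only as a salted hash. This is a minimal illustration under assumed names (`hash_pin`, `authenticate`, the enrollment `record`), not a production enrollment scheme.

```python
# Sketch of two-factor verification for an implanted tag:
# factor 1 is the tag UID read over RFID; factor 2 is a PIN,
# stored server-side only as a salted PBKDF2 hash.
import hashlib
import hmac
import os

def hash_pin(pin: str, salt: bytes) -> bytes:
    """Derive a salted hash of the PIN so it is never stored in plaintext."""
    return hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)

# Hypothetical enrollment record for one employee's tag.
salt = os.urandom(16)
record = {
    "tag_uid": "04A224B1C95E80",
    "salt": salt,
    "pin_hash": hash_pin("4831", salt),
}

def authenticate(tag_uid: str, pin: str) -> bool:
    """Require both the correct tag and the correct PIN."""
    if tag_uid != record["tag_uid"]:
        return False
    candidate = hash_pin(pin, record["salt"])
    # Constant-time comparison avoids leaking hash prefixes via timing.
    return hmac.compare_digest(candidate, record["pin_hash"])
```

With this layering, a cloned or skimmed tag UID is useless on its own — which is exactly why pairing the chip with a PIN or biometric matters.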

Microchip implants remain a cool experiment on both sides of the Atlantic, but there is no overwhelming need or demand for the technology in the workplace right now. That doesn’t mean implantable technology won’t become socially accepted or shake up a few industries (and the human brain) in the future.

 

Photo credit: https://www.paymentssource.com/news/chip-and-skin-implantable-rfid-gives-payments-its-matrix-moment

Wearables in Risk Management: Interview with AIG’s Ron Bellows

I got to sit down and talk with Ron Bellows, Risk Strategist at AIG. What resulted is a fascinating, but long (it’s worth it), read and a wealth of information. Ron will be speaking at EWTS 2019.

E: To begin, could you provide our readers with a little background on yourself and what you did at AIG? Also, when did you first encounter wearable technology?

R: I’ve been a risk management consultant with multiple insurance companies since 1980. I started playing with wearables probably as early as 1985/86. You may remember Cybermatics: Surface EMG measuring technology was connected to cyber gym equipment for rehab and prevention, so that when someone was working out – in the sports or medical world – you could actually see what the muscles were doing with the surface electromyography. It’s kind of like an EKG. Your heart is a muscle; surface EMG looks at other muscles.

Around ’86, I was working with a physical therapist doing studies on sports medicine and rehabilitation that might have application in the industrial environment. Many workers’ compensation injuries are expensive strain and sprain injuries impacting the musculoskeletal system. Biosensors, from a rehab standpoint, help us manage treatment for someone who has had a musculoskeletal injury. It began in the sports world and medicine, and around 2000 it started to become more pervasive in the industrial environment.

If you think about an athlete who over-trains, the same thing can happen in the industrial world.  Biosensors can measure posture, force, repetition, etc.; and be used to look at someone in the workplace from a preventative standpoint as well as on a pre-hiring/screening basis (i.e., can you handle the physical requirements of the job?) If you took a traditional physical, you might pass, but could you go in and work on a factory floor or warehouse for 8-10 hours a day, 6 days a week? Biosensors to better evaluate somebody before they go on the job to help assess their ability. Second value would be to evaluate somebody in the job to document the exposure they face due to fatigue, endurance, force, repetition and posture—the things that generally lead to ergonomic/ bio mechanic injuries. If you can detail that exposure prior to injury you can do a better job with prevention, training and hiring. However, if somebody does get hurt you can use those same biosensors to help assess exactly where and how badly they are injured, the best treatment options, and if they truly are ok to go back to work again. Those are the three main areas where wearables fit into the industrial arena and workers’ compensation (WC).


E: What exactly do you do at AIG?

R: I’ve consulted with large multinational customers to help them find solutions to their risk management issues. Often, they were most interested in workers’ comp risk because it tends to drive loss frequency and severity, impacts the workforce and absenteeism, and reduces efficiency and profitability. Workers tend to be 30-50% of a company’s operating expense, so if you can reduce injuries you can increase efficiency, profitability, etc. Today with the shortage of workers that we see, a lot of companies are working at a 20% absenteeism rate. Imagine what happens when you can’t find enough people to man the tasks in a factory. If you also have extensive injuries that put people out of work or on restrictive duty, it’s even more difficult to run the business. Making sure people can work safely and come back to the job every day is very important to risk managers. I also help risk managers with issues like fleet, liability, supply chain, business continuity, and disaster recovery—anything that keeps them up at night.


E: You just mentioned a bunch of pain points like the shortage of workers. What are the challenges and pain points for AIG’s clients that are driving interest in wearable technologies?

R: There are really two things: One is traditional safety, making sure we document exposure properly so that we can prevent injuries and do better training. It’s not just job hazard analysis but also the workers’ comp system itself, which is very difficult to manage as the venues are all different and every state has different rules. If we can document exposure, we can better manage an individual pre- and post-loss. Many times, we see that older, high-tenure workers are driving losses. We’re seeing the average age of workers going up, especially in manufacturing, warehousing, trucking, etc., where you have extensive injuries to the shoulder and back. Those injuries are the most difficult to diagnose, treat, and return to work from. If you’re older and you get hurt, it may take you weeks to get back to where you were pre-loss. Our average workforce is in the 40- to 50-year range, so when they have an injury it’s impacted by comorbidity – hypertension, diabetes, obesity – making it more difficult for them to get back to pre-injury status.

Second, many companies today are looking at exoskeletons or other interventions to reduce exposure. When you put an intervention in place you don’t know for sure how much of an impact it’s having on the individual, because everyone is different. With biosensors, we can measure the impact of different interventions and see which ones are having the greatest impact on the worker based on their exposure. For example, I would use different exoskeletons for the upper extremities versus the back, versus the legs. So, it depends on what kind of difficulties workers are having in the workplace. For example, if I have to do static standing all day on a conveyor line, exoskeletons may not be valuable, but the biosensors can tell me what’s going on with the static stress on the lower extremities, which impacts the entire body. I can then look for alternatives like automatic massage therapy, continuous stretching, or compression sleeves to improve endurance and reduce fatigue where the exoskeletons don’t work.


E: What kinds of wearable technologies are you looking at for risk mitigation? Have any solutions made it past the pilot phase to rollout?

R: There are a lot. The biosensor marketplace has exploded in the last several years. We can use biosensors like we’ve talked about from a musculoskeletal standpoint and that’s where most of the impact is seen. But you can also use biosensors to look at an environment: A construction worker going into a pit that may be lacking oxygen can use a biosensor attached to an iPhone that sends a safety signal. You can use a posture monitor for the back like I did with the Visiting Nurse Association. Nurses visiting patients by themselves can be attacked, chased by dogs, fall down stairs, etc. Having an inclinometer or GPS monitor can send an automatic ‘man down’ signal if they’re horizontal. If they can’t push a panic button, their supervisor and local authorities can be alerted to the fact that something is wrong. That’s just one example. Biosensors in chemical plants can look at oxygen-deficient environments and exposure to chemicals and send an alert right away to the individual or supervisor. So, if you’re working remotely in a plant and there’s an ammonia tank with a small leak, the biosensor can alert you to very low levels before you’re overcome. There are so many different ways to use biosensors to alert you to exposure before it creates injury.


E: In most cases are you looking for over-the-counter, out-of-the-box solutions or bespoke devices? Where is the software being made?

R: I review what’s in the market and what’s in development. I try to stay abreast of what’s available so that I can help clients make the best and most informed decisions about how to reduce exposure. There are always several intervention options that could have an impact, so I usually demo and then pilot test the options that fit the particular exposures but also their organization structure and culture. So, I’m always looking to kick the tires on everything around the market.


E: I imagine biosensors come in every form factor at this point. Is it one sensor per device or are you testing multiple metrics?

R: Let’s take posture monitoring as an example, which is huge in workers’ comp because 30-50% of a company’s losses are from strains. Everyone wants to work on musculoskeletal disorders, which also happen to be the most expensive loss type. Inclinometers, which measure posture, are great because force, repetition and posture are the lead drivers of strain and sprain injuries. You can do heavier work in the power zone between your shoulders and hips, but outside of neutral posture a task becomes more egregious to your body.

Many companies are doing posture monitoring; some are focusing on the upper extremities, some on the low back. Several biosensor companies have produced very good software programs to go along with the inclinometers, showing not only when someone is out of neutral posture but also how many times a day that person is out of neutral posture, for how long, at which tasks, or what times of day, etc. Some biosensors give an automatic response to the employee (like a buzz). That can be good or bad. If I can’t change my posture because the task is set up so that I have to bend a certain way, the buzzer is going to be continuous and become really annoying. That’s where I would take the data to management and operations and say: Here’s Joe and Mike doing the same job but Mike can’t handle the postures. Why? Because he’s a little older and can’t bend as well at the knees. So, posture monitoring without the dashboard is not as effective. The better the dashboard, the better data we have and the more opportunity we have to provide valuable interventions to the physical task.
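The dashboard metrics Ron describes (how many times a day someone is out of neutral posture, and for how long, broken out by task) boil down to a simple aggregation over the inclinometer stream. Here is a minimal sketch, assuming a hypothetical sample format of (task name, seconds since previous sample, trunk angle in degrees); the field names and the 20-degree neutral limit are illustrative, not from any particular vendor:

```python
from collections import defaultdict

NEUTRAL_LIMIT_DEG = 20  # illustrative cutoff for "out of neutral posture"

def summarize_posture(samples):
    """Aggregate excursion counts and time out of neutral posture, per task.

    samples: iterable of (task, seconds_since_prev_sample, trunk_angle_deg).
    """
    summary = defaultdict(lambda: {"events": 0, "seconds_out": 0.0})
    prev_out = defaultdict(bool)
    for task, dt, angle in samples:
        s = summary[task]                     # ensure every task appears in the summary
        out = abs(angle) > NEUTRAL_LIMIT_DEG
        if out:
            s["seconds_out"] += dt
            if not prev_out[task]:            # a new excursion begins
                s["events"] += 1
        prev_out[task] = out
    return dict(summary)

# Invented sample stream for two tasks
samples = [
    ("palletizing", 1.0, 35), ("palletizing", 1.0, 40), ("palletizing", 1.0, 5),
    ("palletizing", 1.0, 30), ("inspection", 1.0, 10), ("inspection", 1.0, 12),
]
print(summarize_posture(samples))
```

With per-task totals like these, the dashboard can rank tasks by time spent out of neutral, which is exactly the "Joe versus Mike on the same job" comparison Ron points to.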


E: Can the intervention involve changing the way the task is done?

R: Yes. In fact, we can even link biosensors through a software program to a camera, so that as a person moves, we can see both the physical video and an overlay of what’s going on with his or her posture and force over time. While seeing the person do their task in space and time, we’re capturing their force and posture. That becomes really powerful. We can do that over time, creating a dashboard for different tasks, and then give an employer a prioritized list of the most egregious tasks, where high force and high repetition are most likely to generate a musculoskeletal disorder. So, biosensors with a dashboard and video overlay are very powerful in exposure documentation.


E: Can you talk about some of your recent biometrics and exoskeleton projects?

R: Well, anybody familiar with meat processing knows that it’s a very high endurance, high repetition task impacting the upper extremities and back. It’s static stress on the legs, leaning, twisting and bending at the waist, and moving your arms to process meat. Every part of the body is impacted; the repetition is so intense that you’re moving a carcass every 2 seconds. You’re constantly moving, standing in one place doing the same motion over and over, and you’re usually working a 10-hour shift, 6 days a week. Operations, safety and HR know it’s a difficult task but to change the process is very expensive. You might have to move the conveyor circling the entire plant or slow it down, which operations won’t like. Or, you’re going to have to build adjustable stanchions for people to stand up on. Oftentimes in fixed manufacturing plants, it’s very difficult to change the physical process, so we look at other interventions. The biosensors give us data on where the most difficult task/positions are and where management can spend their nickels to make the best impact. You can give them engineering solutions but if they don’t have the money for re-engineering there are alternative solutions like endurance and fatigue management or job rotation, or even just ongoing stretching throughout the day. You mitigate the exposure if you can’t eliminate it. We look for engineering solutions first, but older plants especially have a hard time putting those automation or engineering changes in place.


E: How are you measuring the ROI of the different solutions you’re implementing? What are the KPIs you’re looking for?

R: Primarily, I look at two things when it’s workers’ comp-related. The first is loss frequency rate: the number of injury accidents per hundred employees (for example, how many strains and sprains we have per task before and after a solution is implemented). The second is average cost of claim: how does that cost change after the solution is implemented? We try to reduce both frequency and severity of loss.
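Those two KPIs reduce to simple arithmetic. A rough sketch, with invented before/after figures purely for illustration:

```python
def loss_frequency_rate(claims, employees):
    """Injury claims per 100 employees."""
    return 100 * claims / employees

def average_cost_of_claim(total_incurred, claims):
    """Average incurred cost per claim."""
    return total_incurred / claims

# Invented before/after figures for one intervention at a 400-employee plant
before = {"claims": 24, "employees": 400, "total_incurred": 360_000}
after  = {"claims": 15, "employees": 400, "total_incurred": 180_000}

for label, d in (("before", before), ("after", after)):
    freq = loss_frequency_rate(d["claims"], d["employees"])
    cost = average_cost_of_claim(d["total_incurred"], d["claims"])
    print(f"{label}: frequency = {freq:.1f} per 100, avg cost = ${cost:,.0f}")
```

In this made-up case both frequency (6.0 to 3.8 per 100) and severity ($15,000 to $12,000 per claim) improve, which is the before/after comparison Ron describes.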

Here’s a good example: One 24-hour plant of 400 employees had 50 visits to the nurse every day for splints, gauze wraps, and other assistance. You know that the more often people go to the nurse, the greater the likelihood you’ll eventually have a claim. We implemented endurance/fatigue solutions and then looked at the number of individuals visiting the nurse, and for some tasks the number dropped by 80%. That’s telling, because it takes a while for the claims numbers to mature enough to show statistically significant results. If I see a change over a short period, is it just that introducing the solution made everyone more aware? 18 months is about how long you have to wait to really see a material change in losses. So, we look at other metrics like online perception and symptom surveys. I’ve used therapy to reduce endurance and fatigue injuries, and after each session we give a quick survey asking how the person felt before and after the fatigue management program. We can then see if we’re going down the right road and match the results up to the loss analysis in the future.


E: RFID devices, body-worn (biometric tracking) wearables, and exoskeletons—which category is most mature and deployable today?

R: Posture monitors. The inclinometers and GPS are the most robust and have some of the best software. RFID is good but you have to understand what the exposure is and what end result you’re trying to get to. RFID chips are really good in environments like construction, where it’s noisy, dark and dusty and vision and hearing are impaired. RFID chips give me another sense to avoid exposure. It can also be used for equipment or where people are working very remotely, to see where somebody is working in a plant and where they’ve been. But posture monitors are probably the most robust in terms of software because, again, everyone’s trying to mitigate the strain and sprain injuries. Industrial hygiene (IH) exposure doesn’t have the same frequency of loss as strains and sprains and has been controlled very well over the last 20 years; it’s gotten a lot of attention and there are so many good programs in place.


E: Is ergonomics slightly newer?

R: Ergonomics has been developing since the mid-80s, but it’s interesting that we haven’t found a silver bullet solution, so we’ve done a lot of training. Office ergonomics got a lot of attention. ‘Ergonomic’ became a buzz word and a marketing ploy, and now a lot of equipment is considered ‘ergonomic.’ For example, you can buy a snow shovel that’s “ergonomic”, but the actual exposure to the individual hasn’t really changed. Carpal tunnel syndrome was huge in the late 90s and early 2000s, then the Mayo Clinic and other studies said that the aging workforce is driving CTS more than typing. Today in the workers’ comp arena, an individual’s physical condition can be as much a factor in injury development as the workplace exposure. The comorbidity or illness can make a simple injury so much more difficult to diagnose and treat and this is why wellness and ergonomics need to be considered together. Wearables are helping us communicate exposure to the operations managers who usually hold the intervention purse strings. Ergonomists haven’t done a great job of this in the past, but the biosensors give us data on an individual or task basis that is very telling for operations, human resources and safety teams.


E: How have employees received wearables? What has been the feedback? Are there privacy concerns and how are you dealing with that?

R: A lot of the biosensors are applied directly to the skin and managers are very skeptical or skittish about that. So, in looking at which wearable is going to work for a company you have to consider the unions, female workers, people that don’t speak English etc. You have to think about having an interpreter, if someone will have an allergy to the spray or adhesive used to attach the biosensor… What if you have a lot of hair on your back? Part of my focus is always communicating these considerations to the risk manager: Given the exposure model they face and the loss profile they have, which tasks are driving the losses, what’s the exposure model for the people doing those tasks, and what are the right biosensors to use to fit their organization’s culture.


E: Are they more receptive if the sensor is in a bracelet?

R: You get better, deeper data, especially from a force standpoint, if you can attach something to the skin. If you can’t, you have to use a halter monitor around the chest or a belt-worn device, something on the biceps if upper extremities are the issue, a bracelet on the arm, etc. That’s why it’s important to know the loss profile and exposure model for the risk before adopting a wearable product: what tasks are driving loss and what options the company is willing to consider for solutions.


E: What is your hope for the future as for how wearables are developing? What’s a big development you’d like to see?

R: Right now, biosensors are really good at looking at exposure, giving us a dashboard and helping us come up with solution options. Of course, you need to know what’s available and understand the organization’s culture; but we’re not using biosensors to their full effectiveness in the hiring and screening process or in post-loss injury management. In WC, early objective medical diagnosis is critical to managing loss costs, especially with strain and sprain injuries, and biosensors can be a substantial benefit in that area, including in developing telemedicine programs. We’re also not always closing the loop between risk management, safety, HR and operations in terms of understanding exposure and implementing interventions. Consider how many workplace tasks are designed with the idea that there will be one, two or three injuries per year in that task. The answer is none, but we accept those kinds of metrics as part of the cost of production. We’re collecting good loss and exposure data but not integrating safety intervention into the process the way we do with quality. Biosensors give me the detailed exposure information I need to express the business and human cost and help qualify the rationale for the interventions needed to reduce exposure. If I can provide detailed documentation of exposure, I can communicate better to risk management so they can do a better job of funding exposure-reduction solutions, and provide the insight for stronger diagnosis, treatment and return-to-work practices if a loss occurs. You’d be amazed how many loss profiles show repeat injuries, which get exponentially more expensive. Biosensors can therefore have a significant impact in all three areas of the WC exposure control model: hiring, screening and deployment; prevention and training; and post-loss injury management.

 

Photo credit: Lara de Oliveira Corselet dB via photopin (license)

Ron will present a case study at the upcoming 6th Enterprise Wearable Technology Summit on September 19.

2019: The Year of the Big Pivot Towards Enterprise AR/VR

It’s a shame that AR/VR was overhyped in 2018 because in 2019 the technology is a fixture in enterprise.
I’ll be blunt: Augmented, mixed and virtual reality were overhyped in 2018. While 2018 turned out not to be the year of AR/VR, please don’t roll your eyes when I tell you that 2019 is the year, at least for enterprise, and of that I have no doubt.
Here are a few of the signs:

  • More than half of the announcements made at AWE USA 2019 (a staple on the AR/VR calendar) were enterprise-related
  • Some of the world’s biggest consumer tech companies are now entering the immersive tech space, primarily eyeing enterprise
  • The top names in consumer VR are also heavily courting the enterprise

Why? Why are AR/VR hardware and software companies pivoting to enterprise? The answer is obvious: Because enterprise is where the money is. Both AR/VR technology providers and the world’s best-known companies (end users) are making/saving big.

If you follow enterprise AR/VR, you’re no doubt familiar with Google (Glass), Microsoft, and PTC (Vuforia). Other longtime players include Atheer, Epson, HPE, LogistiVIEW, ScopeAR, RealWear, ThirdEye, Ubimax, Upskill and Vuzix. Qualcomm, Honeywell, and Toshiba (dynabook) have become fixtures on the scene, as well, and by that I mean regular exhibitors at EWTS, the only event dedicated to enterprise use of immersive and wearable technologies. Newer sponsors include Jujotech, Pico and RE’FLEKT, along with Bose, HTC and Lenovo, joining top enterprise wearable device and industrial exoskeleton makers on the EWTS roster.

Doesn’t Bose make headphones?
Yes, they do. Bose is known as a consumer audio vendor, but it also makes Bose Frames, which provided exclusive audio content and set-time notifications to desert-goers at this year’s Coachella music festival. Founded in 1964, Bose is taking an audio-first approach to augmented reality today with Bose AR, not only at concerts or in automobiles but in meeting rooms, too. Audio AR is a natural fit in the Industrial Internet of Things.

The pivot
In April 2019, Oculus introduced the expanded Oculus for Business, an enterprise solution designed to streamline and grow VR in the workplace. The expanded solution adds Oculus Quest to the hardware lineup and provides a suite of tools to help companies reshape the way they do business with VR.
The following month, Lenovo launched an enterprise AR/VR headset, the ThinkReality A6, immediately positioned as a rival to Microsoft’s HoloLens. Articles spoke of Lenovo as “just the latest manufacturer to develop an AR device aimed at enterprise.” On the heels of Lenovo’s first foray into enterprise XR, HTC announced the HTC Vive Focus Plus, a new version of its standalone Vive Focus headset that will only be made available to enterprise customers. Furthermore, HTC’s Vive X accelerator has been “pouring money” into enterprise VR startups.

The proof is in the toolbox
The digital transformation isn’t coming; it’s already underway at hundreds of companies, including household names like Ford, UPS, and Walmart.
Every year, enterprises take the stage at EWTS to share how they’re using wearable and immersive technologies. They share their experiences and best practices, their successes and failures, and then they return the following year. These “veteran speakers” are another sign of AR/VR’s secure position in the present and future of work: AGCO, Boeing, DHL, Lockheed Martin, and Porsche come back year after year to update peers from Bayer, BP, Caterpillar, Coca-Cola, Johnson & Johnson, and other Fortune 500 companies on the latest applications for the technology in their operations. EWTS speakers span industries and sectors: Airbus, BMW, Chevron, Colgate-Palmolive, Con Edison, Duke Energy, General Electric, Gensler, jetBlue, John Deere, Lowe’s, Molson Coors, Southwest Airlines, Thyssenkrupp, Toyota, United Technologies, etc. And new faces join every year: this year’s event will welcome AIG, Amazon, American Airlines, Bridgestone, Exelon, Holiday Inn, Philip Morris, Sanofi, Six Flags, and more to the stage. It’s a cycle: Attendees become users who become speakers, and the technology continues to advance.

Beyond pilots
Lockheed Martin has been a longtime advocate of AR/VR, benefitting so much from mixed reality that it’s now teaming up with Microsoft to sell mixed reality apps to other businesses in the airline and aviation industry. Rollout is growing at BMW, too: The luxury auto manufacturer is providing all its U.S. dealerships (347 BMW centers and select MINI dealers) with Ubimax Frontline running on RealWear HMT-1 head-mounted devices. Shell is also deploying RealWear’s HMT-1Z1 through Honeywell in 12 countries and 24 operational sites. And last year, Walmart announced it was putting 17,000 VR headsets in its U.S. stores for employee training. These aren’t mere pilots. At AGCO, Boeing, and other large manufacturers, augmented reality is a standard workforce tool for a variety of tasks in multiple areas of operation. In the last three months alone, Fortune 500 companies in the news for using AR/VR included Audi (Volkswagen), ExxonMobil, Nissan, and even Farmers Insurance. Deloitte estimates that over 150 companies in multiple industries, including 52 of the Fortune 500, are testing or have deployed AR/VR solutions. The 6th annual EWTS is the proof.

Reality check
It helps that the tech is steadily improving, of course. This was the first year that I walked around AWE and was truly amazed by the quality of immersive experiences I tried. So, here’s a reality check: AR/VR is having an impact across business and industry and it’s not going away. It’s not future tech; it’s now. And it’s not just AR/VR glasses and headsets but body-worn wearables as well, sometimes used in conjunction with VR, and in applications significant enough that the third day of EWTS 2019 is devoted entirely to below-the-neck and safety wearables. We’re talking biometric, environment and location tracking, employee ergonomics, partial and full-body exoskeletons—it’s all here today in the enterprise.

 

Image source: The Verge

Is Digital Transformation for Men? Female Factors in Wearable Tech Design

In 2015, NASA celebrated over 50 years of spacewalking. Four years later, in March 2019, the agency called off the first all-female spacewalk due to a shortage of smaller-sized spacesuits. The walk-back led to a Twitter storm, with women sharing hundreds of stories of their own ill-fitting work uniforms and oversized ‘standard’ gear; but “It’s not just spacesuits,” one woman tweeted: “It’s public spaces like bathrooms, cars, cockpits, office air conditioning, microwave installation heights, Oculus, military fatigues…an endless list.”

In December, I wrote about the phenomenon of patriarchal coding. A feeling that today’s VR headsets were not designed with women in mind set me on a trail of research that revealed I’m not alone in feeling this way and that the majority of the products and systems we use every day are designed by and for men. This phenomenon affects every aspect of women’s lives – it even endangers our lives – and it’s unintentional for the most part, which makes it all the more frustrating. Sexism is so ingrained in our society that women’s unique needs and biology (like the fact that we have breasts) are excluded from reality, even of the virtual kind.

My main point then was that wearable technologies – the body-worn sensors being integrated into organizations’ EHS efforts, exoskeletons taking a load off workers’ backs, and VR headsets being hailed as the future of job training – exhibit coded patriarchy and risk further alienating the female workforce. Wearables that are replacing or supplementing traditional PPE (personal protective equipment) cannot succumb to the same biased or negligent design as have automobiles, office buildings, etc., for the future economy and growth of the workforce depend upon improving job prospects and working environments for women.


The history of man

Women and the female perspective are largely missing from human and world history (as is often the non-western point of view) and entirely absent in the fundamental research underlying the foundations of modern life, including economics and urban planning. The star of the show is “Reference Man,” a 154-pound Caucasian male aged 25 to 30, who has been taken to represent humanity as a whole when it comes to the design of everything from power tools to the height of a standard shelf. Take medicine: Though women process drugs differently, medications have historically been tested primarily on men. Cars: For decades, car safety testing has focused on the 50th percentile male. The most common crash-test dummy is taller and heavier than the average woman, with male muscle-mass proportions and a male spinal column. This is how “standard seating position” was determined. Women, however, sit further forward in the driver’s seat and thus are 47% more likely to be seriously injured in a car crash. In 2011, the US began using a female crash-test dummy, though not an anthropometrically correct one. Testing with a pregnant dummy? Forget it.


Beyond product ergonomics

It’s not just annoying that so many gadgets we use are one-size-fits-men; it’s dangerous. The world is less safe for women because we haven’t been factored into the design of not only physical products but also the software behind everything. Consider navigation apps, which provide the quickest and shortest routes to a destination, but not the safest; or voice recognition and other AI tech, which is male-biased and also becoming indispensable to how we interact with our devices and how systems make major decisions affecting humanity. Google’s voice recognition software? 70% more likely to accurately recognize male speech. Apple’s Siri? When she launched, she could help a user having a heart attack but didn’t know what “I was raped” means. (Side note: the heart attack symptoms healthcare professionals are taught to identify are actually male symptoms.)

Last year, Amazon had to scrap an experimental recruiting tool that taught itself to prefer male candidates for software development and other technical jobs. How did this happen? Because the computer model was trained to observe patterns in resumes from the previous ten years, most of which were submitted by men since the tech world is notoriously, overwhelmingly male. What’s frightening is that in a 2017 survey by CareerBuilder, over half of U.S. HR managers said they would make artificial intelligence a regular part of HR operations within five years. That means women will have to combat unfair algorithms in addition to unconscious bias in order to advance in the workforce. IBM CEO Ginni Rometty says it’s up to businesses to prepare a new generation of workers for AI-driven changes to the workforce. In a world in which AI will impact – and perhaps determine – hiring for every existing job, the fact that women and minorities are disproportionately left out of the teams behind the AI revolution is tragic.


The data gap at the heart of the workplace 

Occupational research has traditionally focused on male workers in male-dominated industries. Few studies have been done on women’s bodies and job environments, so there is little occupational health and safety data for women. The uniforms in most professions are therefore designed for the average man’s body, and the reasons behind trends like the increasing rate of breast cancer in industry remain unknown. Relying on data from studies done on men may explain why serious injuries in the workplace have gone down for men but are increasing among women workers. This is despite the fact that, for the last three years, women have been entering the workforce at more than twice the rate of men. (You do the workers’ comp math, employers.)

When we talk about using wearables for EHS applications, oftentimes we’re speaking about body-worn sensors that can detect biometric and environmental data affecting a worker’s health and safety. The software behind these applications might send an alert to the worker or wearer when a reading reaches a certain threshold, but how is that threshold – the danger zone – determined? Say we’re tracking a worker’s exposure to a particular chemical. Women and men have different immune systems and hormones; women also tend to be smaller, have thinner skin, and have a higher percentage of body fat than men—differences that can influence how chemicals are absorbed in the body. Without female-specific data, the threshold at which a wearable device is set to alert the wearer would likely be higher than the toxin level to which a female worker can be safely exposed, putting women at greater risk of harmful exposure. The problem is two-fold: We don’t have data about exposure in “women’s work” and we’re clueless when it comes to women (increasingly) working in male-dominated industries. At this point, it would take a working generation of women to get any usable data on long-latency work-related diseases like cancer.
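The problem above can be made concrete with a toy sketch: if the alert threshold is derived from male-only exposure data, a reading that stays "safe" under that threshold can still exceed a lower, female-specific safe limit. All numbers below are invented for illustration and are not real exposure limits:

```python
# Hypothetical exposure limits for some airborne chemical, in ppm.
MALE_DERIVED_LIMIT_PPM = 25.0     # invented: threshold set from male-only studies
FEMALE_SPECIFIC_LIMIT_PPM = 18.0  # invented: the data to set this often doesn't exist

def should_alert(reading_ppm, limit_ppm):
    """Return True when a sensor reading exceeds the configured limit."""
    return reading_ppm > limit_ppm

reading = 21.0  # ppm, as measured by a body-worn sensor
print(should_alert(reading, MALE_DERIVED_LIMIT_PPM))    # no alert fires
print(should_alert(reading, FEMALE_SPECIFIC_LIMIT_PPM)) # a female worker is already over
```

The device's logic is trivial; the danger lies entirely in which limit gets configured, and without female-specific exposure data, the higher one wins by default.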


No PPE for you

Construction is one of those male-dominated industries in which standard equipment and PPE has been designed around the male body. Though there is little data on injuries to women in construction, a study of union carpenters did find that women have higher rates of wrist and forearm sprains, strains and nerve conditions than their male counterparts. To comply with legal requirements, many employers just buy smaller sizes for their female employees but scaled-down PPE doesn’t account for the characteristics (chests, hips and thighs) of a woman’s body. Moreover, it doesn’t seem cost-effective for employers to meet the order minimum for those sizes when women make up less than 10% of the construction workforce. Giant overalls are one thing, but the straps on a safety harness not fitting around your body? How is a woman supposed to perform at the same level as a man if her clothing and equipment are a hindrance? If oversized gloves reduce her dexterity, a standard wrench is too large for her to grip tightly, or her overly long safety vest snags on a piece of equipment? Already a minority in the sector, women don’t usually complain about ill-fitting PPE. Instead, they make their own modifications (with duct tape, staples, etc.). And it’s not just women; dust and hazard eye masks designed for the Reference Man also put many men of color at a disadvantage.

Of course, it doesn’t have to be this way. A standard-sized bag of cement could be made smaller and lighter so that a woman could easily lift it. Exoskeletons might be a solution, but so is going back to the drawing board: Jane Henry’s SeeHerWork, for example, is an inclusive clothing line for women in fields like construction and engineering, fields with lucrative, equal-pay careers and massive labor shortages—fields that need women.


Designing the workplace

Guess what? Men are the default for office infrastructure, too, from the A/C (women tend to freeze in the workplace, which hurts productivity) to the number of bathrooms and stalls (a single restroom with urinals serves more individuals). According to the Bureau of Labor Statistics, women represent nearly two-thirds of all reported cases of carpal tunnel syndrome, which indicates that workstations are less ergonomic for women. Open office plans are conducive to socializing and breaking down hierarchies, right? No, they actually encourage sexist behavior. A 2018 study documenting the experiences of women in an open office designed by men – lots of glass, identical desks, group spaces – found that the lack of privacy created an environment in which female workers were always watched and judged on their appearance. Designers today are beginning to use virtual reality to design factory layouts and workstations, even assembly processes, but that doesn’t mean they’re factoring in female anatomy or putting headsets on women workers to get their input.

I spoke with Janelle Haines, Human Factors Engineer at John Deere, who uses virtual reality to evaluate the ergonomics of assembly, about her experiences performing evaluations on women workers. Most of the people she gets to put in a VR headset are male; however, there are a few female employees available at times for evaluations. “Fitting the job to the worker hasn’t [always] been a focus. Even in the last fifteen years that I’ve been studying ergonomics, there has been a huge shift in learning to focus on ergonomics. It has become a kind of buzz word…There are some jobs that have been at John Deere for years and years, since we started building combines, that aren’t a great fit for women, but going forward with new designs we’re using VR to make sure the workstations and what we design do work for women.” Ergonomics aren’t a new area of study, but Janelle points out a promising shift in thinking and a deliberateness that’s necessary “going forward.”


The future of work: Uncomfortable = unproductive

Smartphones have become standard work tools in many jobs. Men can use the average smartphone one-handed; women cannot (smaller hands). This kind of oversight cannot be carried into the next wave of mobile: Wearable technology. That women have different muscle mass distribution and vertebrae spacing, lower bone density, shorter legs, smaller wrists, lower centers of mass, etc. matters when it comes to the design and application of wearable devices like partial and full exoskeletons, connected clothing and gear, augmented reality smart glasses, and virtual reality headsets. Early decisions in developing transformative technologies can create a weak foundation for the future of that tech.

Already women are at a disadvantage in VR. As far back as 2012, researchers found that men and women experience virtual reality differently, and a growing body of research indicates why. Depth can be perceived through two main cues: motion parallax (how objects move relative to you) and shape-from-shading. Men tend to rely on motion parallax, which also happens to be easier to render in VR; women tend to rely on shape-from-shading, so a shadow that’s ‘off’ can ruin the immersive experience for a woman. Because shape-from-shading is more difficult to emulate, most VR tech uses motion parallax. Then there are the poor ergonomics of most VR headsets for women (too heavy, too loose, etc.). Why does this matter? Because VR is being hailed as the future of learning and job training; VR is going to be crucial for filling millions of vacant positions and for upskilling the workforce as automation advances. When one half of the population experiences the technology differently than the other half, that’s an unequalizer, especially when all indications point to people spending more time in VR in coming years.


Stop defaulting to men 

The long legacy of researchers overlooking women – not wanting to pay for double the testing – has looming implications at a time when we’re collecting data from more and more ‘things’ and powerful computers are making important decisions for us. It’s bigger than a spacesuit; we’re making decisions based upon biased, incomplete data, feeding that data into algorithms that can exacerbate gender and other inequalities, create risks among certain populations, and encode prejudices into the future. The answer? First, inject more diversity into the labs and back rooms where the future is being designed and engineered. Second, hire female designers and stop using men as a default for everything!

 

 

In writing this article, I drew heavily on the efforts and writings of a number of inspiring women, including Caroline Criado-Perez, author of Invisible Women: Data Bias in a World Designed for Men; Abby Ferri of the American Society of Safety Professionals; and Rachel Tatman, research fellow in linguistics at the University of Washington.

 

The Enterprise Wearable Technology Summit (EWTS) is an annual conference dedicated to the use of wearable technology for business and industrial applications. As the leading event for enterprise wearables, EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. The 6th annual EWTS will be held September 17-19, 2019 in Dallas, TX. More details, including agenda and speaker lineup, available on the conference website.

XR in HR: AR/VR for a Different Kind of Training in the Workplace

A report released last year by the Equal Employment Opportunity Commission (EEOC) contained some shocking findings:

  • 45% of harassment claims made to the EEOC are sex-based.
  • At least one in four women experience sexual harassment in the workplace.
  • Around 90% of employees who experience harassment – whether sexual or on the basis of age, disability, nationality, race or religion – do not file a formal complaint.
  • 75% of victims who do report harassment experience retaliation.

The bottom line

Every year, sexual and other types of harassment cost companies dearly in time and money. According to the Center for American Progress, workplace discrimination costs businesses approximately $64 billion annually. Hostile work environments also negatively impact productivity, contribute to high turnover, and harm a company’s reputation. And it’s not just harassment: according to McKinsey, unconscious bias is a $12 trillion issue, meaning we could add $12 trillion to the global GDP by 2025 ‘simply’ by advancing gender parity and diversity in the workplace. Gartner finds that inclusivity is profitable, especially at the executive level: inclusive companies outperform industry standards by 35%, generate 2.3 times more cash flow per employee, and produce 1.4 times more revenue. Evidently, diversity pays in money, innovation, decision making, and recruitment.

In compliance with federal and state laws, Fortune 500 companies and startups alike spend more than $8 billion on anti-harassment and diversity training each year. Nevertheless, the above stats are not improving; in fact, at current rates, it will take over a century to achieve gender equality in the workplace. Lab studies show that today’s methods for diversity training can change a person’s attitude for only about 30 minutes and can actually activate a person’s bias. Harvard studies of decades’ worth of data back this up, showing that diversity training is largely ineffective and even counterproductive.

Corporate diversity programs are failing. Harassment training at work is not making an impact. Only 3% of Fortune 500 companies today disclose full diversity data, while 24% of employees say their superiors fail to challenge sexist language and behavior in the office. What to do?

Current methods

Most onsite sexual harassment training consists of a speaker, video and/or awkward roleplaying. There are also classroom-style slide presentations, seminars, written content, and online courses. In other words, traditional corporate harassment prevention training is pretty lackluster and unlikely to end a culture of enabling harassers and dismissing victims’ claims. It’s now standard for employers to offer anti-harassment and discrimination training, but bias training for hiring and performance reviews is less common. This is a serious weakness: employees who don’t understand their biases can’t tell when those biases are influencing critical business decisions.

A better way

Virtual reality is gaining traction in enterprise for job training, especially for industrial environments. Studies show that people are quicker to understand abstract concepts and retain information longer in immersive environments compared to traditional training methods. Used by professional sports players and manufacturing workers alike, VR can create muscle memory (ex. operating heavy machinery) and simulate an infinite number of real-world customer service scenarios (soft skills training), but can the technology change attitudes?

Stanford researchers have been studying the impact of VR on human behavior and the medium’s ability to inspire empathy. In a recent study, they found that VR is more effective than our imagination for combating inter-generational bias. Because VR requires less cognitive load yet feels real, it encouraged study subjects with negative group attitudes to adopt the point of view of the “other.” If VR can affect the cognitive behavior at the heart of real social issues, it could be a profound tool for changing workplace culture.

The first time you can actually walk in someone else’s shoes – real use cases of VR for anti-harassment and unconscious bias training

NFL

In 2016, the NFL turned to Stanford’s Virtual Human Interaction Lab in an effort to confront racism and sexism in the league, which struggles to retain women and minorities in leadership positions. The Lab had been developing scenarios designed to unsettle the user and engender empathy. The NFL wanted to use these scenarios with league staffers and players, to put them in the role of the victim. In one scenario or virtual simulation tested by the NFL, the user’s avatar was that of an African American woman being angrily harassed by a white avatar. When the user would reflexively lift his arms in self-defense, what he saw was his “own” black skin.

Equal Reality

In 2017, Equal Reality gained attention for its VR unconscious bias training. Unconscious bias is the most universal and stifling barrier to women’s progress in the workplace. Examples of unconscious bias towards women are reflected in findings such as:

  • Female employees negotiate as often as men but face pushback when they do
  • Female employees get less access to senior leaders and mentors
  • Female employees ask for feedback as often as men but are less likely to receive it than their male counterparts

Equal Reality develops virtual simulations, in this case workplace scenarios in which users interact, taking on multiple perspectives in order to learn to identify examples of pervasive bias as well as more subtle discriminatory behaviors. In 2018, realizing that paid actors and ordering a bunch of sailors to sit in a classroom and talk about behavior were doing nothing, the Royal Australian Navy adopted Equal Reality’s solution. Wearing a headset and holding two controllers, sailors are able to experience what it’s like to be in a wheelchair, treated differently and excluded from workplace conversation because of one’s disability.

Through My Eyes

In April of this year, BCT Partners and Red Fern Consulting announced a VR program called Through My Eyes, which trains employees to recognize unconscious bias through virtual scenarios. In one simulation, the user is a bystander, observing how bias plays out in different situations. In another, the user is one of the characters in the scene. Users’ choices and reactions in the virtual environment generate data, which is fed back to them and used to customize the training to each individual.

Vantage Point

Two-time survivor Morgan Mercer started the VR corporate training platform Vantage Point, which takes VR beyond simple roleplaying to illustrate the subtleties of sexual misconduct in the workplace. Like a Choose Your Own Adventure book, the user’s response to each situation in Vantage Point changes how the scenario plays out. The scenes involve a lot of grey area and are designed to teach both men and women communal accountability. In one simulation, the user is talking with four coworkers, one female and three male, about an upcoming conference in Las Vegas. Trying to discuss her presentation and noticeably uncomfortable as the men begin to engage in locker room banter, the woman is suddenly grabbed by her boss, who tells her to “pack something fitting.” Depending on how you, a witness, respond, the narrative either escalates or deescalates.

In another simulation of a colleague’s going-away party, a male coworker approaches the new female manager taking over the position. The user must grapple with what’s acceptable and what’s not, what’s a joke and what crosses the line, and when charisma becomes chauvinism. In the end, he or she must make a choice between speaking up or calling HR.

Vantage Point has three training modules: Bystander intervention, identification of sexual harassment, and responding to harassment when it happens to you. Last year, Tala (a fintech startup) and Justworks (the payroll platform) piloted the technology. In addition, Mercer draws on scientific research to develop best practice guidelines for the solution, which she hopes will become the standard for sexual harassment training. Though it’s too soon for any hard statistics, Vantage Point is receiving a lot of interest from investors and Fortune 500 companies alike.

Protecting workers

VR doesn’t tell you how to behave; it places you in the proverbial shoes of another, compelling you to empathize with that person because it feels like whatever is happening is happening to you. Doctors today are using VR to better understand the patient experience and improve their bedside manner. Further proof of the technology’s power is its use in PTSD treatment programs and transition programs for soon-to-be-released prisoners. In enterprise, anti-discrimination and harassment training doesn’t have to be a box checked off by HR; with VR, this training might actually end real-world harassment and boost company performance.

 

Image source: Equal Reality

 


Home on the VRange: Immersive Tech in Residential Real Estate

Today, the U.S. housing market is nearing all-time highs following a long recovery from the 2007-08 global financial crash, a crisis fueled (in part) by the collapse of the housing market itself. Even so, the traditional real estate market is challenged by a number of contemporary trends. Already enduring digital disruption via websites like Zillow and StreetEasy, the residential real estate sector must adapt, adopting emerging technologies to disrupt itself from within.

CURRENT TRENDS & PAIN POINTS IN RESIDENTIAL REAL ESTATE

Urbanization

In addition to limited space and gentrification – major trends jacking up costs in urban neighborhoods – there is an unprecedented demand for ‘single-dweller’ housing in cities due to more and more young professionals choosing to postpone family life for professional and social pursuits. Startups like WeLive (WeWork) and Dwell offer innovative real estate models that address the anxieties of urban living and new socioeconomic realities. A young professional who would have scrambled to find a roommate on Craigslist (an early real estate disruptor) now seeks affordable, flexible co-living solutions like Ollie’s co-living microsuites and micro-living building in NYC that have a built-in social network and great amenities. The future of cities will be small, smart living spaces.

New Consumer

Whereas baby boomers and Gen Xers desired to settle down, millennials – impacted by high student debt and the high cost of urban living – are largely single dwellers, less inclined to marry and start families. And while older generations saw homeownership as a source of wealth, their younger counterparts are less likely or able to buy housing. Meanwhile in the suburbs outside major cities, there is a glut of McMansions built in the lead up to the 2007 housing bust but a shortage of modest ‘starter homes’ for young families. Millennials currently represent the largest market to buy and rent homes, with Gen Z soon to follow. As real estate customers, these younger generations expect on-demand information, flexibility, market and price transparency, and ease of transactions. They also value green living and perceive properties with high quality visual presentation as higher value. These digital natives are also increasingly willing to make significant purchasing and renting decisions online.

Informational Parity

In the past, high fees for traditional agent/broker services could be justified because consumers depended upon qualified real estate professionals for access to residential listings. Now, the digitally engaged consumer has a wide range of resources for market information, including websites like Zillow, Realtor.com and Trulia and other platforms that have made market analysis available to the public. The average buyer or renter today can perform sophisticated searches and compare listings – a service that was once the exclusive domain of realtors – all for free on his or her own time. Though agents no longer have an informational advantage, their role is not obsolete – their specialized assistance is still desirable for negotiating and dealing with inspections, escrow, insurance, co-op boards, etc. With digital competitors firmly entrenched, traditional realtors need to focus on differentiating their services, capitalizing on the fact that although people are digital-first, real estate transactions will always be emotionally-driven, human decisions.

CURRENT STATE OF TECHNOLOGY IN REAL ESTATE

Residential real estate has gone through several waves of digital disruption, including the rise of online portals that have come to dominate the real estate search. Online platforms have also helped streamline many purchasing, rental and leasing processes, with some companies now offering fully integrated, end-to-end solutions for buying and selling homes and even generating mortgages. Evidence of the rise of ‘Proptech’ or ‘REtech’ can be seen in newly created CIO positions at real estate firms, while the technological readiness of homes is becoming a key selling point for consumers desiring smart and connected, energy-saving home solutions.

Despite the rise of the Internet and the importance of a home’s digital listing, staging a property – making it attractive to visiting buyers to boost its perceived valuation – remains key to a listing agent’s success in marketing and selling a home. This may include purchasing furniture for an empty space, repainting and refinishing, and/or rearranging items in an existing space, which takes time and money. Though good, old-fashioned yard signs remain a hallmark marketing tool for listing agents, emerging technologies are steadily creeping into residential realty. Common real estate marketing practices like distributing expensive paper brochures, staging properties, and even the construction of model homes will have to be reconsidered as new technologies emerge, offering potentially cheaper and more effective alternatives.

POTENTIAL FOR AR/VR IN RESIDENTIAL REAL ESTATE

Real estate professionals are finding creative ways to incorporate advancing technologies. The ability to remotely interact with clients, for example, is reducing the need for physical office space and travel, which in turn reduces overhead while permitting wider outreach. Over the last decade, ExP Real Estate has grown into a billion-dollar real estate brokerage in North America, all without housing its agents in offices. Instead, it has built a sprawling organization that meets for training and strategic planning only in the virtual world. ExP may be an outlier in using tech to eliminate the costs of office space, but it points to the powerful potential for emerging technologies to disrupt the real estate sector.

Robust digital strategies, including the effective application of augmented and virtual reality, will be key to realtors’ success in reaching clients with independent access to market intelligence. A real estate transaction is still a high-stress event; the financial stakes are high for both buyer and seller, but immersive technologies can help facilitate efficient communication among all parties and alleviate what is typically an emotionally-charged process. For smaller real estate organizations and independent brokers, understanding the potential of AR/VR will be just as critical as for larger firms if they are to keep pace with technology and compete.

APPLICATIONS FOR AR/VR IN REAL ESTATE

VR Tours

Today, the Internet is a person’s first stop in the search for housing, but it’s hard to make a listing stand out among hundreds or thousands of similar listings online. Listings with VR tours, however, can effectively showcase a property and help hasten a sale or rental without the need to go to dozens of open houses in the company of a realtor. The immersiveness of VR means users can freely explore a realistic rendering of a property from the comfort of home and make an informed offer. Homeowners anxious to close quickly and fetch the best price can have greater confidence in a listing agent who uses high-quality, interactive VR models to market their property.

Camera companies like Matterport and GeoCV make high-quality virtual mapping fast and accessible, producing virtual scale models of properties that can be toured wearing a VR headset or examined from an overhead ‘dollhouse’ view. Lower-quality VR models you can walk through can even be created from photos taken on a smartphone. Of course, for consumers who don’t own a VR headset, VR tours can be enabled for mobile or desktop and real estate agents are also equipping their offices with VR devices. For out-of-state or just very busy homebuyers unable to visit a property due to time or distance, VR allows them to visit and revisit a home from wherever, providing answers to the questions normally fielded by an agent on site. In this way, VR can accelerate real estate transactions.

Augmented Agent

An agent’s commission is a predetermined percentage that doesn’t account for the time it takes to close a sale, which means technological solutions that reduce routine informational queries and travel are worth exploring. In some cases, the agent hosting a property is there only to unlock the door and entertain browsing visitors who may not be serious buyers. Augmented and virtual reality are excellent technological stand-ins for a human agent seeking to maximize productivity.

Innovative rental companies like Tour24 take advantage of facial recognition technology to grant prospective renters access to apartments – via mobile app – without an agent or tenant present. A beacon-activated informational tour unfolds via smartphone as the potential renter moves through the property. Taken a step further, open houses might come with AR smart glasses used to scan QR codes and view heads-up commentary at various points of interest. In this way, an agent could accommodate a prospective client’s schedule, giving them secure access to the home, and customize the tour without having to personally attend. Even classic marketing practices like planting a realtor’s sign can be taken to the next level with AR: Compass Real Estate, for example, has rolled out beacon-enabled signs that flash at passersby. Similar AR-enabled signs scanned via smartphone could provide property information and statistics accompanied by a prompt to contact the agent.

Virtual Model Homes and Virtual Staging

It’s challenging to describe and sell a property that hasn’t yet been built. In most cases, a homebuyer or renter is in the market to purchase a vision for a property, so how that vision is presented is key. AR/VR technologies are powerful tools for bringing a future property to life, enabling tours of properties still under construction and making it possible for potential buyers to visualize spaces that do not yet exist. When marketing a home in progress, the immersiveness and detailed accuracy of a virtual reality model can supplement or entirely replace the usual promotional pamphlets and 2D or physical scale models. Brochures might be AR-enabled, while mixed reality could enable effective on-site tours, helping visitors see the potential of an unfinished, undecorated property and come to a decision before seeing the finished product.

When looking for a home, you have to imagine what it would be like to live in an unfamiliar space. While a good agent is able to anticipate what the client is looking for in a property, much of the decision making process comes down to the initial impression of an open house. Realtors often hire staging companies to bring in furniture and decorate homes before going to market. These companies generally stick to neutral decor, aiming to appeal to the greatest number of interested buyers. A couple with four children, however, seeks very different features in a home than a young bachelor or older couple with grown children. But what if you could provide a personally compelling visual narrative of the same space to individuals with varying tastes and requirements? Of course, you cannot physically rearrange a staged home for every potential buyer but with AR/VR you can help onlookers transcend a property’s current physical state, which might push them to make an offer. Staging can become a personal experience offering an array of design configurations depending on the client.

Imaging solutions from virtual staging startups like RoOomy, which counts Sotheby’s among its clients, overlay furnishings and interior designs into virtual models of empty properties captured via Matterport’s technology. An imaginative agent could put in a virtual jungle-gym or swimming pool, a pool table, built-in bar, home office, etc., customizing the virtual presentation down to the most minute details to be most effective. There are multiple benefits to virtual staging, including money, time and resources saved on temporary furnishings and meetings at the property itself, the ability to stage multiple interior design schemes and the opportunity to cross-market the services of partner businesses like interior designers and furniture manufacturers who might share the development costs of the virtual staging.

Conclusion

Digital platforms have produced an expectation of ease and access that has disrupted most corners of the real estate industry. A trend towards vertical integration of these platforms threatens to further encroach on the markets of traditional realtors. Real estate professionals must evaluate how emerging technologies like AR/VR can help them compete and create an irreplaceable role for themselves.

 


Alternative Enterprise Wearables: Vests, Visors and Hearables

What is the most successful piece of wearable technology in human history? Arguably, it’s the hearing aid. In fact, hearing aids might be considered the original hearables. Yes, I said hearables. Wearables are a broad category of devices – broader than you might think – encompassing not only smartwatches and smart eyewear but also embeddables, hearables and ingestables—any connected device that can be worn somewhere on or inside the body. We can extrapolate this to define an enterprise wearable as any electronic device that a worker wears (or ingests) to improve his or her performance and safety in some way. Then, there are items of clothing and gear equipped with today’s advanced sensors. In many industries, these wearables are a worker’s last line of defense against injury in the workplace. Read on for some alternative enterprise wearables – non-watch and non-eyeglass form factors – under development or currently available for enterprise:


Smart Suspenders and Other Accessories

Amazon has over 100,000 robots in its warehouses. Funnily enough, as efficient as these robots are at moving containers of items to help human pickers fulfill millions of online orders, they (or rather their on-board sensors) aren’t all that great at recognizing their human coworkers. In human-robot workplaces, most accidents occur during non-routine actions. At Amazon, robots operate within a designated area or enclosure, but if one breaks down or drops an item, a human employee must enter that space and that’s when a collision is most likely to occur.

Over 2018, Amazon introduced the Robotic Tech Vest (RTV) to more than 25 facilities. Though called a vest, the RTV is more like a utility belt with suspenders that sends a signal to the robots when a human is nearby. The RTV can actually signal the wearer’s presence from farther away than the point at which the robot’s built-in sensor tech can recognize a human being, adding an extra layer of safety to the robots’ ability to scan for obstacles. This also gives the robot more time to slow down and reroute so as to avoid a collision. Amazon has reported that in 2018 the RTVs alerted robots to avoid human workers over a million times.
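The interplay described above can be pictured with a short sketch. To be clear, the radii, speeds, and function names below are illustrative assumptions for this article, not Amazon’s actual implementation; the idea is simply that a robot scales its speed by the distance to the nearest broadcasting vest.

```python
import math

# Hypothetical proximity logic: vests broadcast wearer positions and a
# robot slows, then stops, as the nearest human gets closer. All values
# here are invented for illustration.
SLOW_RADIUS_M = 5.0   # begin decelerating inside this range
STOP_RADIUS_M = 2.0   # halt entirely inside this range
NORMAL_SPEED = 1.5    # cruising speed in m/s

def distance(a, b):
    """Euclidean distance between two (x, y) floor positions."""
    return math.hypot(a[0] - b[0], a[1] - b[1])

def target_speed(robot_pos, vest_positions):
    """Speed a robot should adopt given the vest signals it can hear."""
    if not vest_positions:
        return NORMAL_SPEED
    nearest = min(distance(robot_pos, v) for v in vest_positions)
    if nearest <= STOP_RADIUS_M:
        return 0.0
    if nearest <= SLOW_RADIUS_M:
        # scale speed linearly between the stop and slow radii
        frac = (nearest - STOP_RADIUS_M) / (SLOW_RADIUS_M - STOP_RADIUS_M)
        return NORMAL_SPEED * frac
    return NORMAL_SPEED

print(target_speed((0, 0), [(10, 0)]))  # no one nearby: full speed
print(target_speed((0, 0), [(1, 0)]))   # human inside stop radius: halt
```

The point of the vest, as Amazon describes it, is that this “I am here” signal reaches the robot well before the robot’s own obstacle sensors would spot a person.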

Other items of clothing and gear can be decked out with sensors to gather information, improve safety, and improve productivity across the workforce. There are heated jackets and cooling vests for extreme work environments, even self-charging work boots that track fatigue and provide lighting for jobs in low light. Such wearables could also be used for geofencing, alerting employees upon entering a restricted or unsafe zone. Earlier this month, Fraunhofer presented a prototype of another smart vest called the ErgoJack, a wearable soft robotics system with real-time motion detection and analysis. Designed for workers who lift heavy objects or spend long hours bent over a component, the ErgoJack can distinguish between ergonomic and unergonomic movements and alert the wearer in real time to prevent back pain and premature spine wear.
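The real-time check at the heart of a posture wearable like the ErgoJack might look something like the sketch below. The angle thresholds and sensor format are assumptions for illustration; Fraunhofer has not published the device’s actual classifier.

```python
# Hedged sketch of a posture check like the one a smart vest might run.
# Thresholds are illustrative, not the device's real calibration.
TRUNK_FLEXION_LIMIT_DEG = 60.0  # deep bending past this angle is flagged

def is_unergonomic(trunk_angle_deg, lifting_load_kg=0.0):
    """Flag deep trunk flexion, or moderate flexion while under load."""
    if trunk_angle_deg >= TRUNK_FLEXION_LIMIT_DEG:
        return True
    # bending while carrying weight is riskier, so use a stricter limit
    if lifting_load_kg > 10.0 and trunk_angle_deg >= 30.0:
        return True
    return False

def monitor(angle_stream):
    """Yield an alert for each (time, angle) reading that crosses a limit."""
    for t, angle in angle_stream:
        if is_unergonomic(angle):
            yield (t, f"unergonomic posture: {angle:.0f} deg trunk flexion")

# e.g. only the middle reading (a 72-degree bend) triggers an alert
alerts = list(monitor([(0, 15.0), (1, 72.0), (2, 20.0)]))
```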


On the High Seas 

Working (and vacationing) on the open ocean comes with risks, especially in treacherous conditions far from shore with limited visibility should someone get lost at sea. There have been a number of IoT projects and products aimed at improving safety at sea, including the EU project LYNCEUS2MARKET (L2M) and In:Range by ScanReach. Launched in 2015 by a team of cruise ship owners and operators, ship builders, maritime equipment manufacturers, industry associations, and tech companies, L2M came up with several wearable devices, including a life jacket that locates passengers in an emergency situation.

During a maritime emergency, there are often too few personnel available to assist. With In:Range, crew members on a vessel or offshore installation wear low-powered smart wristbands that tether users to sensors located throughout and outside the ship. This keeps the crew accounted for, allowing people from fleet management, coastal services, rescue departments, insurance companies, etc. to locate them in real time and, if necessary, intervene with a targeted rescue operation. In addition to real-time location, In:Range can also act as a safety alarm, means of area access control, and man-overboard device. To protect sailors’ privacy, the wearer’s location is not tracked until an alarm is triggered by motions indicating stress or by the wearer herself.


Personal Blinkers

As I sit here writing this, despite having classical music blasting in my noise cancelling headphones, I’m distracted by the numerous phone conversations taking place in my office and especially by one coworker who paces while on the phone (you know who you are). This is why I’m rooting for a recent prototype developed by Panasonic’s design studio Future Life Factory. Wear Space – a curved, flexible strip that wraps around the back of the head and extends like a shield for your peripheral vision – is designed to help people focus by limiting noise and other distractions in busy work spaces and open-plan offices. Essentially, these “wearable blinkers” block off the wearer from his immediate surroundings, providing instant personal space. Fitted with noise-cancelling headphones to block out ambient sound, the Wear Space can also be adjusted according to the user’s desired level of concentration. As open office plans grow in popularity and remote working becomes a norm, a device like Wear Space could do very well. Panasonic hopes the technology will be able to cut users’ horizontal field of view by around 60%.

Did you know that noise can harm you at work? According to OSHA, 22 million workers are exposed to potentially damaging noise on the job each year, which is why UK startup EAVE developed hearable tech to protect people’s hearing in the workplace. Consisting of a headset and a cloud-based noise monitoring platform, the technology not only protects the wearer from excessive noise in loud industrial environments but also gathers data about onsite noise levels, which is used to create a heat map of noise in the workplace. The system, launched earlier this year, is meant to prevent noise-induced hearing loss, tinnitus and other hearing-related conditions. In addition, it creates an audit trail for the organization in case of future occupational hearing loss claims.
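To put “potentially damaging noise” in concrete terms, OSHA’s daily noise dose can be computed from sound level and exposure time using its 90 dBA, 8-hour limit and 5 dB exchange rate. The sketch below implements that standard formula; it illustrates the kind of arithmetic a monitoring platform like EAVE’s might perform, not EAVE’s actual analytics.

```python
def allowed_hours(level_dba: float) -> float:
    """OSHA-permitted exposure time at a given sound level.

    Based on the 90 dBA / 8-hour limit with a 5 dB exchange rate:
    every 5 dB increase halves the allowed time."""
    return 8.0 / (2 ** ((level_dba - 90.0) / 5.0))


def noise_dose(exposures) -> float:
    """Percent noise dose from (level_dBA, hours) pairs.

    100% means the worker has reached the full permissible daily dose."""
    return 100.0 * sum(hours / allowed_hours(level) for level, hours in exposures)


# Example shift: 4 hours at 90 dBA plus 2 hours at 95 dBA
# 4/8 + 2/4 = 1.0, i.e. exactly 100% of the permissible daily dose
dose = noise_dose([(90.0, 4.0), (95.0, 2.0)])
```

A wearable that logs sound level over time has everything it needs to compute this running dose and warn the wearer before the daily limit is reached.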


In and Behind the Ear

According to IDC, the wearables category is expanding to include hearables, and the enterprise hearables market in particular is growing, with solutions aimed at offices and shops as well as more industrial environments. A recent Bloomberg Businessweek article titled “The Future of Wearable Tech is Called a Hearing Aid” is all about Livio AI, a new product from longtime hearing aid maker Starkey. Described as “a hearing aid for people who don’t need hearing aids,” Livio AI is a barely visible hearable that uses tiny sensors plus artificial intelligence (AI) to selectively filter noise, track various biometrics (steps now, with heart rate, blood pressure and other vitals to come), translate 27 languages near-instantaneously, and detect falls. With the accompanying Thrive app, Livio AI wearers can also choose to amplify specific sound sources (ex. a business colleague sitting across from you in a busy restaurant). Starkey is pitching the platform to doctors and patients at an expected price of around $2,500 to $3,000.

It’s not difficult to imagine how a discreet in-ear computing device could improve communication (enhanced listening, no language barriers) and safety (health tracking, equilibrium/fall detection) in the workplace. The ear is actually superior to the wrist as a location for sensors, which explains why a number of smart headphone and hearable startups have been popping up; but why should augmented hearing benefit only consumers and not workers as well? Besides outputting great-quality sound, hearables can filter out sounds, provide more accurate vital sign data (heart rate, body temp, pulse oximetry, etc.) and might be used for biometric identification in secure workplaces. In fact, NEC recently announced hearable technology that uses sound waves to identify someone based on the size and shape of that person’s ear. More invisible than a pair of smart glasses, hearables could also provide workers with instant, hands-free access to information via voice commands.

You may have heard of the SmartCap; well, startups Bodytrak and Canaria have developed smaller hearable devices that, like the SmartCap, monitor occupational fatigue. According to studies, workers suffering from fatigue are almost three times more likely to put themselves or a colleague in danger. Bodytrak’s non-invasive, in-ear device measures a worker’s core body temperature, heart rate (a great indicator of cognitive fatigue), VO2 and motion. This data is then sent to a cloud-based analytics platform that provides early warnings to at-risk workers via the hearable. Canaria’s technology is worn behind the ear, next to the skin. It monitors blood oxygen levels and heart rate, can detect harmful gases, and alerts wearers when it’s time to take a mandatory break. Both hearables might be used by workers in harsh, remote environments (ex. a building site in wintertime), factory employees working extended hours during peak season, laborers maneuvering heavy machinery, even nurses with back-to-back shifts.
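As a rough illustration of how an in-ear fatigue monitor might turn sensor readings into a break alert, here is a hypothetical threshold check. The thresholds and field names are invented for this sketch and are not Bodytrak’s or Canaria’s actual criteria.

```python
from dataclasses import dataclass

# Illustrative thresholds only -- not any vendor's real alert criteria
CORE_TEMP_MAX_C = 38.5      # elevated core body temperature
SPO2_MIN_PCT = 90.0         # low blood oxygen saturation
MAX_CONTINUOUS_HOURS = 4.0  # mandatory break interval


@dataclass
class Reading:
    core_temp_c: float
    spo2_pct: float
    hours_since_break: float


def needs_break(r: Reading) -> bool:
    """Flag the wearer for an alert if any vital crosses its threshold."""
    return (r.core_temp_c > CORE_TEMP_MAX_C
            or r.spo2_pct < SPO2_MIN_PCT
            or r.hours_since_break >= MAX_CONTINUOUS_HOURS)
```

In a real system the decision would come from cloud-side analytics over a stream of readings rather than a single snapshot, but the basic shape – vitals in, alert out via the hearable – is the same.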

 

Image source: Panasonic

The Enterprise Wearable Technology Summit (EWTS) is an annual conference dedicated to the use of wearable technology for business and industrial applications. As the leading event for enterprise wearables, EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. The 6th annual EWTS will be held September 17-19, 2019 in Dallas, TX. More details, including agenda and early confirmed speakers, to come on the conference website.

AR/VR Innovation at Nissan, Adidas, ADT and More

Emerging technologies are taking root across industries. Learn how a wide variety of enterprises are applying new technologies in this summary of the most recent use cases of AR/VR and wearables:

Fast and Secure Customer Service via AR

Customer support is a key consideration for companies purchasing expensive, mission-critical equipment. When an urgent repair is needed, inefficient customer support practices can unnecessarily prolong costly disruptions to operations. Swiss machinery manufacturer Bobst understands that continuous improvement of its customer service practices is important to guarantee the integrity of its products and earn customers’ loyalty, which is why the company recently deployed the Helpline Plus AR system. The system was intended to boost Bobst’s capacity to respond to customer requests quickly and efficiently, and indeed it has improved the performance of Bobst’s help desk technicians.

Augmented reality (AR) gives Bobst’s technical experts the ability to remotely diagnose and remedy a customer’s problem from anywhere in the world. AR headsets deliver two-way video and audio over a secure WiFi connection for real-time, visual remote guidance. With the customer wearing an AR headset, a support-center technician can inspect the machine in question and give easy-to-follow troubleshooting and repair instructions. The ability to observe remotely and instantly prevents mistakes and confusion, limits downtime for the customer, generates savings for both vendor and customer, and multiplies the value of Bobst’s well-trained techs. Already boasting a strong customer support system, Bobst now sees AR-enabled, see-what-I-see communication as a powerful service tool that merits a worldwide rollout.


Continual Innovation on the Assembly Line at Boeing

Building a plane is a massive undertaking. Production efficiency is a top priority, and the scale and complexity of the plane manufacturing process amplify the consequences of even a tiny mistake. Boeing has teams that evaluate every minutia of the production process for possible optimization. For example, the company is set for a company-wide deployment of a Bluetooth-enabled smart wrench that measures the torque applied to a nut. The introduction of self-driving work platforms on the assembly line promises to cut time lost on the line, improving monthly production of 787 Dreamliners from 12 to 14. That one piece of technology could produce such a boost in output is remarkable, but its impact is only possible in combination with the other innovations Boeing has regularly introduced. Workers on the platforms can now work seamlessly without the interruption of a forklift moving a workstation’s scaffolding, which saves time and reduces the risk of accidents. Many of Boeing’s assembly line workers wear industrial exoskeletons to greatly reduce the strain of repetitive movements, in addition to using connected tools like the smart wrench and AR glasses for workflow support.

Boeing’s innovative solutions are created by multidisciplinary teams of Boeing engineers who operate in small ‘innovation cells’ within factories, where they use virtual reality to test their ideas. A recent breakthrough in one cell led to a 3D-printed, curved ruler that cut the time needed to perform specialized inspection tasks inside a plane’s cabin by over five hours. The greater precision achieved by transforming existing processes with emerging technologies can also reduce the need for some inspections altogether. Industry leaders like Boeing continue to impress with a steady stream of effective applications for emerging technologies on some of the most sophisticated production lines in the world.

Hear more about Boeing’s use of emerging tech from Christopher Reid, Brian Laughlin, and Connie Miller at EWTS 2019 this September in Dallas.


VR Helps Adidas Corporate Teams Find Their Stride

In today’s corporate world, departmental silos create gaps in communication, leaving key decision makers to operate with limited information. Visibility and accessibility across departments and disciplines are critical to effective communication and collaboration, and Adidas identified just such a gap in its own process for bringing new shoes to market.

Adidas’ answer for getting teams on the same page to deliver a shoe from design stage to retail? Virtual reality. The retailer uses software from The Wild to model products, build virtual marketing campaigns, and showcase new shoe designs. Holding meetings in HTC VIVE VR headsets allows cross-departmental decision makers to better communicate ideas and demonstrate designs. VR makes inherently spatial design concepts clearer and provides greater transparency into a project overall; it puts stakeholders with varying expertise, from offices that usually have little contact with one another, on the same page, reducing the back and forth that can stifle global collaboration. Having VR models of new designs readily available for scrutiny means flaws can be identified and remedied before a product enters the costly production phase, ultimately speeding delivery to market. In addition, other areas of Adidas’ business can use the shared 3D library to visualize and iterate products and marketing strategies in virtual retail spaces modeled on the company’s real stores.

Hear more about Adidas’ use of emerging tech from Brooks Clemens at EWTS 2019.


VR Marketing: ADT’s Alarming Simulation Gets in Customers’ Heads

Safe at home on your own? ADT’s latest marketing campaign, developed in collaboration with Harte Hanks, brings the danger right into your bedroom. For the campaign, ADT shipped makeshift VR headsets to select households, with which consumers could view an immersive YouTube simulation of a house fire and ADT’s coordinated response with the local fire department. The virtual experience drops you in the middle of a crisis in motion, simulating the disorientation of waking up in a dark, smoky room as a fire rages through the home. ADT’s campaign proved accessible, educational and engaging, a powerful emotional trigger for building brand awareness.

Marketing is an excellent space for experimentation and innovation with AR/VR, and campaigns similar to ADT’s can be conducted on a wider scale and at a lower cost in the future once VR headsets become a common household item. Enterprise applications make practicality a priority, but in marketing the incentive is to creatively connect with consumers and make a strong impression by whatever means is most effective. Innovative marketing teams will continue to toy with VR to produce novel, visceral experiences that enable brands to connect with customers.


Haptics for Better Handling

New car designs usually begin with 2D paper models; when a design is selected to advance to production, a 3D clay model is created to get a sense of the design at scale and refine it. Expensive, inflexible and labor-intensive, clay modeling has been standard auto industry practice for more than half a century. Now VR is being widely adopted in car design, enabling designers to review the interior and exterior details of a 3D vehicle model and identify any necessary changes to the CAD model before a physical prototype is created.

VR, however, can fall short of the interactivity of sculpting a clay model, which is why Nissan recently deployed HaptX’s VR gloves. Merging the virtual world with sensory reality, the gloves deliver haptic feedback to the wearer, creating the sensation of physically shaping a car model with one’s hands. Designers wearing the gloves can feel the contours of the vehicle surface, manipulate console buttons and dials, and even grip the virtual steering wheel and drive the car. Though HaptX’s tech is currently limited (ex. you cannot distinguish textures or feel the subtle action of gears and switches), Nissan’s use of it marks an important step toward more practical applications of VR.


Compressing the Sales Process

Swedish machinery manufacturer Atlas Copco’s AIRNET line is a range of high-quality piping and compressor equipment sold as part of complete, integrated solutions for compressed air infrastructure. Atlas’ global distribution sales team markets the company’s integrated compressed air systems using components from the AIRNET range, but selling such a complex system can be slow and ineffective if the client cannot clearly visualize the system’s functional layout or its ease of installation, operation and maintenance.

To improve the overall sales process and experience, Atlas Copco adopted Eon Reality’s 3D modeling and VR solutions. Atlas’ salespeople now have access to a full virtual range of AIRNET SKUs to present clients with tailored compressed air infrastructure solutions. Using Eon’s tools, salespeople can create and adjust plans to a client’s wishes without any particular technical expertise. The ability to demonstrate and swap AIRNET components in a virtual model eliminates the need to carry samples (there are over 1,000 AIRNET SKUs!), and complete quotes can be calculated quickly, accompanied by a functional simulation of the system and a bill of materials that updates with each design iteration. Installers and technicians also get access to cloud-based VR installation guidance. Using VR, Atlas Copco’s sales team is able to shorten the sales cycle and better engage clients while assuring superior VR-enhanced follow-up support.


VR for Public Outreach: Clearing the Air About Petrochemical Operations

The towering smokestacks of a chemical plant or refinery can be an ominous sight. Public misconceptions and mistrust pose a serious challenge to companies whose operations often only reach the public consciousness via news of industrial accidents and disasters. This has resulted in an unsympathetic industry image – one of pollution as opposed to cutting-edge tech and critical production – which, in turn, affects recruitment of new generations of talent.

Industry association American Fuel and Petrochemical Manufacturers (AFPM) has pushed companies like Marathon Petroleum and Ineos to improve public relations and boost recruiting by using VR to ‘open’ their plant operations to the general public. VR tours of Marathon’s Galveston Bay refinery and Ineos’ La Porte chemical plant, made for Oculus as well as the cheaper, more accessible smartphone-enabled Google Cardboard, aim to demystify the industry for consumers. Viewers virtually meet Marathon and Ineos employees, the idea being to dispel doubts about oil and gas operations and inspire students to pursue careers in the field. VR permits a higher level of engagement with and outreach to the public than was previously possible for the likes of Marathon and Ineos.

 