Using XR to See Underground: Interview with Arcadis’ Allison Yanites

Before EWTS 2019 went down last month, I had the chance to interview one of the event’s thought leaders. Check out our interview with Allison Yanites, Immersive Technology Lead at Arcadis, the global natural and built asset consulting firm.

Emily (BrainXchange): To begin, could you provide our readers with a little background on yourself and what you do at Arcadis? Also, when did you first encounter AR/VR?

Allison: I am the Immersive Technology Lead at Arcadis North America. I am currently working to find different ways that augmented reality, virtual reality and other related technologies can improve customer experience, health and safety, and quality of life. Before this role at Arcadis, I worked as a geologist on environmental remediation projects: understanding subsurface conditions such as layers of soil and rock, whether any groundwater or soil contamination is present, and whether impacts are static or still moving below ground. A big piece of that work was creating 3D visualizations of subsurface data to help our clients and stakeholders better understand the full picture of what is happening below ground and help determine the next steps to clean up any contamination.

A few years ago, our team developed a mixed reality visualization of one of these environmental sites, where our stakeholders could see and interact with a holographic image of the groundwater contamination of the site. That was my first real experience with immersive technology as an industry application, and it was a game-changer for me. Working with our digital team at Arcadis, I wanted to look beyond just holographic visualizations of environmental models and see how much we can do with AR/VR across all of the types of programs Arcadis is involved with: how we can use immersive and 360-degree technology for design, engineering, project and management services across all markets.

E: So, you really start at the beginning of a project, with touring a site? 

A: It depends. On some projects, a lot of data has already been collected, such as sites that have been monitored for decades; on other projects we are collecting data in an area for the first time. Either way, we are taking a large collection of data and trying to understand the complex geological and chemical patterns underground, and ultimately, determine the best ways to remove chemical impacts at the site.

E: Can you speak a little more about Arcadis’ business and its customers (who they are)?

A: Arcadis is a natural and built asset consulting firm. We work in partnership with our clients to deliver sustainable outcomes throughout the lifecycle of their natural and built assets. We have 27,000 people in over 70 countries with expertise in design, consulting, engineering, project and management services, and we work with a wide range of markets and industries, including oil and gas, manufacturing, transportation, infrastructure and municipal water.

At Arcadis, our mission is to improve quality of life in the communities we serve. Whether that is by ensuring the environmental health of communities or reducing the amount of time people spend in traffic, we develop our solutions with our client’s end-users in mind. To design the most impactful solutions, Arcadis has committed to digitally transforming our business at every level of our organization. That includes training our staff on new digital capabilities, using cutting-edge technologies and then applying our subject matter expertise. We then use these tools and skills to better understand, and address, our client’s needs.

E: How is Arcadis using augmented and virtual reality? What pain points does AR/VR address?

A: Arcadis is using augmented and virtual reality in different ways across a variety of projects. Our immersive technology practice includes on-site visualization with different types of headsets, 360-degree photos, video and virtual tours, and remote assistance with AR capabilities. Generally, immersive technology is addressing four main pain points. The first is increased safety — for example, we can share access to difficult-to-reach sites with 360-degree imagery or livestream video, and bring additional staff or clients to the site virtually. Ultimately, we must keep people safe while still collecting as much data as possible. The second is speed of decision making — for example, using AR to overlay a 3D design over an active construction site helps quickly identify any differences between the plan and the current project status. The third is cost reduction — for example, we can now virtually connect project teams and clients to remote sites. This reduces travel and helps reduce the costs associated with delayed communication or unplanned rework. And the fourth is enhancing stakeholder communication and collaboration — for example, virtual 360-degree site tours and remote assistance are virtually bringing staff, clients, and stakeholders to the site where they can participate in discussions about site conditions or questions on certain issues. AR/VR visualizations also greatly improve our communication of design plans or subsurface data visualization.

E: I imagine there are a lot of new demands for the built environment, especially with climate change. Do you think that AR/VR are unleashing more creativity, enabling designers to do things they’ve never done before?

A: Absolutely. There is a lot of power in using AR/VR to understand how the environment is changing, and how to prepare communities and businesses accordingly. AR and VR visualizations can communicate designs to stakeholders that address sustainability needs or flood and storm resilience. AR/VR technology also gives designers the flexibility to share their designs with stakeholders more clearly and effectively, with a greater level of detail, than ever before. When you use AR/VR to see first-hand how a flood level impacts homes and businesses, it takes on a greater urgency than it may have before. We are also using AR/VR technology for training situations, and many training scenarios are relevant to our changing environment and being prepared for the future.

E: How have customers received the technology? Was it easy for them to use? Have any clients adopted AR/VR for themselves?

A: We have had success applying immersive technology services, and it’s exciting to see this technology expand and scale in our industry. At the same time, we are continually working to apply the right technologies for the right projects, and find new ways to solve problems for clients. These technologies are a moving target; they evolve so quickly. It seems like every few weeks there is a new product, software/hardware capability, or integration that opens new opportunities for how AR/VR can be applied. In addition to gaining traction and adoption with the services and capabilities we have established, we are constantly evaluating how we can solve emerging client challenges with new and immersive technology.

E: What was piloting like? Was there an actual test period and were there any major challenges to using the technology at Arcadis?

A: Several years ago, we started with a few different pilots and tested different AR glasses, VR headsets, 360-degree cameras and various software programs to develop content. Each of the solutions or services that we have explored has been rigorously tested, and if appropriate is then developed internally or in partnership with our clients. We are still doing pilots because the space is evolving. With one particular workflow there might be an update in either the hardware or the software that offers a new opportunity, so we’ll go in and test that. The pilots are really tied to the problems we can solve and the solutions we can bring to our clients, working with them to customize what we do with these different tools.

E: Where does the content come from?

A: So far, we have developed everything on our own. We use plugins and software to create content, but the content is coming from our own project locations and 3D designs, like wastewater treatment plant designs, environmental remediation sites or highway infrastructure designs. We already work in those spaces so we have the data sets, which we can use to create the AR/VR visualizations. Through our FieldNow™ program, we have also committed to collecting data 100 percent digitally, which means we can now apply this technology to more projects than ever before.

E: How do you measure the ROI of AR/VR at Arcadis?

A: ROI varies from project to project, but it generally comes back to the four KPIs: increased safety, speed of decision making, cost reduction, and enhanced stakeholder communication and collaboration.

E: How has Arcadis handled the security part of adoption?

A: Arcadis takes data security very seriously. Our group works with our IT department to thoroughly vet each technology against industry security standards. Additionally, our use of each of these technologies is also typically evaluated by our clients to make sure it is compliant with their security protocols. Security is always a leading factor in any new technology we adopt.

E: Are there any applications you’re hoping to test in the future at Arcadis?

A: We are constantly evaluating what we can do to exceed our client’s changing expectations. As new applications and technologies become more accessible, we want to make sure we are equipped to address both traditional and emerging client challenges.

Beyond finding new ways to integrate software platforms, we are starting to leverage the internet of things and wearable technologies more frequently. As a large company that is involved in many different industries, Arcadis uses a lot of different software programs. For each software program (3D design visualization, data analytics, program management system, etc.), we develop unique workflows to create AR/VR and 360 visualizations and/or integrate with a program management system. We are always looking for new software products or software updates that make it easier to integrate AR/VR into our daily routines.

E: With sensors in the environment and wearables, I assume you’re gathering new kinds of information for these models?

A: Absolutely. We are using sensor data, which provides real-time results that can be fed into our data analytics platforms and visualized in different ways. We are also excited about platforms that can house data and be updated in a seamless way, so a whole project team across the globe has access to one central data set.

E: What are your greatest hopes for this technology?

A: As immersive technology becomes more mainstream and awareness keeps spreading about its value for industry, it is exciting to see how many ways immersive technology is adopted and applied. This technology is still so new, I am excited to follow its evolution and see what will be possible in five, 10 and even 30 years. My hope is that as the technology starts to deliver more and more value to businesses, we also see increasingly creative ways to improve quality of life in communities around the world.

Building the Future of Exoskeletons: Meet Dr. William G. Billotte

He’s working with BMW, Boeing and others to introduce standards and raise the bar in the exoskeleton market: Meet Dr. William G. Billotte, Physical Scientist at the National Institute of Standards and Technology (NIST) and Vice Chairman of the ASTM F48 Exoskeleton and Exosuit committee. I got to interview Dr. Billotte on the importance of standards and the fundamental work of NIST. Read our conversation below (full bio at the end).

Emily: To begin, could you provide a little background on yourself and NIST? When did you first start working on exoskeleton tech?

W: My background is that I’m an engineer and a biologist, with bachelor’s and master’s degrees in engineering and a PhD in biology, and I’ve been working in the biology/engineering area for probably 17 or so years, providing scientific and technical guidance to different federal agencies, first responders and other organizations. I’ve worked in a number of different areas: biological detection, first responder equipment, critical infrastructure protection, etc. I’ve been in the exoskeleton area since around 2014. I work for a federal organization, the National Institute of Standards and Technology (NIST), part of the Department of Commerce (see here for some history). I’ve been here since 2009 as a physical scientist.

I worked for the Department of Defense before I came to NIST, and I was a consultant here in the D.C. area before that as a bioscience advisor, ever since 2002.

E: What is ASTM? How did it form and who is involved?

W: ASTM is an international standards development organization and there are a bunch of standards development organizations. It’s a non-profit. NIST works with a number of similar organizations across the world. ASTM is where we set up the F48 Committee on Exoskeletons and Exosuits around 2017. We talked to a number of different standards development organizations and it seemed like the best fit was with ASTM. I’m the Vice-chairman on the F48 Committee but I’m not an employee for ASTM; it’s a volunteer-type thing. Everybody has their day job and does standards also.

Here is a link to a recent paper describing the development of ASTM F48.

For your reference, there is legislation that encourages federal agencies to use and participate in voluntary consensus standards [National Technology Transfer and Advancement Act (NTTAA), Public Law 104‐113.]

E: Are the companies actually building and using exoskeletons a part of F48?

W: We’ve got around 130 members. Anyone can join. We have meetings about twice a year face-to-face and then meetings all year long sort of how we are now. We’re trying to get standards out there that meet the needs of industry. That’s how standards work in the U.S.; they come from the ground up. If you look on your computer, the USB port is just one example of the many standards that people use every day and rely upon. Similarly, we want standards so that exoskeletons can be tested and manufacturers can easily demonstrate to their users that they’re safe and reliable. We want some guidance out there, like we just passed one standard for labeling exoskeletons. How do you put labels on these and give the user or buyer some information? That was a standard to help the manufacturers label their products and provide the right info for the user—very basic stuff right now. We’re still at the beginning stages of getting standards out there for exoskeletons. It’s an exciting time because there’s a lot to do.

E: What is the exoskeleton market like today? 

W: Bobby Marinov, who is also on the F48 committee and runs a website called the Exoskeleton Report, has written a number of articles about this on his site, in Forbes and other places. He has a good snapshot of the market, which is this: In the past two years, you’ve gone from 20 or so exoskeletons being used in the automotive industry in a few places in the U.S. to almost 1,000 worldwide, and that’s just the auto industry and mainly on the assembly line. Chris Reid’s team at Boeing has done a tremendous amount of work in this area, too; Chris is actually the leader of one of our subcommittees and he’s very involved in the ergonomics community. We’re having a face-to-face meeting at the Human Factors and Ergonomics Society in Seattle in the Fall.

E: What counts as an exoskeleton?

W: You’ve gotten to the hard question here. We struggled for at least two years, even before the committee was set up, working on how to define the term exoskeleton and how an exoskeleton is any different from a smartwatch or smart clothing. Why is a smartwatch not an exoskeleton? It augments you, gives you different capabilities, you wear it… We had lots of discussions like that. When I see one, I know what it is, but how do we define it? That’s how we got to a definition: a wearable device that augments you physically through mechanical interaction with the body. (The ASTM standard definition of an exoskeleton is “wearable device that augments, enables, assists, and/or enhances physical activity through mechanical interaction with the body.”)

We’re not trying to exclude anything. For example, there is an exoskeleton in the consumer market that helps you ski, but there aren’t a lot of products in the consumer space (that’s the only one I’m aware of). And we don’t use the word “partial”; we just say it’s an exoskeleton that happens to be for the upper body, like those that help with overhead work, because we’re not thinking about it as a giant Iron Man suit. There’s another one, a glove that assists you in grabbing—that’s an exoskeleton.

E: What are the top 3 industrial sectors where exoskeletons stand to have the greatest impact?

W: The big three groups are industrial, medical and military. I think these are three areas where exoskeletons are going to move forward the fastest. From what I’ve seen so far, there has been a big drive in the manufacturing sector like automotive, airplane manufacturing, those types of environments. There are some possibilities in the construction industry, but it hasn’t gone as far as we’ve seen in automotive. Another great possibility is the agricultural sector. Think of anything that involves hard physical labor, a task where you have to lift something, or where there is an awkward static posture; those give you a lot of opportunities. Really, the value of exoskeletons comes down to economics: Work-related injuries, musculoskeletal disorders, overexertion—these cost billions of dollars every year. It’s really easy to justify, which is why big and small companies are looking at this. It keeps workers safe and on the job, reduces the risk of injuries, and workers can do higher quality work for a longer period of time. That’s the potential. Do we have concrete evidence for every exoskeleton? No, we need to do a lot more studies, especially longitudinal studies, but there are enough studies out there that you can see the potential.

E: Where are you right now with exoskeleton standards and why are standards so important?

W: Standards are so important to organizations and countries because they help shape a marketplace so that you can have reliable products, safe products, and the ability to sell in a fair-trade type situation on a worldwide scale.

E: Are there any studies to back up the value of exoskeletons in industrial workplaces? How do you test the devices? 

W: NIST is a metrology institute. We do research on how to measure things and help set the measures used by everyone in the U.S. We compare those measures to other institutes around the world. NIST is developing test methods, and so, yes, we are doing some testing. But we’re not testing the exoskeletons per se; we’re doing testing with the exoskeletons to figure out how we can test all of them. We’ve gotten a few exoskeletons, developed some load-handling tasks, and run a number of test subjects through to test the test method, and that is being documented. That test method will then go into our ASTM F48 committee to get massaged some, and at some point it will get voted on and hopefully become a standard.

E: The exoskeletons that companies can buy today haven’t gone through this testing. Is it kind of like the Wild West right now?

W: It’s not exactly the Wild West. There has been a lot of testing, but everyone has done their own testing. That is the power of developing a standard test method: then you can compare devices. Chris Reid at Boeing has tested a lot of exoskeletons, but I can’t take his data and compare it to the data from Ford; I don’t know what tools and metrics they used. That’s why we need a repeatable standard method, so any lab can use that test method and everyone can trust the results. This will lead to a standards-based certification process, which helps manufacturers show the basic performance and safety of their systems and spares end users the burden of assessing each system beyond company-specific applicability.

E: So, the market is kind of regulating itself right now?

W: Well, it’s like any nascent market. The only place you have regulations right now is the medical exoskeleton market, because the FDA in the U.S. regulates all medical products and there are a number of medical exoskeletons certified by the FDA that are used mainly in clinics. But it’s different from what you see on an automotive line in that usually the operator of the exoskeleton isn’t the person wearing the device; it’s the nurse or therapist. Think about someone learning how to walk again after a stroke. With exoskeletons, you can give patients “higher doses” of walking in a session with a therapist, speeding up the recovery process.

E: Would the ASTM work directly with the regulatory bodies in different industries?

W: We’re hoping the standards that we develop through F48 will be referenced by regulatory bodies, even the FDA. There may not be any regulation in the industrial market.

Here is a link to the NIOSH Center for Occupational Robotics Research. They look at exoskeletons also, and their research would feed into any industrial-focused standards or regulations on exoskeletons.

E: One of my “pet areas of research” is women in the workforce. Do exoskeletons have the potential to enable more women to work in industrial sectors and is there any testing being done on the female body, which is very different from the male body (height, breasts/hips, even spinal cords)?

W: In the testing we’re doing right now at NIST, we’re using men and women. But we don’t see any exoskeletons out there that can make someone stronger than they are right now. If the job requirement is to lift 100 pounds and you can only lift 25 pounds, the devices I am familiar with won’t lift the weight for you. An exoskeleton would help the person lift the weight more safely and with more repetition. Some may advertise additional weightlifting capability, but as far as the testing I have seen, there isn’t anything that can augment your strength like that. But that’s not really the issue. It’s fit. We’ve been dealing with this issue for a long time, especially in law enforcement. Body armor was developed for a male physique and slightly modified for females, and it doesn’t work very well. We’ve been working for years trying to fix that. I hope the exoskeleton community designs for females from the beginning, rather than designing a male-fitting exoskeleton and then slightly modifying it for females. There will be exoskeletons that fit better for men, ones designed for the female body, and even ones that can be easily modified for any wearer.

E: What do you hope to accomplish in 2020 and when do you think exoskeletons will become standard in industrial environments?

W: I think exoskeletons are well on their way to becoming common in the workplace. Seeing how they’re rolling out in manufacturing, aerospace, automotive, and similar sectors, I think they will be even more common in 2020. I’m somewhat biased, but I want to see more standards so that everyone can have an increased sense of reliability and safety with these exoskeletons. Standards will also help stimulate the market.



Dr. William Billotte currently serves as a physical scientist at the National Institute of Standards and Technology (NIST).  In that position, he helps industrial, military, medical, and public safety communities with their national and homeland security standards and technology needs.  Current activities include serving as a principal scientific advisor to Army for exoskeleton standards and technology issues and serving as the vice chairman of the ASTM F48 Exoskeleton and Exosuit committee.  Prior to joining NIST, Dr. Billotte was a CBRNE scientist for the Naval Information Warfare Systems Command (NAVWAR).  For NAVWAR, he managed programs to test, evaluate, acquire and share information on CBRNE detection and responder technologies. This included supporting the National Geospatial-Intelligence Agency’s New Campus East construction, the FEMA CEDAP (Commercial Equipment Direct Assistance Program), the FEMA Responder Knowledge Base (RKB) and the DHS SAVER program.  Prior to joining NAVWAR, Dr. Billotte served as a bioscience advisor for Booz Allen Hamilton where he supported DoD, DARPA, the Intelligence Community, and DHS in the biotechnology, chemical/biological defense and responder technology areas.

Dr. Billotte holds a Ph.D. in Biology from the University of Dayton, a Master of Science in Engineering from Wright State University, and a Bachelor of Mechanical Engineering from The Georgia Institute of Technology.



Wearables in Risk Management: Interview with AIG’s Ron Bellows

I got to sit down and talk with Ron Bellows, Risk Strategist at AIG. What resulted is a fascinating, if long (it’s worth it), read and a wealth of information. Ron will be speaking at EWTS 2019.

E: To begin, could you provide our readers with a little background on yourself and what you did at AIG? Also, when did you first encounter wearable technology?

R: I’ve been a risk management consultant with multiple insurance companies since 1980. I started playing with wearables probably as early as 1985/86. You may remember Cybermatics: Surface EMG measuring technology was connected to cyber gym equipment for rehab and prevention, so that when someone was working out – in the sports or medical world – you could actually see what the muscles were doing with the surface electromyography. It’s kind of like an EKG. Your heart is a muscle; surface EMG looks at other muscles.

Around ’86, I was working with a physical therapist doing studies on sports medicine and rehabilitation that might have application in the industrial environment. Many workers’ compensation injuries are expensive strain and sprain injuries impacting the musculoskeletal system. Biosensors, from a rehab standpoint, help us manage treatment for someone who has had a musculoskeletal injury. It began in the sports world and medicine, and around 2000 it started to become more pervasive in the industrial environment.

If you think about an athlete who over-trains, the same thing can happen in the industrial world. Biosensors can measure posture, force, repetition, etc., and be used to look at someone in the workplace from a preventative standpoint as well as on a pre-hiring/screening basis (i.e., can you handle the physical requirements of the job?). If you took a traditional physical, you might pass, but could you go in and work on a factory floor or warehouse for 8-10 hours a day, 6 days a week? The first value of biosensors is to better evaluate somebody before they go on the job and help assess their ability. The second is to evaluate somebody on the job to document the exposure they face due to fatigue, endurance, force, repetition and posture—the things that generally lead to ergonomic/biomechanical injuries. If you can detail that exposure prior to injury, you can do a better job with prevention, training and hiring. However, if somebody does get hurt, you can use those same biosensors to help assess exactly where and how badly they are injured, the best treatment options, and whether they truly are OK to go back to work again. Those are the three main areas where wearables fit into the industrial arena and workers’ compensation (WC).

E: What exactly do you do at AIG?

R: I’ve consulted with large multinational customers to help them find solutions to their risk management issues. Often, they were most interested in workers’ comp risk because it tends to drive loss frequency and severity, impacts the workforce and absenteeism, and reduces efficiency and profitability. Workers tend to be 30-50% of a company’s operating expense, so if you can reduce injuries you can increase efficiency, profitability, etc. Today with the shortage of workers that we see, a lot of companies are working at a 20% absenteeism rate. Imagine what happens when you can’t find enough people to man the tasks in a factory. If you also have extensive injuries that put people out of work or on restrictive duty, it’s even more difficult to run the business. Making sure people can work safely and come back to the job every day is very important to risk managers. I also help risk managers with issues like fleet, liability, supply chain, business continuity, and disaster recovery—anything that keeps them up at night.

E: You just mentioned a bunch of pain points like the shortage of workers. What are the challenges and pain points for AIG’s clients that are driving interest in wearable technologies?

R: There are really two things: One is traditional safety, making sure we document exposure properly so that we can prevent injuries and do better training. It’s not just job hazard analysis but also the workers’ comp system itself, which is very difficult to manage, as the venues are all different and every state has different rules. If we can document exposure, we can better manage an individual pre- and post-loss. Many times, we see that older, high-tenure workers are driving losses. We’re seeing the average age of workers going up, especially in manufacturing, warehousing, trucking, etc., where you have extensive injuries to the shoulder and back. Those injuries are the most difficult to diagnose, treat, and return to work from. If you’re older and you get hurt, it may take you weeks to get back to where you were pre-loss. Our average workforce is in the 40- to 50-year range, so when they have an injury it’s impacted by comorbidity – hypertension, diabetes, obesity – making it more difficult for them to get back to pre-injury status.

Second, many companies today are looking at exoskeletons or other interventions to reduce exposure. When you put an intervention in place, you don’t know for sure how much of an impact it’s having on the individual, because everyone is different. With biosensors, we can measure the impact of different interventions and see which ones are having the greatest effect on the worker based on their exposure. For example, I would use different exoskeletons for the upper extremities versus the back versus the legs. So, it depends on what kind of difficulties workers are having in the workplace. For example, if I have to do static standing all day on a conveyor line, exoskeletons may not be valuable, but the biosensors can tell me what’s going on with the static stress on the lower extremities, which impacts the entire body. I can then look for alternatives like automatic massage therapy, continuous stretching, or compression sleeves to improve endurance and reduce fatigue where the exoskeletons don’t work.

E: What kinds of wearable technologies are you looking at for risk mitigation? Have any solutions made it past the pilot phase to rollout?

R: There are a lot. The biosensor marketplace has exploded in the last several years. We can use biosensors like we’ve talked about from a musculoskeletal standpoint and that’s where most of the impact is seen. But you can also use biosensors to look at an environment: A construction worker going into a pit that may be lacking oxygen can use a biosensor attached to an iPhone that sends a safety signal. You can use a posture monitor for the back like I did with the Visiting Nurse Association. Nurses visiting patients by themselves can be attacked, chased by dogs, fall down stairs, etc. Having an inclinometer or GPS monitor can send an automatic ‘man down’ signal if they’re horizontal. If they can’t push a panic button, their supervisor and local authorities can be alerted to the fact that something is wrong. That’s just one example. Biosensors in chemical plants can look at oxygen-deficient environments and exposure to chemicals and send an alert right away to the individual or supervisor. So, if you’re working remotely in a plant and there’s an ammonia tank with a small leak, the biosensor can alert you to very low levels before you’re overcome. There are so many different ways to use biosensors to alert you to exposure before it creates injury.
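
As a concrete illustration of the “man down” logic Ron describes, here is a minimal Python sketch. The thresholds, names, and data shape are all hypothetical (not from any actual AIG or vendor system): the idea is simply that an alert fires only after the wearer’s inclinometer reads near-horizontal for a sustained dwell time, so a brief bend or stumble doesn’t trigger a false alarm.

```python
from dataclasses import dataclass

# Hypothetical thresholds -- a real deployment would tune these per job site.
HORIZONTAL_DEG = 70.0   # tilt from upright that counts as "down"
DWELL_SECONDS = 15.0    # how long the wearer must stay down before an alert

@dataclass
class Sample:
    t: float          # timestamp in seconds
    tilt_deg: float   # inclinometer tilt from upright, in degrees

def man_down(samples):
    """Return the time at which a 'man down' alert would fire, or None.

    Fires once the wearer has been near-horizontal continuously for
    DWELL_SECONDS; any upright reading resets the timer.
    """
    down_since = None
    for s in samples:
        if s.tilt_deg >= HORIZONTAL_DEG:
            if down_since is None:
                down_since = s.t
            elif s.t - down_since >= DWELL_SECONDS:
                return s.t
        else:
            down_since = None
    return None
```

In a real system the alert would carry the GPS fix along with it, so the supervisor and local authorities know where to respond, as Ron notes.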

E: In most cases are you looking for over-the-counter, out-of-the-box solutions or bespoke devices? Where is the software being made?

R: I review what’s in the market and what’s in development. I try to stay abreast of what’s available so that I can help clients make the best and most informed decisions about how to reduce exposure. There are always several intervention options that could have an impact, so I usually demo and then pilot test the options that fit the particular exposures as well as the organization’s structure and culture. So, I’m always looking to kick the tires on everything on the market.

E: I imagine biosensors come in every form factor at this point. Is it one sensor per device or are you testing multiple metrics?

R: Let’s take posture monitoring as an example, which is huge in workers’ comp because 30-50% of a company’s losses are from strains. Everyone wants to work on musculoskeletal disorders, which also happen to be the most expensive loss type. Inclinometers, which measure posture, are great because force, repetition and posture are the leading drivers of strain and sprain injuries. You can do heavier work in the power zone between your shoulders and hips, but outside of neutral posture a task becomes more egregious to your body.

Many companies are doing posture monitoring; some are focusing on the upper extremities, some on the low back. Several biosensor companies have produced very good software programs to go along with the inclinometers, showing not only when someone is out of neutral posture but also how many times a day that person is out of neutral posture, for how long, at which tasks, or what times of day, etc. Some biosensors give an automatic response to the employee (like a buzz). That can be good or bad. If I can’t change my posture because the task is set up so that I have to bend a certain way, the buzzer is going to be continuous and become really annoying. That’s where I would take the data to management and operations and say: Here’s Joe and Mike doing the same job but Mike can’t handle the postures. Why? Because he’s a little older and can’t bend as well at the knees. So, posture monitoring without the dashboard is not as effective. The better the dashboard, the better data we have and the more opportunity we have to provide valuable interventions to the physical task.
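The dashboard metrics described above (how many times a day someone is out of neutral posture, and for how long) amount to simple event detection over a stream of inclinometer samples. A rough sketch, where the 20° neutral-zone limit and one-second sample interval are illustrative assumptions:

```python
def out_of_neutral_episodes(angles_deg, neutral_limit=20.0, sample_s=1.0):
    """Count excursions outside the neutral zone and total time spent there.

    angles_deg: inclinometer readings (deviation from neutral posture),
    sampled at a fixed interval of `sample_s` seconds.
    Returns (episode_count, total_seconds_out_of_neutral).
    """
    episodes, seconds_out = 0, 0.0
    was_out = False
    for a in angles_deg:
        is_out = abs(a) > neutral_limit
        if is_out:
            seconds_out += sample_s
            if not was_out:
                episodes += 1          # a new excursion begins
        was_out = is_out
    return episodes, seconds_out

# One simulated minute on the line: two bending episodes totaling 25 seconds.
samples = [5.0] * 10 + [45.0] * 15 + [5.0] * 20 + [35.0] * 10 + [5.0] * 5
print(out_of_neutral_episodes(samples))  # (2, 25.0)
```

Bucketing these counts by task and time of day is essentially what the dashboards Ron mentions do at scale.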

E: Can the intervention involve changing the way the task is done?

R: Yes. In fact, we can even link biosensors through a software program to a camera, so that as a person moves, we can see both the physical video and an overlay of what’s going on with his or her posture and force over time. While seeing the person do their task in space and time, we’re capturing their force and posture. That becomes really powerful. We can do that over time, creating a dashboard for different tasks, and then give an employer a prioritized list of the most egregious tasks, where high force and high repetition are most likely to generate a musculoskeletal disorder. So, biosensors with a dashboard and video overlay are very powerful in exposure documentation.

E: Can you talk about some of your recent biometrics and exoskeleton projects?

R: Well, anybody familiar with meat processing knows that it’s a very high-endurance, high-repetition task impacting the upper extremities and back. It’s static stress on the legs, leaning, twisting and bending at the waist, and moving your arms to process meat. Every part of the body is impacted; the repetition is so intense that you’re moving a carcass every 2 seconds. You’re constantly moving, standing in one place doing the same motion over and over, and you’re usually working a 10-hour shift, 6 days a week. Operations, safety and HR know it’s a difficult task but to change the process is very expensive. You might have to move the conveyor circling the entire plant or slow it down, which operations won’t like. Or, you’re going to have to build adjustable stanchions for people to stand up on. Oftentimes in fixed manufacturing plants, it’s very difficult to change the physical process, so we look at other interventions. The biosensors give us data on where the most difficult tasks/positions are and where management can spend their nickels to make the best impact. You can give them engineering solutions but if they don’t have the money for re-engineering there are alternative solutions like endurance and fatigue management or job rotation, or even just ongoing stretching throughout the day. You mitigate the exposure if you can’t eliminate it. We look for engineering solutions first, but older plants especially have a hard time putting those automation or engineering changes in place.

E: How are you measuring the ROI of the different solutions you’re implementing? What are the KPIs you’re looking for?

R: Primarily, I look at two things when it’s workers’ comp-related: the loss frequency rate (the number of injury accidents per hundred employees; for example, how many strains and sprains we have per task before and after a solution is implemented) and the average cost of claim (how does that cost change after the solution is implemented?). We try to reduce both the frequency and the severity of loss.
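Both KPIs are simple ratios. Loss frequency is conventionally normalized per 100 full-time employees (OSHA-style incidence rates use 200,000 hours as the 100-employee annual baseline), and average cost of claim is total claim dollars over claim count. A quick sketch with hypothetical figures:

```python
def loss_frequency_rate(injuries: int, hours_worked: float) -> float:
    """Injuries per 100 full-time employees. OSHA-style basis:
    200,000 hours = 100 employees x 40 hours/week x 50 weeks/year."""
    return injuries * 200_000 / hours_worked

def average_cost_of_claim(total_claim_cost: float, claim_count: int) -> float:
    """Severity: average dollars paid per claim."""
    return total_claim_cost / claim_count

# Illustrative before/after figures for one plant (hypothetical):
before = loss_frequency_rate(injuries=24, hours_worked=800_000)  # 6.0 per 100 FTE
after  = loss_frequency_rate(injuries=12, hours_worked=800_000)  # 3.0 per 100 FTE
print(before, after)                       # 6.0 3.0
print(average_cost_of_claim(450_000, 12))  # 37500.0
```

Comparing the before and after values for both ratios captures the frequency and severity reduction described above.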

Here’s a good example: One 24-hour plant of 400 employees had 50 visits to the nurse every day looking for splints, gauze wraps, and other assistance. You know that the more times people are going to the nurse, the greater the likelihood you’ll eventually have a claim. We implemented endurance/fatigue solutions and then looked at the number of individuals visiting the nurse, and in some tasks the number dropped by 80%. That’s telling, because it takes a while for the claims numbers to mature enough to show statistically significant results. If I see a change over a short period, is it just that introducing the solution made everyone more aware? 18 months is about where you have to be to really see a material change in losses. So, we look at other metrics like online perception and symptom surveys. I’ve used therapy to reduce endurance and fatigue injuries, and after each session we give a quick survey asking how the person felt before and after the fatigue management program. We can then see if we’re going down the right road and match up the results to the loss analysis in the future.

E: RFID devices, body-worn (biometric tracking) wearables, and exoskeletons—which category is most mature and deployable today?

R: Posture monitors. The inclinometers and GPS are the most robust and have some of the best software. RFID is good but you have to understand what the exposure is and what end result you’re trying to get to. RFID chips are really good in environments like construction, where it’s noisy, dark and dusty and vision and hearing are impaired. RFID chips give me another sense to avoid exposure. It can also be used for equipment or where people are working very remotely, to see where somebody is working in a plant and where they’ve been. But posture monitors are probably the most robust in terms of software because, again, everyone’s trying to mitigate the strain and sprain injuries. Industrial hygiene (IH) exposure doesn’t have the same frequency of loss as strains and sprains and has been controlled very well over the last 20 years; it’s gotten a lot of attention and there are so many good programs in place.

E: Is ergonomics slightly newer?

R: Ergonomics has been developing since the mid-80s, but it’s interesting that we haven’t found a silver-bullet solution, so we’ve done a lot of training. Office ergonomics got a lot of attention. ‘Ergonomic’ became a buzzword and a marketing ploy, and now a lot of equipment is considered ‘ergonomic.’ For example, you can buy a snow shovel that’s ‘ergonomic,’ but the actual exposure to the individual hasn’t really changed. Carpal tunnel syndrome was huge in the late 90s and early 2000s; then the Mayo Clinic and other studies said that the aging workforce is driving CTS more than typing. Today in the workers’ comp arena, an individual’s physical condition can be as much a factor in injury development as the workplace exposure. A comorbidity or illness can make a simple injury much more difficult to diagnose and treat, and this is why wellness and ergonomics need to be considered together. Wearables are helping us communicate exposure to the operations managers who usually hold the intervention purse strings. Ergonomists haven’t done a great job of this in the past, but biosensors give us data on an individual or task basis that is very telling for operations, human resources and safety teams.

E: How have employees received wearables? What has been the feedback? Are there privacy concerns and how are you dealing with that?

R: A lot of the biosensors are applied directly to the skin, and managers are very skeptical or skittish about that. So, in looking at which wearable is going to work for a company, you have to consider the unions, female workers, people who don’t speak English, etc. You have to think about having an interpreter, or whether someone will have an allergy to the spray or adhesive used to attach the biosensor… What if you have a lot of hair on your back? Part of my focus is always communicating these considerations to the risk manager: given the exposure model they face and the loss profile they have, which tasks are driving the losses, what’s the exposure model for the people doing those tasks, and which biosensors are the right fit for the organization’s culture.

E: Are they more receptive if the sensor is in a bracelet?

R: You get better, deeper data, especially from a force standpoint, if you can attach something to the skin. If you can’t, you have to use a halter monitor around the chest or a belt-worn device, something on the biceps if upper extremities are the issue, a bracelet on the arm, etc. That’s why it’s important to know the loss profile and exposure model for the risk before adopting a wearable product: what tasks are driving loss and what options the company is willing to consider for solutions.

E: What is your hope for the future as for how wearables are developing? What’s a big development you’d like to see?

R: Right now, biosensors are really good at looking at exposure, giving us a dashboard and helping us come up with solution options. Of course, you need to know what’s available and understand the organization’s culture; but we’re not using biosensors to their full effectiveness in the hiring and screening process or in post-loss injury management. In WC, early objective medical diagnosis is critical to managing loss costs, especially with strain and sprain injuries, and biosensors can be a substantial benefit in that area, including in developing telemedicine programs. We’re also not always closing the loop between risk management, safety, HR and operations in terms of understanding exposure and implementing interventions. Consider: how many workplace tasks are developed with the idea that there will be one, two or three injuries per year in that task? The answer is none, but we accept those types of metrics as part of the cost of production. We’re collecting good loss and exposure data but not integrating safety intervention into the process the way we do with quality. Biosensors give me the detailed exposure information I need to express the business and human cost and help justify the interventions needed to reduce exposure. If I can provide detailed documentation of exposure, I can communicate better to risk management so they can do a better job of funding exposure-reduction solutions and provide the insight for stronger diagnosis, treatment and return-to-work practices if a loss occurs. You’d be amazed how many loss profiles show repeat injuries, which get exponentially more expensive. Biosensors can therefore have a significant impact in all three areas of the WC exposure control model: hiring, screening and deployment; prevention and training; and post-loss injury management.


Photo credit: Lara de Oliveira Corselet dB via photopin (license)

Ron will present a case study at the upcoming 6th Enterprise Wearable Technology Summit on September 19.

Ditching the robot, training manual, pills and stock room for XR & wearables

4 Recent Use Cases of Wearable Technologies in Enterprise


When a worker in a robotic suit is better than a robot – Boeing

Industrial enterprises have been experimenting with robotics to replace humans in physically strenuous and repetitive tasks, but there are certain complex tasks that cannot be automated. One such job that only a skilled human can perform is wiring a Boeing 777. Boeing has received a lot of attention for using Augmented Reality in this area of assembly. In addition, the aerospace giant has been testing industrial exoskeletons for this process—work that is both too complex for a robot and poses a risk of injury.

Boeing sees the sweet spot for exoskeletons in cases where the safety risk cannot be designed or automated out of the process. Installing overhead electrical wiring certainly qualifies, so Boeing technicians may ultimately use both smart glasses and robotic suits for heads-up instructions and fatigue prevention. Though the company has yet to deploy exoskeletons on the factory floor, pilot programs are helping Boeing determine which models are best for which jobs. Moreover, exoskeletons have become more practical since 2012, when Boeing began evaluating the technology, with several startups now offering lightweight devices under $5,000.

Teaching a new dog old tricks with XR – Honeywell

A major driver of industrial AR/VR adoption is the skilled labor shortage. As baby boomers retire and leave the workforce, the generation of workers replacing them tends to change jobs frequently. For organizations facing a critical “information leak,” it’s a waste of resources to train a millennial for a technical role he or she will move on from in a few years.

Honeywell, a multinational engineering, industrial and aerospace conglomerate, is hoping to reach millennials and close the skills gap with mixed reality. The Honeywell Connected Plant Skills Insight Immersive Competency is a cloud-based simulation tool that uses Microsoft’s HoloLens to simulate various training scenarios for Honeywell’s C300 controller. The solution allows new employees to safely train for events like cable and power supply failures, and it measures the training’s effectiveness on plant performance through data analytics.

In testing, this method of interactive, on-the-job training improved skill retention by up to 100% compared to passive, classroom-like learning and reduced training time by up to 150%. For a generation familiar with digital content and interactivity in education, sheets of paper, check boxes, etc. don’t work. Honeywell understands that with the boomers’ exodus, the old systems of industry need updating to better align with millennials’ lifestyles.

Not another drug cocktail – The Travelers Companies

The insurance company is collaborating with Cedars-Sinai (healthcare organization), Bayer (pharmaceutical company), appliedVR (creates Virtual Reality experiences for healthcare patients), and Samsung to test a non-pharmacological “digital pain-reduction kit” for managing workplace musculoskeletal injuries. The kit consists of a Samsung Gear VR headset, a Samsung Gear Fit2 wearable fitness tracker, therapeutic VR content powered by biosensors (appliedVR solution), and a nerve stimulation device by Bayer for relieving lower back pain.

Recent research led by Travelers, Cedars-Sinai and appliedVR demonstrated that VR can reduce pain in hospitalized patients and provide an alternative to opiates. The goal of the new clinical study is to “improve outcomes for injured workers by leveraging state-of-the-art technology.” Travelers is interested in discovering new, drug-free solutions for pain management to help its customers support injured employees, lower the chance of opioid addiction, and reduce medical costs.

The perfect blend of online and store shopping – Macy’s

Though it sounds counterintuitive, the department store chain plans to use both virtual reality and e-commerce to improve its brick-and-mortar sales. Macy’s announced it will bring VR furniture sales tools to 50 stores by the summer, with the vision of offering immersive shopping technology in as many of its stores as possible in the future.

Macy’s VR Showroom is powered by Marxent’s 3D Cloud service. Customers use an iPad to add and move furniture around a virtual room and then, once satisfied with the arrangement, put on an HTC Vive headset to experience the finished, fully furnished space.

In the pilot phase, VR allowed users to feel more confident in their furniture choices. Not only did they buy more, but they also bought items that Macy’s carries but may not have available on site at every store. That’s the beauty of VR in furniture, automobile and other high-end sales: you can sell more goods with less physical retail space to showcase them. After rolling out the solution, Macy’s will be able to offer furniture departments in more locations.


The 5th Annual Enterprise Wearable Technology Summit 2018, the leading event for enterprise wearables, will take place October 9-10, 2018 at The Fairmont in Austin, TX. EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. For details, early confirmed speakers and preliminary agenda, please stay tuned to the conference website.

Augmented World Expo (AWE), the world’s largest conference and expo dedicated to Augmented and Virtual Reality, is taking place May 30-June 1, 2018 in Santa Clara, CA. Now in its 9th year, AWE USA is the destination for CXOs, designers, developers, creative agencies, futurists, analysts, investors and top press to learn, inspire, partner and experience first-hand the most exciting industry of our times.

Manufacturing 4.0: Checking In with Expert Peggy Gulick of AGCO

A true enterprise wearable tech pioneer, Peggy Gulick, Director of Digital Transformation, Global Manufacturing at AGCO Corporation, spearheaded one of the most successful use cases of Google Glass in enterprise to date. Where others saw challenges, Peggy and her team saw opportunities to turn a device that was then (2013) struggling to find a purpose into a powerful lean manufacturing tool. We last interviewed Peggy in July of 2016, before she first graced the EWTS stage. Since then, AGCO has become a poster child of Glass Enterprise, the second generation of Google Glass developed with the input of enterprise visionaries like Peggy; and Peggy herself has become a star speaker, her story undoubtedly inspiring many others. Below, Peggy answers our questions about the state of manufacturing today:


BrainXchange: What are the greatest challenges faced by manufacturers today?

PG: All manufacturers that I have spoken to seem to face similar challenges with rising employer costs (many related to healthcare) and the need to reduce operational costs while projecting longer-term strategic plans. In addition, the expectations placed on employers by employees and the communities in which they operate have changed. Employees expect more from their employers, including a sense of purpose. Communities expect both social and environmental contribution.

In the midst of this, there is a gap between the qualified labor available and the high-tech skill sets required to meet new operational budgets and to execute strategic plans that increase quality and reduce time and cost to market.

Automation, the fourth industrial revolution, the Internet of Things and big data are all being touted as responses to these shared challenges, yet most organizations have not figured out how to incorporate them into current business processes. Although these new technologies can provide relief to manufacturers, they continue to face perception challenges, being seen as replacing rather than augmenting human workers.

BrainXchange: What are the effects of automation and big data in manufacturing?

PG: Currently, there are two types of companies benefitting from big data. One is, of course, the big data companies themselves, whose offerings range from expanded infrastructure to the storage, management, processing and analytics of massive amounts of collected information. The second is the strategic few organizations that have found ways to incorporate the data into problem solving and to deliver the right information to the critical point of decision making. By treating big data and automation as dependent and collaborative solutions, both as drivers of continuous improvement and lean manufacturing processes, we have been able to determine the elements that are most likely to impact the outcomes that matter most to our product and process quality, productivity and safety. Big data, unless transformed into actionable information, is meaningless.

BrainXchange: Is AGCO experiencing a “skilled labor crunch?”

PG: Yes, but we are addressing it through investment in our employees, both current and potential (apprentices). Mechatronics, our assembly academy, scholarships and on-the-job training, combined with a work environment that allows employees to contribute and feel a sense of purpose, have allowed us to retain and recruit successfully. Our employees are motivated by the organization’s concern for quality products/processes and employee safety, not cost-reduced workforces.

BrainXchange: How might smart glasses and Augmented Reality help address some of the above challenges?

PG: Smart glasses and augmented reality have been deployed in our manufacturing operations to further our continuous improvement efforts across the site. The use of wearable technology helps eliminate motion, over-processing, defects and even transportation. Excessive travel to workstations to retrieve work instructions and bills of material is eliminated. Defects are minimized thanks to comprehensive (pictures, videos) and easy-to-access work instructions. Our plant makes highly complex, low-volume agricultural equipment. Wearable tools help minimize over-processing caused by the need to rework due to misguided assembly. When workers can do their jobs smarter, faster and safer, it resonates throughout the entire culture. As we experience labor crunches, it is more and more important for companies to offer the tools and training required to create, grow and retain their employees. Smart glasses have helped us do that.

BrainXchange: What tools do AGCO workers currently use to do their jobs? How are new workers currently trained?

PG: All of our assembly and assembly quality gate employees attend 40 hours of Assembly Academy followed by 40 hours of Lean Work-cell training. In addition to reading blueprints and interpreting supplemental information, assemblers must be proficient with hand, power and assembly tools. Since employees are now expected to use wearable tools, including smart eyewear (Google Glass), to access work instructions and quality checklists, wearable tools are introduced immediately in the learning academies. Wearable tools not only inform but also capture and flow pertinent information (including pictures, text and video) for non-conformance issues and missed thresholds.

It was critical to the success of wearables to acknowledge that all employees are not equal in training and skills. As employees’ skills mature in specific operations, our wearable applications allow for personalized levels of instruction, tailored by algorithms based on training and experience.

The wearable tools themselves are easy to implement and support. Most employees are excited to wear the technology and realize the benefits quickly.

BrainXchange: Where do you see the greatest opportunities for smart glasses in the manufacturing plant?

PG: Our product design team finds great value in virtual reality glasses. Not only do they broaden a team’s ability to “see” what others are thinking, but they also allow design teams to interact remotely, all in virtual space, all seeing the same product and projected design strategies.

As a problem-solving organization and culture, we have weighed the value of wearable smart glasses in many areas, including welding, paint preparation, assembly, quality, technical services, material management and even plant tours. The first thing we discovered is that the value of replacing current tools, whether paper work orders or terminal work instructions, with smart glasses is 2x what we initially projected. The results have been so beneficial in some areas that we have retested, thinking it was a mistake. It is important to note that every pilot we have conducted has been in response to a defined problem. And, after 5 whys, fishbones and cross-functional involvement, sometimes even a kaizen, smart glasses become part of the proposed solution, with metrics associated. Knowing that smart glasses are a lean tool, and not an industry requirement or cool factor, we have reported a 30% reduction in processing times, a 50% reduction in the amount of time employees train on the job (new hire and cross-functional), and reductions in quality and safety incidents that we are still calculating. The greatest value for the glasses has been in assembly and quality, both needing easy and quick access to hands-free instructions. As a manufacturer of complexly configured products, we have discovered that training by smart glasses is the grand slam. New product launches, multi-operation and new-hire training are easily administered and audited for success.

BrainXchange: How do smart glasses further lean manufacturing?

PG: Simple. Lean is all about waste elimination. Smart glasses, when implemented for the right reasons, reduce waste. The use of wearable solutions was discovered as we did what we do best every day: solve problems (4,873 problem solutions implemented by employees in 2016).

Introducing Google Glass to our manufacturing floor was not intended as disruptive technology or even competitive advantage. It was introduced as a solution to make employees’ jobs easier and safer while driving higher quality into our product and our processes. In the end, we have accomplished both.

We are delighted that Peggy will be speaking again at EWTS 2018 this October, and cannot wait to hear how AGCO’s Google Glass success story has progressed. 




Photo credit: Google X

Virtual Reality and Production Ergonomics

The big automobile manufacturers have been using Augmented and Virtual Reality both to design cars and to sell them in dealerships; but there is a step between design and sales where AR/VR can be employed, and that is production. I’m not talking about auto workers wearing AR glasses to view assembly instructions overlaid on vehicles as they come down the line, but rather using VR to simulate the assembly process itself and ensure the safety and comfort of those who carry it out.

The ergonomics of each step of putting a machine together – how a worker must move throughout the assembly process – is not something we often think about, yet poor production ergonomics are a source of inefficiency and can have a high physical cost for employees (plus associated financial costs for the company). It is difficult to assess the ergonomics of a process like automobile assembly, or the building of other large-scale manufactured products, by examining a schematic or onscreen model. And once the vehicle or machine hits production, it becomes too expensive to make major changes unless they affect product performance or pose a serious threat to the end user.

Virtual Reality can help optimize production by detecting problems that might arise during assembly. In a virtual world, you can test out and analyze different scenarios and methods as they would occur in the real working environment, without staging physical mock-ups or putting actual workers at risk. You could assess the dimensions and kinematics of tools and equipment in relation to the human worker, adjust them to the biomechanics and physical attributes of the assembly worker, make sure all necessary tools are within reach, and ensure that their use will not require unsafe leaning, twisting, reaching or bending. You could even recreate the entire manufacturing plant to factor in (and ideally position) elements that might impact the process you’re designing, including other production cells, moving vehicles, and even lighting and air conditioning.
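At its simplest, the kind of virtual reach check described above is geometry: given a worker’s shoulder position and arm reach, test whether each required grab point lies within the reach envelope and inside a comfortable height band. A toy sketch; the anthropometric numbers here are placeholders, not real ergonomic standards:

```python
import math

def reachable(point, shoulder, max_reach=0.7, zone_low=0.9, zone_high=1.5):
    """Return True if `point` (x, y, z in meters, z = height above the floor)
    is within arm's reach of `shoulder` AND inside the comfortable height band."""
    dist = math.dist(point, shoulder)          # Euclidean distance to the point
    return dist <= max_reach and zone_low <= point[2] <= zone_high

shoulder = (0.0, 0.0, 1.4)                      # standing worker's shoulder
assert reachable((0.4, 0.0, 1.2), shoulder)     # tool on a nearby rack: OK
assert not reachable((0.4, 0.0, 0.3), shoulder) # bolt near the floor: forces bending
```

A full VR review replaces these hard-coded numbers with motion-captured poses and collision checks against the machine model, but the pass/fail question per grab point is the same.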

Essentially, VR enables manufacturing experts to “rehearse” assembly from the human worker’s perspective, and real enterprises are seeing benefits like increased efficiency and fewer injuries. Read on to learn how Ford and John Deere are using Virtual Reality and motion detection technology to make life easier and safer for assembly line workers:


In 2015, Ford reported that it was outfitting employees with Virtual Reality headsets and motion capture body suits to refine the design of future production lines. Ford researchers simulated the assembly process for upcoming vehicle models, running through the steps with a real person and 3D-printed (partial) mockups in a virtual workstation and using the data to spot production challenges, assess physical risks, and design ergonomic tooling years in advance of actual assembly.

Motion capture revealed how an employee would need to move to complete an assembly task: what degree of muscle strength, joint strain and body imbalance would be involved. A head-mounted display was used to study the feasibility of tasks for workers on the line, and 3D-printed models were used to look more closely at things like maneuvering and using tools in tight spaces.

The results influenced production decisions and affected vehicle design components and parts, helping to reduce the number of injuries by as much as 70%. 

John Deere

The machinery manufacturer probably best known for its tractors uses Virtual Reality in the design process to understand and optimize both how future operators will use its machines and how employees will assemble them.

Engineers tasked with determining the assembly feasibility of new machines conduct VR reviews during product development. In these reviews, an operator wearing a VR headset and using motion-tracked tools is fully immersed in a virtual work environment. The evaluators look on, seeing the user’s view on a screen, to assess things like whether the worker has enough visibility; whether her body is interfering with the machine; whether she has to assume an awkward posture or reach for a tool during the assembly; whether the tool fits where it has to go (in a particular area or part of the machine); etc.

In this way, high-risk processes are identified and machine designs tweaked before they can cause real injury in the factory and delay manufacturing.

Manufacturing design engineers obviously think about a product’s look, and we know they focus on user experience; but the way new products are designed also affects how they are built. Designers must therefore take into consideration the actual assemblers and any potential safety pitfalls or impracticalities they might encounter in putting a product together. Virtual Reality presents a powerful tool for simulating what it takes to build a machine and refining a design to make assembly itself more ergonomic and streamlined, thereby building safety and a layer of efficiency right into the production line.


The 5th Annual Enterprise Wearable Technology Summit 2018, the leading event for enterprise wearables, will take place October 9-10, 2018 at The Fairmont in Austin, TX. EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. For details, early confirmed speakers and preliminary agenda, please stay tuned to the conference website.

Augmented World Expo (AWE), the world’s largest conference and expo dedicated to Augmented and Virtual Reality, is taking place May 30-June 1, 2018 in Santa Clara, CA. Now in its 9th year, AWE USA is the destination for CXOs, designers, developers, creative agencies, futurists, analysts, investors and top press to learn, inspire, partner and experience first-hand the most exciting industry of our times.


photo credit: Virtual Reality at SXSW 2017 via photopin (license)

Using HoloLens for Design and Asset Visualization

Just as there is a disconnect in designing three-dimensional structures and spaces on two-dimensional screens – and in executing and arranging 2D designs in real space – there is a disconnect in taking multiple data sets and real-time data streams in different formats and attempting to identify patterns and insights to apply in the real world. Architects and designers have been first-movers when it comes to using Augmented and Virtual Reality technologies in the design process; but there are other professions that call for digesting complex information, understanding complex situations and environments, and planning with moving parts. Below are three enterprise scenarios in which Microsoft’s HoloLens Mixed Reality headset is used as a design and asset/data visualization tool:

Workspace layouts (office space, shop floor, job site, store…)

Polamer Precision, an aerospace manufacturer, has been using Microsoft’s HoloLens Mixed Reality headset to map out its manufacturing “cell” layout. In Mixed Reality, users can test out positions for workstations and tooling and ensure that forklifts and other equipment will have room to operate. Imagine walking into a real-world environment like a job site – perhaps the site changes with every job – and having the ability to view holograms of the machines, vehicles, tools and human workers that will need to be brought on site to get the job done. It takes the guesswork out of the planning process and helps avoid costly delays.

Stryker is another company using HoloLens in this way—in hospitals. The medical device company sells equipment for hospital operating rooms, helping its clients figure out ideal arrangements of equipment to create state-of-the-art ORs.

In a typical hospital, multiple practitioners from different surgical disciplines share a single OR. Figuring out how to install the equipment is not just about fitting all the items into the room; the layout also has to be practical for every doctor (and nurse) that will need to move around and operate there, not to mention safe for patients.

Instead of having all stakeholders physically present to work this out or manually moving around heavy (and expensive) equipment to test out different configurations, Stryker has been using HoloLens and its own By Design software to build and modify possible OR scenarios with holograms. AR brings Stryker’s portfolio of digital 3D assets to life, allowing for better and faster collaboration.

Financial planning

When you hear “wearable tech in banking,” you probably think of contactless payments; but financial services companies are exploring Augmented Reality for wealth management and the trading room, as a tool for interacting with large quantities of complex data and advising clients remotely. In Spring 2016, Citi – working with 8Ninths – became the first bank to reveal a proof of concept for AR in stock trading.

In the YouTube demo, you see a Citi trader no longer confined to the trading desk. He checks the news on the traditional 2D monitors that flank his workstation before putting on HoloLens. Using the holographic trading tools via voice commands, he “sees and quickly assesses a dynamic, 3D visual snapshot of what’s happening in the market right now.” Noticing a lot of activity in one sector, he has an idea for a trade that he shares remotely with a client.

In Mixed Reality, the Citi trader is better able to monitor, analyze and manipulate real-time market news and trends, bids and offers, etc., no longer impaired by a lack of screen real estate. It’s digital downsizing—from the six to eight monitors that make up a typical trader’s workspace to two monitors and a headset opening up an entire interactive trading world.

A year after Citi’s demo, FlexTrade, a provider of execution and order management trading systems, announced its augmented reality trading application for HoloLens. The app, called FlexAR, offered “a new way of visualizing and presenting trading,” with an interactive order blotter, trade ticket, and charting. Traders could make faster and better data-driven decisions by viewing and interacting with stock prices, volume, profit and loss, and other complex data in a virtual space. AI would add contextual information, identifying key elements like price or volume changes in real time and automatically bringing up information about a specific company or trade under consideration. Interestingly, FlexTrade found Virtual Reality too immersive for the use case.

A holistic view of enterprise operations

Air transport IT provider SITA, with Helsinki Airport, has been exploring the potential of Mixed Reality for airlines and airports. In a study, SITA Lab simulated the airport operational control center (AOCC) in MR, interfacing multiple data sources to produce a unique, dynamic, 3D view of Helsinki Airport’s operations. Operators wearing HoloLens could see and interact with dashboards of real-time operational data like passenger location, security wait times, flight statuses, gate information and retail analytics, correlating events to gain insights for better managing the airport’s operations.

These examples show possibilities for HoloLens beyond a machine overlay for maintenance and training. In ever-changing environments like busy airports and hospitals, Augmented Reality is superior to the 2D tools we currently use to view (and make decisions about) complex situations with a lot of moving parts. And for managing the complexity of data that makes up the world of banking and finance, AR is an unparalleled medium that turns data points into digital content with which users can engage.



photo credit: dalbera HoloLens, le casque de réalité mixte de Microsoft (Futur en Seine 2016 Off, Paris) via photopin (license)

2017 was the year of store closings–Can the Internet of Things solve retail’s woes?

The Internet of Things (IoT) is a network of Internet-connected objects that have been made “smart” with embedded sensors. These devices collect data about the physical world around us, about processes and the health and behavior of people and machinery—data which is then interpreted and shared with other machines and people via cloud-based software platforms. In retail, IoT solutions can improve in-store operations, optimize supply chains, deliver better customer experiences, and generate new revenue streams.    

2017 was the year of store closings. By year’s end, major retailers will have closed or announced plans to shutter over 8,500 stores. That is in addition to the many brick-and-mortar retail bankruptcies, as e-commerce giants and fast fashion brands threaten traditional retail business models. How can retail’s woes be solved? Is the future bleak for the industry? Has online shopping won, or might the physical store survive with a little TLC and IoT?

Consumers today expect the shopping experience to be seamless across all of a retailer’s channels. That includes the brick-and-mortar store, e-commerce site, mobile app and even telephone customer service. So, when we talk about the Internet of Things in retail, we’re not just referring to the connected store of the future but rather to a system of sensors linking every aspect of a retailer’s business far beyond “buy online, return in store.”

What kinds of things can be made smart in retail? You may be familiar with beacons, RFID tags, NFC payments, and QR codes. Other retail technologies include in-store infrared foot-traffic counters, source-tagged SKUs, mobile device tracking, digital signage and kiosks, and classic video surveillance. There are electronic shelves that detect when inventory is low; proximity sensors that could be synced with digital coupons or AR cues activated via shoppers’ smartphones; and, potentially, interactive digital signs that tailor promotions to the person standing before them.

Sensors are being incorporated today into product packaging to monitor the quality of perishable goods, and digital price tags are enabling dynamic pricing, where item prices are changed in real time to reflect the most current trends in demand, inventory level and other data (i.e., surge pricing). On the horizon are self-scanning and self-checkout by smartphone, as well as robots that could aid in areas like product assembly, stock replenishment, and hazardous or heavy materials handling. Robots might even assist in stores by responding to spoken inquiries (voice recognition and AI), helping shoppers browse and locate inventory, and suggesting products based upon customers’ shopping histories. Imagine if a robot knew the last thing you left sitting in your online shopping cart and could inform you of a new reduced price or special sale for that item as soon as you entered the store. And, of course, wearables worn by sales associates and managers, along with those on the wrists of shoppers, are another potential endpoint and data collector in the connected store of the future.
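Dynamic pricing of this kind is essentially a rule engine running over live demand and inventory signals. Below is a minimal, purely illustrative sketch in Python; the function name, thresholds and weightings are assumptions for demonstration, not any retailer’s actual pricing logic:

```python
def dynamic_price(base_price, demand_index, stock_level, reorder_level):
    """Illustrative dynamic-pricing rule (not a real retail API).

    demand_index: recent sales velocity relative to normal (1.0 = typical).
    stock_level / reorder_level: current units vs. the replenishment trigger.
    """
    price = base_price
    # Surge component: raise the price when demand runs hot, capping the markup at 25%.
    price *= min(1.0 + 0.1 * max(demand_index - 1.0, 0.0), 1.25)
    # Clearance component: discount overstocked items to improve turnover.
    if stock_level > 3 * reorder_level:
        price *= 0.9
    return round(price, 2)

# A hot-selling, scarce item gets marked up...
print(dynamic_price(10.00, demand_index=2.0, stock_level=5, reorder_level=10))   # 11.0
# ...while a slow mover sitting on a full shelf gets discounted.
print(dynamic_price(10.00, demand_index=0.5, stock_level=40, reorder_level=10))  # 9.0
```

In practice the same rule would be evaluated continuously against cloud data feeds and pushed out to the digital price tags.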

Let’s see how all of these “things” or potential parts of an Internet of Things strategy could impact retail operations:

In-Store and Supply Chain Operations

The Internet of Things promises to help retailers increase efficiencies through better visibility into their operations and supply chains. Inventory tracking (using RFID, smart shelves, and other sensors) is a good place to start—gaining real-time knowledge of where inventory is located and in what quantity and condition. By knowing exact stock levels, exactly when a replenishment delivery will arrive, and how up to date goods are, retailers and their employees can make better on-the-fly decisions to meet customer needs and expectations.

Tracking inventory throughout the supply chain would help avoid out-of-stock situations that hurt customer satisfaction and result in missed store sales opportunities. Data collected in the cloud from different sources could be analyzed and delivered to the right agents, like warehouse managers or store staff, via mobile apps. Inventory tracking would also aid in handling unexpected surges in demand due to unscheduled events, situations in which a retailer must be highly responsive by immediately ascertaining stock availability and acting upon it. Or, you could avoid the issue altogether by linking smart shelves in stores with a warehouse management system to automatically reorder products when store inventories reach a certain level.
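That smart-shelf-to-warehouse link boils down to a simple threshold check on live shelf counts. Here is a hedged sketch of the idea (the SKU names, thresholds and quantities below are invented for illustration):

```python
def check_replenishment(shelf_counts, reorder_points, order_qty):
    """Return purchase orders for SKUs whose shelf count fell below the reorder point.

    shelf_counts: {sku: units currently detected on the smart shelf}
    reorder_points: {sku: threshold below which we reorder}
    order_qty: {sku: how many units to request from the warehouse}
    """
    orders = []
    for sku, count in shelf_counts.items():
        if count < reorder_points.get(sku, 0):
            orders.append({"sku": sku, "quantity": order_qty[sku]})
    return orders

# SKU-001 has dropped below its threshold of 5 units; SKU-002 is still well stocked.
shelves = {"SKU-001": 2, "SKU-002": 15}
points = {"SKU-001": 5, "SKU-002": 5}
qty = {"SKU-001": 24, "SKU-002": 24}
print(check_replenishment(shelves, points, qty))  # [{'sku': 'SKU-001', 'quantity': 24}]
```

A real deployment would run this as an event-driven rule in the warehouse management system rather than a polling loop, but the decision logic is the same.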

Another means of reducing missed store sales is demand-aware fulfillment, where warehouse automation and robotics are driven by online and in-store shopping demand and inventory levels. By monitoring sales opportunities in real time and automating the movement of goods through the supply chain, you can reduce the chances of missing a sale due to the customer’s desired item being out of stock.

In addition to preventing missed sales, RFID-based inventory tracking can help decrease product loss and shrinkage and increase accountability at all operational levels. For instance, an IoT solution featuring smart shelves, source-tagged SKUs and security footage would “know” whenever an item is taken off a shelf (instantly updating inventory records) and could raise a red flag if the item were not subsequently paid for or accounted for. Goods can also be lost due to environmental factors. Tracing those goods is important for health and safety reasons, but an IoT-savvy retailer would be able to intervene before items “go bad” by using sensors to monitor variables like heat and humidity that impact perishable goods. You could move things around, prioritize one shipment over another, etc., and create an audit trail to identify the cause and/or responsible party.
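The environmental-monitoring piece can be sketched as a simple threshold alert over incoming sensor readings. The shipment IDs and limits below are illustrative assumptions, not industry standards:

```python
def flag_at_risk_shipments(readings, max_temp_c=8.0, max_humidity=70.0):
    """Flag shipments whose latest sensor readings exceed safe thresholds.

    readings: iterable of (shipment_id, temperature_c, relative_humidity_pct).
    The default limits here are placeholders, not regulatory values.
    """
    flagged = []
    for shipment_id, temp, humidity in readings:
        if temp > max_temp_c or humidity > max_humidity:
            flagged.append(shipment_id)
    return flagged

readings = [
    ("PALLET-7", 6.5, 55.0),   # within limits
    ("PALLET-8", 11.2, 60.0),  # too warm: intervene before the goods spoil
]
print(flag_at_risk_shipments(readings))  # ['PALLET-8']
```

Flagged shipments could then trigger the interventions described above: reprioritizing deliveries, moving stock, and logging an audit trail entry.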

The product is important, as is the employee who sells it. Employees are key assets that can be tracked to help a retailer understand how best to manage them, ensuring there is always an employee prepared and in the right location to serve a customer. IoT solutions can reveal whether store associates are responsive to shoppers’ needs by analyzing employee activity data (movement, number of sales, etc.) against factors like store traffic and customer demand; they can also help associates be more responsive by giving them the information to work efficiently. For instance, employees could use smartwatches or smart glasses to look up real-time stock and other product info on the sales floor, never having to leave the customer’s side to walk over to a computer or “check in the back.” Shoppers needing assistance might also use an app to summon help in stores, with an automatic wearable alert sent to the nearest salesperson.
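Routing a shopper’s help request to the nearest associate is, at its core, a nearest-neighbor lookup over the last-known positions reported by staff wearables. A minimal sketch, assuming a flat 2D store coordinate system (the names and positions are invented for illustration):

```python
import math

def nearest_associate(request_pos, associates):
    """Pick the closest associate to a shopper's help request.

    request_pos: (x, y) of the in-store request, e.g. from the shopper's app.
    associates: {name: (x, y)} last-known positions from wearables.
    """
    return min(associates, key=lambda name: math.dist(request_pos, associates[name]))

# Last-known wearable positions on the sales floor (illustrative coordinates).
staff = {"Ana": (2.0, 3.0), "Ben": (10.0, 1.0), "Cal": (4.0, 4.0)}
print(nearest_associate((2.5, 3.0), staff))  # Ana
```

A production system would also filter on availability and role (e.g. skip an associate mid-transaction) before dispatching the alert.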

What else can IoT technologies do to improve store operations? How about managing energy consumption, a major expenditure for large stores, using smart lighting and thermostats? Or enhancing workplace productivity? An IoT solution could tell you how your employees should spend their time (helping customers or doing operational tasks) and where to place them on the floor by analyzing current and past data, including high-traffic areas, seasonal shopping trends, customers’ shopping histories and digital wish lists, delivery and inventory audit schedules, etc. And predictive maintenance isn’t just for manufacturing plants; retail stores have equipment, such as grocery store refrigeration units, that can be outfitted with sensors to monitor power consumption, temperature, etc., reduce product loss, and ensure food safety. With electronic inventory tracking, suppliers can be automatically notified when stock needs to be replenished, and digital coupons can be offered to shoppers to improve turnover of aging stock.

Customer Experience and New Revenue Streams

Another major application of the Internet of Things in retail is customer service—helping customers to shop more easily and empowering employees to better serve them. In the brick-and-mortar store of tomorrow, (wearable) contactless payments and automated home replenishment could provide ease; robots or even smart shopping carts could help customers navigate aisles; digital marketing could be customized for individuals and different regions; optimal store layouts could be designed in Virtual Reality; and shoppers could try on clothes and select automobile options virtually.

IoT solutions can help retailers connect the physical store experience to shoppers’ digital lives and thereby make store shopping more engaging and seamless. As soon as a customer enters the store, for instance, customized offers and product recommendations could be sent to his or her mobile device based upon that person’s online shopping and browsing history, app usage, and possibly activity level (from a wearable fitness tracker). This is proximity marketing: it can be automatic (BLE beacons) or opt-in (NFC tags, QR codes and now AR cues), and it works through consumers’ own devices, leveraging the same kind of rich data and advanced analytics that brands use to drive e-commerce.
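A proximity-marketing trigger like this amounts to matching a beacon entry event against an opt-in shopper profile. The sketch below is purely illustrative; the profile fields, zone names and offer copy are assumptions, not any beacon vendor’s API:

```python
def offer_for_shopper(beacon_zone, profile):
    """Match a beacon entry event to a personalized offer (all fields illustrative).

    beacon_zone: the store zone whose BLE beacon the shopper's phone detected.
    profile: opt-in shopper data, e.g. abandoned online-cart items keyed by zone.
    """
    cart_item = profile.get("abandoned_cart", {}).get(beacon_zone)
    if cart_item:
        # The shopper left this item in their online cart: close the loop in store.
        return f"10% off the {cart_item} you left in your cart"
    if beacon_zone in profile.get("favorite_zones", []):
        return f"New arrivals in {beacon_zone} this week"
    return None  # No offer: avoid spamming shoppers with irrelevant pushes.

profile = {"abandoned_cart": {"footwear": "trail runners"}, "favorite_zones": ["outdoor"]}
print(offer_for_shopper("footwear", profile))  # 10% off the trail runners you left in your cart
print(offer_for_shopper("grocery", profile))   # None
```

Returning `None` for unmatched zones reflects the point above: personalization only works if shoppers aren’t flooded with generic pushes.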

Shoppers are more likely to download a retailer’s app (and give up their location) if it gives them access to those kinds of promotions or even the power to scan items and call up product reviews, find their size at other store locations…anything that makes the in-store experience as frictionless as the online one. That same rich data and instant product info can be leveraged by store associates who’ve never had access to it before, enabling them to identify loyal customers, incentivize undecided shoppers, upsell products and provide better service overall.

The possibilities for the Internet of Things in the retail industry are great: Retailers can use IoT technologies to collect real-time data on shopper behaviors, but a decision management system is necessary to (automatically) orchestrate action, whether that’s directing an associate (via wearable) to assist a customer or provide relief in a busy area of the store; remotely changing prices or tailoring an in-store display; or having warehouse workers reposition inventory elsewhere in the supply chain.




photo credit: arbyreed Cold Drinks via photopin (license)