Wearables in Risk Management: Interview with AIG’s Ron Bellows

I got to sit down and talk with Ron Bellows, Risk Strategist at AIG. What resulted is a fascinating, if long (it's worth it), read and a wealth of information. Ron will be speaking at EWTS 2019.

E: To begin, could you provide our readers with a little background on yourself and what you did at AIG? Also, when did you first encounter wearable technology?

R: I’ve been a risk management consultant with multiple insurance companies since 1980. I started playing with wearables probably as early as 1985/86. You may remember Cybermatics: Surface EMG measuring technology was connected to cyber gym equipment for rehab and prevention, so that when someone was working out – in the sports or medical world – you could actually see what the muscles were doing with the surface electromyography. It’s kind of like an EKG. Your heart is a muscle; surface EMG looks at other muscles.

Around ’86, I was working with a physical therapist doing studies on sports medicine and rehabilitation that might have application in the industrial environment. Many workers’ compensation injuries are expensive strain and sprain injuries impacting the musculoskeletal system. Biosensors, from a rehab standpoint, help us manage treatment for someone who has had a musculoskeletal injury. It began in the sports world and medicine, and around 2000 it started to become more pervasive in the industrial environment.

If you think about an athlete who over-trains, the same thing can happen in the industrial world. Biosensors can measure posture, force, repetition, etc., and be used to look at someone in the workplace from a preventative standpoint as well as on a pre-hiring/screening basis (i.e., can you handle the physical requirements of the job?). If you took a traditional physical, you might pass, but could you go in and work on a factory floor or warehouse for 8-10 hours a day, 6 days a week? The first value of biosensors is to better evaluate somebody before they go on the job and help assess their ability. The second is to evaluate somebody on the job, documenting the exposure they face due to fatigue, endurance, force, repetition and posture: the things that generally lead to ergonomic/biomechanical injuries. If you can detail that exposure prior to injury, you can do a better job with prevention, training and hiring. And if somebody does get hurt, you can use those same biosensors to help assess exactly where and how badly they are injured, the best treatment options, and whether they truly are OK to go back to work again. Those are the three main areas where wearables fit into the industrial arena and workers' compensation (WC).


E: What exactly do you do at AIG?

R: I’ve consulted with large multinational customers to help them find solutions to their risk management issues. Often, they were most interested in workers’ comp risk because it tends to drive loss frequency and severity, impacts the workforce and absenteeism, and reduces efficiency and profitability. Workers tend to be 30-50% of a company’s operating expense, so if you can reduce injuries you can increase efficiency, profitability, etc. Today with the shortage of workers that we see, a lot of companies are working at a 20% absenteeism rate. Imagine what happens when you can’t find enough people to man the tasks in a factory. If you also have extensive injuries that put people out of work or on restrictive duty, it’s even more difficult to run the business. Making sure people can work safely and come back to the job every day is very important to risk managers. I also help risk managers with issues like fleet, liability, supply chain, business continuity, and disaster recovery—anything that keeps them up at night.


E: You just mentioned a bunch of pain points like the shortage of workers. What are the challenges and pain points for AIG’s clients that are driving interest in wearable technologies?

R: There are really two things: One is traditional safety, making sure we document exposure properly so that we can prevent injuries and do better training. It's not just job hazard analysis but also the workers' comp system itself, which is very difficult to manage as the venues are all different and every state has different rules. If we can document exposure, we can better manage an individual pre- and post-loss. Many times, we see that older, high-tenure workers are driving losses. We're seeing the average age of workers going up, especially in manufacturing, warehousing, trucking, etc., where you have extensive injuries to the shoulder and back. Those injuries are the most difficult to diagnose, treat, and return to work from. If you're older and you get hurt, it may take you weeks to get back to where you were pre-loss. Our average workforce is in the 40- to 50-year range, so when they have an injury it's impacted by comorbidities – hypertension, diabetes, obesity – making it more difficult for them to get back to pre-injury status.

Second, many companies today are looking at exoskeletons or other interventions to reduce exposure. When you put an intervention in place you don't know for sure how much of an impact it's having on the individual, because everyone is different. With biosensors, we can measure the impact of different interventions and see which ones are having the greatest impact on the worker based on their exposure. For example, I would use different exoskeletons for the upper extremities versus the back, versus the legs. So, it depends on what kind of difficulties workers are having in the workplace. If I have to do static standing all day on a conveyor line, exoskeletons may not be valuable, but the biosensors can tell me what's going on with the static stress on the lower extremities, which impacts the entire body. I can then look for alternatives like automatic massage therapy, continuous stretching, or compression sleeves to improve endurance and reduce fatigue where the exoskeletons don't work.


E: What kinds of wearable technologies are you looking at for risk mitigation? Have any solutions made it past the pilot phase to rollout?

R: There are a lot. The biosensor marketplace has exploded in the last several years. We can use biosensors like we’ve talked about from a musculoskeletal standpoint and that’s where most of the impact is seen. But you can also use biosensors to look at an environment: A construction worker going into a pit that may be lacking oxygen can use a biosensor attached to an iPhone that sends a safety signal. You can use a posture monitor for the back like I did with the Visiting Nurse Association. Nurses visiting patients by themselves can be attacked, chased by dogs, fall down stairs, etc. Having an inclinometer or GPS monitor can send an automatic ‘man down’ signal if they’re horizontal. If they can’t push a panic button, their supervisor and local authorities can be alerted to the fact that something is wrong. That’s just one example. Biosensors in chemical plants can look at oxygen-deficient environments and exposure to chemicals and send an alert right away to the individual or supervisor. So, if you’re working remotely in a plant and there’s an ammonia tank with a small leak, the biosensor can alert you to very low levels before you’re overcome. There are so many different ways to use biosensors to alert you to exposure before it creates injury.
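The "man down" logic Ron describes, an automatic alert when a worker stays horizontal, can be sketched as a threshold check on inclinometer readings. This is a hypothetical illustration: the angle threshold, grace period, and data format are assumptions, not any vendor's API.

```python
def is_man_down(tilt_samples, horizontal_deg=70.0, grace_seconds=30, sample_hz=1):
    """Flag a possible 'man down' event from torso inclinometer data.

    tilt_samples: tilt from vertical in degrees, one reading per sample.
    Returns True only if every sample in the trailing grace window is
    near-horizontal, so a brief bend or stumble doesn't trigger an alert.
    """
    window = grace_seconds * sample_hz
    if len(tilt_samples) < window:
        return False  # not enough data yet to judge
    recent = tilt_samples[-window:]
    return all(t >= horizontal_deg for t in recent)

# A fall followed by 30 seconds without getting up should trigger an alert:
upright = [5.0] * 60            # walking/standing readings
fallen = upright + [85.0] * 30  # then 30 s near-horizontal
print(is_man_down(upright))  # False
print(is_man_down(fallen))   # True
```

In a real deployment the `True` result would feed the alerting path Ron describes (notifying the supervisor and local authorities), typically alongside GPS coordinates and a panic-button override.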


E: In most cases are you looking for over-the-counter, out-of-the-box solutions or bespoke devices? Where is the software being made?

R: I review what’s in the market and what’s in development. I try to stay abreast of what’s available so that I can help clients make the best and most informed decisions about how to reduce exposure. There are always several intervention options that could have an impact, so I usually demo and then pilot test the options that fit the particular exposures but also their organization structure and culture. So, I’m always looking to kick the tires on everything around the market.


E: I imagine biosensors come in every form factor at this point. Is it one sensor per device or are you testing multiple metrics?

R: Let's take posture monitoring as an example, which is huge in workers' comp because 30-50% of a company's losses are from strains. Everyone wants to work on musculoskeletal disorders, which also happen to be the most expensive loss type. Inclinometers, which measure posture, are great because force, repetition and posture are the lead drivers of strain and sprain injuries. You can do heavier work in the power zone between your shoulders and hips, but outside of neutral posture a task becomes more egregious to your body.

Many companies are doing posture monitoring; some are focusing on the upper extremities, some on the low back. Several biosensor companies have produced very good software programs to go along with the inclinometers, showing not only when someone is out of neutral posture but also how many times a day that person is out of neutral posture, for how long, at which tasks, or what times of day, etc. Some biosensors give an automatic response to the employee (like a buzz). That can be good or bad. If I can’t change my posture because the task is set up so that I have to bend a certain way, the buzzer is going to be continuous and become really annoying. That’s where I would take the data to management and operations and say: Here’s Joe and Mike doing the same job but Mike can’t handle the postures. Why? Because he’s a little older and can’t bend as well at the knees. So, posture monitoring without the dashboard is not as effective. The better the dashboard, the better data we have and the more opportunity we have to provide valuable interventions to the physical task.
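The dashboard metrics Ron describes, how many times a day someone goes out of neutral posture and for how long, reduce to a simple aggregation over inclinometer samples. A minimal sketch, where the neutral-zone threshold and sampling rate are assumed values rather than any product's defaults:

```python
def posture_summary(bend_angles, neutral_limit=20.0, sample_seconds=1):
    """Summarize out-of-neutral posture from trunk-flexion readings.

    bend_angles: flexion in degrees, one reading per sample.
    Counts distinct excursions past the neutral limit and the total
    time spent outside it (the two numbers a dashboard would chart).
    """
    excursions = 0
    out_samples = 0
    was_out = False
    for angle in bend_angles:
        out = angle > neutral_limit
        if out:
            out_samples += 1
            if not was_out:
                excursions += 1  # a new bend event begins
        was_out = out
    return {"excursions": excursions,
            "seconds_out_of_neutral": out_samples * sample_seconds}

# Two bends: one short, one sustained.
readings = [5, 10, 35, 40, 12, 8, 50, 55, 60, 9]
print(posture_summary(readings))
# → {'excursions': 2, 'seconds_out_of_neutral': 5}
```

Comparing these summaries per worker and per task is what lets you say "Joe and Mike do the same job, but Mike spends three times as long out of neutral" and take that to operations.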


E: Can the intervention involve changing the way the task is done?

R: Yes. In fact, we can even link biosensors through a software program to a camera, so that as a person moves, we can see both the physical video and an overlay of what’s going on with his or her posture and force over time. While seeing the person do their task in space and time, we’re capturing their force and posture. That becomes really powerful. We can do that over time, creating a dashboard for different tasks, and then give an employer a prioritized list of the most egregious tasks, where high force and high repetition are most likely to generate a musculoskeletal disorder. So, biosensors with a dashboard and video overlay are very powerful in exposure documentation.


E: Can you talk about some of your recent biometrics and exoskeleton projects?

R: Well, anybody familiar with meat processing knows that it’s a very high endurance, high repetition task impacting the upper extremities and back. It’s static stress on the legs, leaning, twisting and bending at the waist, and moving your arms to process meat. Every part of the body is impacted; the repetition is so intense that you’re moving a carcass every 2 seconds. You’re constantly moving, standing in one place doing the same motion over and over, and you’re usually working a 10-hour shift, 6 days a week. Operations, safety and HR know it’s a difficult task but to change the process is very expensive. You might have to move the conveyor circling the entire plant or slow it down, which operations won’t like. Or, you’re going to have to build adjustable stanchions for people to stand up on. Oftentimes in fixed manufacturing plants, it’s very difficult to change the physical process, so we look at other interventions. The biosensors give us data on where the most difficult task/positions are and where management can spend their nickels to make the best impact. You can give them engineering solutions but if they don’t have the money for re-engineering there are alternative solutions like endurance and fatigue management or job rotation, or even just ongoing stretching throughout the day. You mitigate the exposure if you can’t eliminate it. We look for engineering solutions first, but older plants especially have a hard time putting those automation or engineering changes in place.


E: How are you measuring the ROI of the different solutions you’re implementing? What are the KPIs you’re looking for?

R: Primarily, I look at two things when it's workers' comp-related: loss frequency rate, the number of injury accidents per hundred employees (for example, how many strains and sprains we have per task before and after a solution is implemented), and average cost of claim, i.e., how that cost changes after the solution is implemented. We try to reduce both the frequency and severity of loss.

Here's a good example: One 24-hour plant of 400 employees had 50 visits to the nurse every day looking for splints, gauze wraps, and other assistance. You know that the more times people are going to the nurse, the greater the likelihood you'll eventually have a claim. We implemented endurance/fatigue solutions and then looked at the number of individuals visiting the nurse; in some tasks the number dropped by 80%. That's telling, because it takes a while for the claims numbers to mature enough to give you statistically significant results. If I see a change over a short period, is it just that introducing the solution made everyone more aware? Eighteen months is about where you have to be to really see a material change in losses. So, we look at other metrics like online perception and symptom surveys. I've used therapy to reduce endurance and fatigue injuries, and after each session we give a quick survey asking how the person felt before and after the fatigue management program. We can then see if we're going down the right road and match up the results to the loss analysis in the future.
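The two workers' comp KPIs Ron names, loss frequency rate (injuries per hundred employees) and average cost of claim, are simple arithmetic. A sketch with illustrative numbers (not client data):

```python
def loss_frequency_rate(claims, employees):
    """Injury claims per 100 employees, per Ron's definition."""
    return 100 * claims / employees

def average_cost_of_claim(total_incurred, claims):
    """Total incurred losses divided by claim count."""
    return total_incurred / claims

# Before vs. after an intervention at a hypothetical 400-person plant:
print(loss_frequency_rate(24, 400))        # 6.0 claims per 100 employees
print(loss_frequency_rate(15, 400))        # 3.75 after the intervention
print(average_cost_of_claim(480_000, 24))  # 20000.0 average cost per claim
```

Tracking both numbers matters because an intervention can cut frequency while severity stays flat (or vice versa); as Ron notes, the claim data needs roughly 18 months to mature before the change is statistically meaningful.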


E: RFID devices, body-worn (biometric tracking) wearables, and exoskeletons—which category is most mature and deployable today?

R: Posture monitors. The inclinometers and GPS are the most robust and have some of the best software. RFID is good but you have to understand what the exposure is and what end result you’re trying to get to. RFID chips are really good in environments like construction, where it’s noisy, dark and dusty and vision and hearing are impaired. RFID chips give me another sense to avoid exposure. It can also be used for equipment or where people are working very remotely, to see where somebody is working in a plant and where they’ve been. But posture monitors are probably the most robust in terms of software because, again, everyone’s trying to mitigate the strain and sprain injuries. Industrial hygiene (IH) exposure doesn’t have the same frequency of loss as strains and sprains and has been controlled very well over the last 20 years; it’s gotten a lot of attention and there are so many good programs in place.


E: Is ergonomics slightly newer?

R: Ergonomics has been developing since the mid-80s, but it's interesting that we haven't found a silver bullet solution, so we've done a lot of training. Office ergonomics got a lot of attention. 'Ergonomic' became a buzzword and a marketing ploy, and now a lot of equipment is considered 'ergonomic.' For example, you can buy a snow shovel that's "ergonomic," but the actual exposure to the individual hasn't really changed. Carpal tunnel syndrome was huge in the late 90s and early 2000s; then the Mayo Clinic and other studies found that the aging workforce was driving CTS more than typing. Today in the workers' comp arena, an individual's physical condition can be as much a factor in injury development as the workplace exposure. The comorbidity or illness can make a simple injury so much more difficult to diagnose and treat, and this is why wellness and ergonomics need to be considered together. Wearables are helping us communicate exposure to the operations managers who usually hold the intervention purse strings. Ergonomists haven't done a great job of this in the past, but the biosensors give us data on an individual or task basis that is very telling for operations, human resources and safety teams.


E: How have employees received wearables? What has been the feedback? Are there privacy concerns and how are you dealing with that?

R: A lot of the biosensors are applied directly to the skin, and managers are very skeptical or skittish about that. So, in looking at which wearable is going to work for a company, you have to consider the unions, female workers, people who don't speak English, etc. You have to think about having an interpreter, whether someone will have an allergy to the spray or adhesive used to attach the biosensor… What if you have a lot of hair on your back? Part of my focus is always communicating these considerations to the risk manager: given the exposure model they face and the loss profile they have, which tasks are driving the losses, what's the exposure model for the people doing those tasks, and what are the right biosensors to use to fit their organization's culture.


E: Are they more receptive if the sensor is in a bracelet?

R: You get better, deeper data, especially from a force standpoint, if you can attach something to the skin. If you can't, you have to use a Holter-style monitor around the chest or a belt-worn device, something on the biceps if the upper extremities are the issue, a bracelet on the arm, etc. That's why it's important to know the loss profile and exposure model for the risk before adopting a wearable product: what tasks are driving loss and what options the company is willing to consider for solutions.


E: What is your hope for the future as for how wearables are developing? What’s a big development you’d like to see?

R: Right now, biosensors are really good at looking at exposure, giving us a dashboard and helping us come up with solution options. Of course, you need to know what's available and understand the organization's culture; but we're not using biosensors to their full effectiveness in the hiring and screening process or in post-loss injury management. In WC, early objective medical diagnosis is critical to managing loss costs, especially with strain and sprain injuries, and biosensors can be a substantial benefit in that area, including in developing telemedicine programs. We're also not always closing the loop between risk management, safety, HR and operations in terms of exposure understanding and the implementation of interventions. How many workplace tasks are designed with the idea that there will be one, two or three injuries per year in that task? None, yet we accept those types of metrics as part of the cost of production. We're collecting good loss and exposure data but not integrating safety intervention into the process the way we do with quality. Biosensors give me the detailed exposure information I need to express the business and human cost and help qualify the rationale for the interventions needed to reduce exposure. If I can provide detailed documentation of exposure, I can communicate better to risk management so they can do a better job of funding exposure-reduction solutions, and provide the insight for stronger diagnosis, treatment and return-to-work practices if a loss occurs. You'd be amazed how many loss profiles show repeat injuries, which get exponentially more expensive. Biosensors can therefore have a significant impact in all three areas of the WC exposure control model: hiring, screening and deployment; prevention and training; and post-loss injury management.

 

Photo credit: Lara de Oliveira Corselet dB via photopin (license)

Ron will present a case study at the upcoming 6th Enterprise Wearable Technology Summit on September 19.

Ditching the robot, training manual, pills and stock room for XR & wearables

4 Recent Use Cases of Wearable Technologies in Enterprise

 

When a worker in a robotic suit is better than a robot – Boeing

Industrial enterprises have been experimenting with robotics to replace humans in physically strenuous and repetitive tasks, but there are certain complex tasks that cannot be automated. One such job that only a skilled human can perform is wiring a Boeing 777. Boeing has received a lot of attention for using Augmented Reality in this area of assembly. In addition, the aerospace giant has been testing industrial exoskeletons for this process—work that is both too complex for a robot and poses a risk of injury.

Boeing sees the sweet spot for exoskeletons in cases where the safety risk cannot be designed or automated out of the process. Installing overhead electrical wiring certainly qualifies, so Boeing technicians may ultimately use both smart glasses and robotic suits for heads-up instructions and fatigue prevention. Though the company has yet to deploy exoskeletons on the factory floor, pilot programs are helping Boeing determine which models are best for which jobs. Moreover, exoskeletons have become more realistic since 2012, when Boeing began evaluating the technology, with several startups now offering lightweight devices under $5,000.


Teaching a new dog old tricks with XR – Honeywell

A major driver of industrial AR/VR adoption is the skilled labor shortage. As baby boomers retire and leave the workforce, the generation of workers replacing them tends to change jobs frequently. For organizations facing a critical “information leak,” it’s a waste of resources to train a millennial for a technical role he or she will move on from in a few years.

Honeywell, a multinational engineering, industrial and aerospace conglomerate, is hoping to reach millennials and close the skills gap with mixed reality. The Honeywell Connected Plant Skills Insight Immersive Competency is a cloud-based simulation tool that uses Microsoft's HoloLens to simulate various training scenarios for Honeywell's C300 controller. The solution allows new employees to safely train for events like cable and power supply failure, and measures the training's effectiveness on plant performance through data analytics.

In testing, this method of interactive, on-the-job training improved skill retention by up to 100% compared to passive, classroom-like learning and reduced training time by up to 150%, per Honeywell's figures. For a generation raised on digital content and interactivity in education, sheets of paper, check boxes, etc. don't work. Honeywell understands that with the boomers' exodus, the old systems of industry need updating to better align with millennials' lifestyles.


Not another drug cocktail – The Travelers Companies

The insurance company is collaborating with Cedars-Sinai (a healthcare organization), Bayer (a pharmaceutical company), appliedVR (which creates Virtual Reality experiences for healthcare patients), and Samsung to test a non-pharmacological "digital pain-reduction kit" for managing workplace musculoskeletal injuries. The kit consists of a Samsung Gear VR headset, a Samsung Gear Fit2 wearable fitness tracker, therapeutic VR content powered by biosensors (the appliedVR solution), and a nerve stimulation device by Bayer for relieving lower back pain.

Recent research led by Travelers, Cedars-Sinai and appliedVR demonstrated that VR can reduce pain in hospitalized patients and provide an alternative to opiates. The goal of the new clinical study is to “improve outcomes for injured workers by leveraging state-of-the-art technology.” Travelers is interested in discovering new, drug-free solutions for pain management to help its customers support injured employees, lower the chance of opioid addiction, and reduce medical costs.


The perfect blend of online and store shopping – Macy’s

Though it sounds counterintuitive, the department store chain plans to use both virtual reality and e-commerce to improve its brick-and-mortar sales. Macy’s announced it will bring VR furniture sales tools to 50 stores by the summer, with the vision of offering immersive shopping technology in as many of its stores as possible in the future.

Macy's VR Showroom is powered by Marxent's 3D Cloud service. Customers use an iPad to add and move furniture around a room and, once satisfied with the arrangement, don an HTC Vive headset to experience the finished, fully furnished space.

In the pilot phase, VR made users feel more confident in their furniture choices. Not only did they buy more, but they bought items that Macy's carries that may not be available on-site at every store. That's the beauty of VR in furniture, automobile and other high-end sales: you can sell more goods with less physical retail space to showcase them. After rolling out the solution, Macy's will be able to offer furniture departments in more locations.

 

The 5th Annual Enterprise Wearable Technology Summit 2018, the leading event for enterprise wearables, will take place October 9-10, 2018 at The Fairmont in Austin, TX. EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. For details, early confirmed speakers and preliminary agenda, please stay tuned to the conference website.


Augmented World Expo (AWE), the world's largest conference and expo dedicated to Augmented and Virtual Reality, is taking place May 30-June 1, 2018 in Santa Clara, CA. Now in its 9th year, AWE USA is the destination for CXOs, designers, developers, creative agencies, futurists, analysts, investors and top press to learn, inspire, partner and experience first-hand the most exciting industry of our times.

Manufacturing 4.0: Checking In with Expert Peggy Gulick of AGCO

A true enterprise wearable tech pioneer, Peggy Gulick, Director of Digital Transformation, Global Manufacturing at AGCO Corporation, spearheaded one of the most successful use cases of Google Glass in enterprise to date. Where others saw challenges, Peggy and her team saw opportunities to turn a device that was then (2013) struggling to find a purpose into a powerful lean manufacturing tool. We last interviewed Peggy in July of 2016, before she first graced the EWTS stage. Since then, AGCO has become a poster child of Glass Enterprise, the second generation of Google Glass developed with the input of enterprise visionaries like Peggy; and Peggy herself has become a star speaker, her story undoubtedly inspiring many others. Below, Peggy answers our questions about the state of manufacturing today:

 

BrainXchange: What are the greatest challenges faced by manufacturers today?

PG: All manufacturers that I have spoken to seem to face similar challenges: rising employer costs (many related to healthcare) and the need to reduce operational costs while projecting longer-term strategic plans. In addition, the expectations placed on employers by employees and the communities they operate in have changed. Employees expect more from their employers, including a sense of purpose. Communities expect both social and environmental contributions.

In the midst of this, there is a gap between the qualified labor available and the high-tech skill sets required to meet new operational budgets and strategic plans, to increase quality, and to reduce time and cost to market.

Automation, Industry 4.0, the Internet of Things and big data are all being touted as responses to these shared challenges, yet most organizations have not figured out how to incorporate them into current business processes. Although these new technologies can provide relief to manufacturers, they continue to face a perception challenge: being seen as replacing rather than augmenting human workers.


BrainXchange: What are the effects of automation and big data in manufacturing?

PG: Currently, there are two types of companies benefitting from big data. One is, of course, big data companies, ranging from expanded infrastructures to storage, management, processing and analytics of massive amounts of collected and stored information. The second is the strategic few organizations that have found ways to incorporate the data into problem solving and to deliver the right information to the critical point of decision making. By treating big data and automation as dependent and collaborative solutions, both as drivers of continuous improvement and lean manufacturing processes, we have been able to determine the elements that are most likely to impact outcomes that matter the most –to our product and process quality, productivity and safety. Big data, unless transformed into actionable information, is meaningless.


BrainXchange: Is AGCO experiencing a “skilled labor crunch?”

PG: Yes, but we are addressing it through investment in our employees, both current and potential (apprentices). Mechatronics, an assembly academy, scholarships and on-the-job training, combined with a work environment that allows employees to contribute and feel a sense of purpose, have allowed us to retain and recruit successfully. Our employees are motivated by the organization's concern for quality products/processes and employee safety, not cost-reduced workforces.


BrainXchange: How might smart glasses and Augmented Reality help address some of the above challenges?

PG: Smart glasses and augmented reality have been deployed in our manufacturing operations to further our continuous improvement efforts across the site. The use of wearable technology helps eliminate motion, over-processing, defects and even transportation. Excessive travel to workstations to retrieve work instructions and bills of material is eliminated. Defects are minimized thanks to comprehensive (pictures, videos), easy-to-access work instructions. Our plant makes highly complex, low-volume agricultural equipment. Wearable tools help minimize over-processing caused by the need to rework due to misguided assembly. When workers can do their jobs smarter, faster and safer, it resonates throughout the entire culture. As we experience labor crunches, it is more and more important for companies to offer the tools and training required to create, grow and retain their employees. Smart glasses have helped us do that.


BrainXchange: What tools do AGCO workers currently use to do their jobs? How are new workers currently trained?

PG: All of our assembly and assembly quality gate employees attend 40 hours of Assembly Academy followed by 40 hours of Lean Work-cell training. In addition to reading blueprints and interpreting supplemental information, assemblers must be proficient with hand, power and assembly tools. Since employees are now expected to use wearable tools, including smart eyewear (Google Glass), to access work instructions and quality checklists, wearable tools are introduced immediately in the learning academies. Wearable tools not only inform but also capture and flow pertinent information (including pictures, text and video) for non-conformance issues and missed thresholds.

It was critical to the success of wearables to acknowledge that all employees are not equal in training and skills. As employees’ skills mature, specific to operations, our wearable applications allow for personalized levels of instructions, tailoring them based on algorithms of training and experience.

The wearable tools themselves are easy to implement and support. Most employees are excited to wear the technology and realize the benefits quickly.


BrainXchange: Where do you see the greatest opportunities for smart glasses in the manufacturing plant?

PG: Our product design team finds great value in virtual reality glasses. Not only do they broaden the ability for a team to “see” what others are thinking, but they allow design teams to remotely interact, all in virtual glass, all seeing the same product and projected design strategies.

As a problem-solving organization and culture, we have weighed the value of wearable smart glasses in many areas, including welding, paint preparation, assembly, quality, technical services, material management and even plant tours. The first thing that we have discovered is that the projected value of replacing current tools, whether it be paper work orders or terminal work instructions, with smart glasses is 2x what we initially thought. The results have been so beneficial in some areas that we have retested, thinking it was a mistake. It is important to note that every pilot we have conducted has been in response to a defined problem. And, after 5 whys, fishbones and cross-functional involvement, sometimes even a kaizen, smart glasses are a part of the proposed solution with metrics associated. Knowing that smart glasses are a lean tool, and not an industry requirement or cool factor, we have reported 30% reduction in processing times, 50% reduction in amount of time employees train on the job (new hire and cross functional) and reduced quality and safety incidents that we are still calculating. The greatest value for the glasses has been in assembly and quality, both needing easy and quick access to hands-free instructions. As a manufacturer of complexly configured products, we have discovered that training by smart glasses is the grand slam. New product launches, multi-operation and new hire training are easily administered and audited for success.


BrainXchange: How do smart glasses further lean manufacturing?

PG: Simple. Lean is all about waste elimination. Smart glasses, when implemented for the right reasons, reduce waste. The use of wearable solutions was discovered as we did what we do best every day: solve problems (4,873 problem solutions implemented by employees in 2016).

Introducing Google Glass to our manufacturing floor was not intended as disruptive technology or even competitive advantage. The glasses were introduced as solutions to make employees' jobs easier and safer while driving higher quality into our product and our processes. In the end, we have accomplished both.


We are delighted that Peggy will be speaking again at EWTS 2018 this October, and cannot wait to hear how AGCO’s Google Glass success story has progressed. 

 

The 5th Annual Enterprise Wearable Technology Summit 2018, the leading event for enterprise wearables, will take place October 9-10, 2018 at The Fairmont in Austin, TX. EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. For details, early confirmed speakers and preliminary agenda, please stay tuned to the conference website.


Augmented World Expo (AWE), the world’s largest conference and expo dedicated to Augmented and Virtual Reality, is taking place May 30-June 1, 2018 in Santa Clara, CA. Now in its 9th year, AWE USA is the destination for CXOs, designers, developers, creative agencies, futurists, analysts, investors and top press to learn, inspire, partner and experience first-hand the most exciting industry of our times.

 

Photo credit: Google X

Virtual Reality and Production Ergonomics

The big automobile manufacturers have been using Augmented and Virtual Reality both to design cars and to sell them in dealerships, but there is a step between design and sales where AR/VR can be employed: production. I’m not talking about auto workers wearing AR glasses to view assembly instructions overlaid on vehicles as they come down the line, but rather using VR to simulate the assembly process itself and ensure the safety and comfort of those who carry it out.

The ergonomics of each step of putting a machine together – how a worker must move throughout the assembly process – is not something we often think about, yet poor production ergonomics are a source of inefficiency and can carry a high physical cost for employees (plus associated financial costs for the company). It is difficult to assess the ergonomics of a process like automobile assembly, or the building of other large-scale manufactured products, by examining a schematic or onscreen model. And once the vehicle or machine hits production, major changes become too expensive to make unless they affect product performance or pose a serious threat to the end user.

Virtual Reality can help optimize production by detecting problems that might arise during assembly. In a virtual world, you can test out and analyze different scenarios and methods as they would occur in the real working environment, without staging physical mock-ups or putting actual workers at risk. You could assess the dimensions and kinematics of tools and equipment in relation to the human worker, adjusting them to the worker’s biomechanics and physical attributes, making sure all necessary tools are within reach, and confirming that their use will not require unsafe leaning, twisting, reaching or bending. You could even recreate the entire manufacturing plant to factor in (and ideally position) elements that might impact the process you’re designing, including other production cells, moving vehicles, and even lighting and air conditioning.
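As a rough illustration of the kind of check a VR ergonomics review automates, here is a minimal Python sketch of a reach-envelope test. The shoulder position, reach radius and tool coordinates are invented example values, not ergonomic standards; a real review would use tracked joint data and validated assessment criteria.

```python
import math

def within_reach(tool_xyz, shoulder_xyz, max_reach_m=0.65):
    """True if a tool position falls inside the worker's comfortable
    reach sphere centered at the shoulder (radius is an example value)."""
    return math.dist(tool_xyz, shoulder_xyz) <= max_reach_m

def flag_layout(tool_positions, shoulder_xyz=(0.0, 1.4, 0.0)):
    """List tools that would force unsafe reaching for this worker."""
    return [name for name, pos in tool_positions.items()
            if not within_reach(pos, shoulder_xyz)]

# Hypothetical workstation layout (meters, worker at the origin):
layout = {
    "torque_wrench": (0.3, 1.2, 0.2),   # close to the worker
    "rivet_gun":     (1.1, 0.4, 0.5),   # low and far: a reach/bend risk
}
print(flag_layout(layout))  # -> ['rivet_gun']
```

In a real VR review, the same comparison would run continuously against motion-capture data rather than static coordinates.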

Essentially, VR enables manufacturing experts to “rehearse” assembly from the human worker’s perspective, and real enterprises are seeing benefits like increased efficiency and fewer injuries. Read on to learn how Ford and John Deere are using Virtual Reality and motion detection technology to make life easier and safer for assembly line workers:


Ford

In 2015, Ford reported that it was outfitting employees with Virtual Reality headsets and motion capture body suits to refine the design of future production lines. Ford researchers simulated the assembly process for upcoming vehicle models, running through the steps with a real person and 3D-printed (partial) mockups in a virtual workstation and using the data to spot production challenges, assess physical risks, and design ergonomic tooling years in advance of actual assembly.

Motion capture revealed how an employee would need to move to complete an assembly task: what degree of muscle strength, joint strain and body imbalance would be involved. A head-mounted display was used to study the feasibility of tasks for workers on the line, and 3D-printed models to look more closely at things like maneuvering and using tools in tight spaces.

The results influenced production decisions and affected vehicle design components and parts, helping to reduce the number of injuries by as much as 70%. 


John Deere

The machinery manufacturer probably best known for its tractors uses Virtual Reality in the design process to understand and optimize both how future operators will use its machines and how employees will assemble them.

Engineers tasked with determining the assembly feasibility of new machines conduct VR reviews during product development. In these reviews, an operator wearing a VR headset and using motion-tracked tools is fully immersed in a virtual work environment. Evaluators look on, seeing the user’s view on a screen, to assess things like whether the worker has enough visibility; whether her body interferes with the machine; whether she has to assume an awkward posture or reach for a tool during assembly; whether the tool fits where it has to go (in a particular area or part of the machine); etc.

In this way, high-risk processes are identified and machine designs tweaked before they can cause real injury in the factory and delay manufacturing.


Manufacturing design engineers obviously think about a product’s look, and we know they focus on user experience; but the way new products are designed also affects how they are built. Designers must therefore take into consideration the actual assemblers and any potential safety pitfalls or impracticalities they might encounter in putting a product together. Virtual Reality presents a powerful tool for simulating what it takes to build a machine and refining a design to make assembly itself more ergonomic and streamlined, thereby building safety and a layer of efficiency right into the production line.

 


 

photo credit: justraveling.com Virtual Reality at SXSW 2017 via photopin (license)

Using HoloLens for Design and Asset Visualization

Just as there is a disconnect in designing three-dimensional structures and spaces on two-dimensional screens – and in executing and arranging 2D designs in real space – there is a disconnect in taking multiple data sets and real-time data streams in different formats and attempting to identify patterns and insights to apply in the real world. Architects and designers have been first-movers when it comes to using Augmented and Virtual Reality technologies in the design process; but there are other professions that call for digesting complex information, understanding complex situations and environments, and planning with moving parts. Below are three enterprise scenarios in which Microsoft’s HoloLens Mixed Reality headset is used as a design and asset/data visualization tool:


Workspace layouts (office space, shop floor, job site, store…)

Polamer Precision, an aerospace manufacturer, has been using Microsoft’s HoloLens Mixed Reality headset to map out its manufacturing “cell” layout. In Mixed Reality, users can test out positions for workstations and tooling and ensure that forklifts and other equipment will have room to operate. Imagine walking into a real-world environment like a job site – perhaps the site changes with every job – and having the ability to view holograms of the machines, vehicles, tools and human workers that will need to be brought on site to get the job done. It takes the guesswork out of the planning process and helps avoid costly delays.

Stryker is another company using HoloLens in this way—in hospitals. The medical device company sells equipment for hospital operating rooms, helping its clients figure out ideal arrangements of equipment to create state-of-the-art ORs.

In a typical hospital, multiple practitioners from different surgical disciplines share a single OR. Figuring out how to install the equipment is not just about fitting all the items into the room; the layout also has to be practical for every doctor (and nurse) that will need to move around and operate there, not to mention safe for patients.

Instead of having all stakeholders physically present to work this out or manually moving around heavy (and expensive) equipment to test out different configurations, Stryker has been using HoloLens and its own By Design software to build and modify possible OR scenarios with holograms. AR brings Stryker’s portfolio of digital 3D assets to life, allowing for better and faster collaboration.


Financial planning

When you hear “wearable tech in banking,” you probably think of contactless payments; but financial services companies are exploring Augmented Reality for wealth management and the trading room, as a tool for interacting with large quantities of complex data and advising clients remotely. In Spring 2016, Citi – working with 8Ninths – became the first bank to reveal a proof of concept for AR in stock trading.

In the YouTube demo, you see a Citi trader no longer confined to the trading desk. He checks the news on the traditional 2D monitors that flank his workstation before putting on HoloLens. Using the holographic trading tools via voice commands, he “sees and quickly assesses a dynamic, 3D visual snapshot of what’s happening in the market right now.” Noticing a lot of activity in one sector, he has an idea for a trade that he shares remotely with a client.

In Mixed Reality, the Citi trader is better able to monitor, analyze and manipulate real-time market news and trends, bids and offers, etc., unimpaired by a lack of screen real estate. It’s digital downsizing: from the six to eight monitors that compose a typical trader’s workspace to two monitors and a headset opening up an entire interactive trading world.

A year after Citi’s demo, FlexTrade, a provider of execution and order management trading systems, announced its augmented reality trading application for HoloLens. The app, called FlexAR, offered “a new way of visualizing and presenting trading,” with an interactive order blotter, trade ticket, and charting. Traders could make faster and better data-driven decisions by viewing and interacting with stock prices, volume, profit and loss, and other complex data in a virtual space. AI would add contextual information, identifying key elements like price or volume changes in real time and automatically bringing up information about a specific company or trade in consideration. Interestingly, FlexTrade found Virtual Reality too immersive for the use case.


A holistic view of enterprise operations

Air transport IT provider SITA, with Helsinki Airport, has been exploring the potential of Mixed Reality for airlines and airports. In a study, SITA Lab simulated the airport operational control center (AOCC) in MR, interfacing multiple data sources to produce a unique and dynamic, 3D view of Helsinki Airport’s operations. Operators wearing HoloLens could see and interact with dashboards of real-time operational data like passenger location, security wait times, flight statuses, gate information and retail analytics; correlating events to gain insights for better managing the airport’s operations.


These examples show possibilities for HoloLens beyond a machine overlay for maintenance and training. In ever-changing environments like busy airports and hospitals, Augmented Reality is superior to the 2D tools we currently use to view (and make decisions about) complex situations with a lot of moving parts. And for managing the complexity of data that makes up the world of banking and finance, AR is an unparalleled medium that turns data points into digital content with which users can engage.

 


photo credit: dalbera HoloLens, le casque de réalité mixte de Microsoft (Futur en Seine 2016 Off, Paris) via photopin (license)

2017 was the year of store closings: Can the Internet of Things solve retail’s woes?

The Internet of Things (IoT) is a network of Internet-connected objects that have been made “smart” with embedded sensors. These devices collect data about the physical world around us, about processes and the health and behavior of people and machinery—data which is then interpreted and shared with other machines and people via cloud-based software platforms. In retail, IoT solutions can improve in-store operations, optimize supply chains, deliver better customer experiences, and generate new revenue streams.    


2017 was the year of store closings. By year’s end, major retailers will have closed or announced plans to shutter over 8,500 stores. That’s in addition to the many brick-and-mortar retail bankruptcies, as e-commerce giants and fast-fashion brands threaten traditional retail business models. How to solve retail’s woes? Is the future bleak for the industry? Has online shopping won, or might the physical store survive with a little TLC and IoT?

Consumers today expect the shopping experience to be seamless across all of a retailer’s channels. That includes the brick-and-mortar store, e-commerce site, mobile app and even telephone customer service. So, when we talk about the Internet of Things in retail, we’re not just referring to the connected store of the future but rather to a system of sensors linking every aspect of a retailer’s business far beyond “buy online, return in store.”

What kinds of things can be made smart in retail? You may be familiar with beacons, RFID tags, NFC payments, and QR codes. Other retail technologies include in-store infrared foot-traffic counters, source-tagged SKUs, mobile device tracking, digital signage and kiosks, and classic video surveillance. There are electronic shelves that detect when inventory is low; proximity sensors that could be synced with digital coupons or AR cues activated via shoppers’ smartphones; and, potentially, interactive digital signs that tailor promotions to the person standing before them.

Sensors are being incorporated today into product packaging to monitor the quality of perishable goods, and digital price tags are enabling dynamic pricing, where item prices are changed in real time to reflect the most current trends in demand, inventory level and other data (i.e., surge pricing). On the horizon are self-scanning and self-checkout by smartphone, as well as robots that could aid in areas like product assembly, stock replenishment, and hazardous or heavy materials handling. Robots might even assist in stores by responding to spoken inquiries (voice recognition, AI), helping shoppers browse and locate inventory, and suggesting products based upon customers’ shopping histories. Imagine if a robot knew the last thing you left sitting in your online shopping cart and could inform you of a new reduced price or special sale for that item as soon as you entered the store. And, of course, wearables worn by sales associates and managers, along with those on the wrists of shoppers, are another potential endpoint and data collector in the connected store of the future.
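As a toy illustration of the dynamic-pricing idea, here is a minimal Python sketch that scales a base price by demand and sell-through. The weighting factors, floor and ceiling are invented for the example; real retail pricing engines are far more sophisticated.

```python
def dynamic_price(base_price, demand_rate, sell_through, floor=0.6, ceiling=1.3):
    """Scale a base price by recent demand and how fast stock is moving.

    demand_rate:  current sales velocity relative to the trailing
                  average (1.0 = normal).
    sell_through: fraction of the delivered stock already sold (0..1).
    Weights and clamps are illustrative, not an industry formula.
    """
    multiplier = 1.0 + 0.2 * (demand_rate - 1.0) + 0.1 * (sell_through - 0.5)
    multiplier = max(floor, min(ceiling, multiplier))  # clamp extremes
    return round(base_price * multiplier, 2)

# Hot item, mostly sold out -> price drifts up:
print(dynamic_price(10.00, demand_rate=1.5, sell_through=0.9))  # -> 11.4
# Slow mover, shelves still full -> price drifts down:
print(dynamic_price(10.00, demand_rate=0.5, sell_through=0.1))  # -> 8.6
```

The same function could be driven by live smart-shelf and point-of-sale feeds, pushing new values to digital price tags.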

Let’s see how all of these “things” or potential parts of an Internet of Things strategy could impact retail operations:


In-Store and Supply Chain Operations

The Internet of Things promises to help retailers increase efficiencies through better visibility into their operations and supply chains. Inventory tracking (using RFID, smart shelves, and other sensors) is a good place to start—gaining real-time knowledge of where inventory is located and in what quantity and condition. By knowing exact stock levels, exactly when a replenishment delivery will arrive, and how up-to-date goods are, retailers and their employees can make better on-the-fly decisions to meet customer needs and expectations.

Tracking inventory throughout the supply chain would help avoid out-of-stock situations that hurt customer satisfaction and result in missed store sales opportunities. Data collected in the cloud from different sources could be analyzed and delivered to the right agents, like warehouse managers or store staff, via mobile apps. Inventory tracking would also aid in handling unexpected surges in demand due to unscheduled events, situations in which a retailer must be highly responsive by immediately ascertaining stock availability and acting upon it. Or, you could avoid the issue altogether by linking smart shelves in stores with a warehouse management system to automatically reorder products when store inventories reach a certain level.
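To make the smart-shelf reorder link concrete, here is a minimal Python sketch comparing live shelf counts against per-SKU reorder points; the SKU names, thresholds and order quantities are made up for illustration.

```python
def reorder_check(shelf_counts, reorder_points, order_qty):
    """Compare live smart-shelf counts to per-SKU reorder points and
    emit purchase-order lines for anything at or below its threshold."""
    return [(sku, order_qty[sku])
            for sku, count in shelf_counts.items()
            if count <= reorder_points[sku]]

# Hypothetical live shelf readings and per-SKU policy:
shelf  = {"8OZ-MUG": 3, "1L-JUG": 12}
points = {"8OZ-MUG": 5, "1L-JUG": 10}
qty    = {"8OZ-MUG": 24, "1L-JUG": 36}
print(reorder_check(shelf, points, qty))  # -> [('8OZ-MUG', 24)]
```

In practice this check would run in the warehouse management system whenever a shelf sensor reports a change, closing the loop without human intervention.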

Another means of reducing missed store sales is demand-aware fulfillment, where warehouse automation and robotics are driven by online and in-store shopping demand and inventory levels. By monitoring sales opportunities in real time and automating the movement of goods through the supply chain, you can reduce the chances of missing a sale due to the customer’s desired item being out of stock.

In addition to preventing missed sales, (RFID) inventory tracking can help decrease product loss and shrinkage, and increase accountability at all operational levels. For instance, an IoT solution featuring smart shelves, source-tagged SKUs and security footage would “know” whenever an item is taken off a shelf (instantly updating inventory records) and could raise a red flag if the item weren’t subsequently paid for or accounted for. Goods can also be lost due to environmental factors. Tracing those goods is important for health and safety reasons, but an IoT-savvy retailer would be able to intervene before items “go bad” by using sensors to monitor variables like heat and humidity that impact perishable goods. You could move things around, prioritize one shipment over another, etc., and create an audit trail to identify the cause and/or responsible party.
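The shelf-versus-checkout reconciliation described above can be sketched in a few lines of Python; the SKU names are hypothetical, and a real system would also correlate timestamps and camera footage.

```python
from collections import Counter

def unaccounted_picks(pick_events, pos_sales):
    """Items taken off smart shelves minus items scanned at checkout.

    Returns the surplus picks per SKU; anything left over is flagged
    for review (e.g., against security footage)."""
    picked = Counter(pick_events)
    sold = Counter(pos_sales)
    # Counter subtraction keeps only positive counts.
    return {sku: n for sku, n in (picked - sold).items() if n > 0}

picks = ["SKU-A", "SKU-A", "SKU-B"]   # shelf sensor events
sales = ["SKU-A", "SKU-B"]            # point-of-sale scans
print(unaccounted_picks(picks, sales))  # -> {'SKU-A': 1}
```

A nonzero result doesn't prove theft (the item may simply have been reshelved elsewhere), which is why the flag triggers review rather than action.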

The product is important, as is the employee who sells it—employees are key assets that can be tracked to help a retailer understand how to best manage them, ensuring there is always an employee prepared and in the right location to serve a customer. IoT solutions can reveal whether store associates are responsive to shoppers’ needs in addition to helping them be so by analyzing employee activity data (movement, number of sales, etc.) against factors like store traffic and customer demand and by giving them the information to work efficiently. For instance, employees could use smartwatches or smart glasses to look up real-time stock and other product info on the sales floor, never having to leave the customer’s side to walk over to a computer or “check in the back.” Shoppers needing assistance might also use an app to summon help in stores, with an automatic wearable alert sent to the nearest salesperson.

What else can IoT technologies do to improve store operations? How about managing energy consumption, a major expenditure for large stores, using smart lighting and thermostats? Or enhancing workplace productivity? An IoT solution could tell you how your employees should spend their time (helping customers or doing operational tasks) and where to place them on the floor by analyzing current and past data, including high-traffic areas, seasonal shopping trends, customers’ shopping histories and digital wish lists, delivery and inventory audit schedules, etc. And predictive maintenance isn’t just for manufacturing plants; retail stores have equipment, such as grocery store refrigeration units, that can be fitted with sensors to monitor power consumption, temperature, etc., reduce product loss, and ensure food safety. With electronic inventory tracking, suppliers can be automatically notified when stock needs replenishing, and digital coupons can be offered to shoppers to improve turnover of aging stock.


Customer Experience and New Revenue Streams

Another major application of the Internet of Things in retail is customer service—helping customers to shop more easily and empowering employees to better serve them. In the brick-and-mortar store of tomorrow, (wearable) contactless payments and automated home replenishment could provide ease; robots or even smart shopping carts could help customers navigate aisles; digital marketing could be customized for individuals and different regions; optimal store layouts could be designed in Virtual Reality; and shoppers could try on clothes and select automobile options virtually.

IoT solutions can help retailers connect the physical store experience to shoppers’ digital lives and thereby make store shopping more engaging and seamless. As soon as a customer enters the store, for instance, customized offers and product recommendations could be sent to his or her mobile device based upon that person’s online shopping and browsing history, app usage, and possibly activity level (from a wearable fitness tracker). This is proximity marketing; it can be automatic (BLE beacons) or personalized (NFC tags, QR codes and now also AR cues), leveraging the same type of rich data and advanced analytics that brands use to drive e-commerce, as well as consumers’ own devices.
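A beacon-triggered offer selection of the sort described could look roughly like the Python sketch below; the shopper profile shape, categories, discounts and fallback offer are all invented for illustration.

```python
def pick_offer(shopper, offers):
    """Choose an offer when a beacon detects a shopper entering the store.

    Returns the highest-discount offer matching a category the shopper
    has browsed or bought from; falls back to a generic welcome offer.
    """
    matches = [o for o in offers if o["category"] in shopper["history"]]
    if not matches:
        return {"category": "any", "discount": 0.05}  # generic fallback
    return max(matches, key=lambda o: o["discount"])

# Hypothetical shopper profile (from app/browsing history) and offer pool:
shopper = {"history": {"shoes", "outerwear"}}
offers = [
    {"category": "shoes",   "discount": 0.15},
    {"category": "grocery", "discount": 0.30},
]
print(pick_offer(shopper, offers))  # -> {'category': 'shoes', 'discount': 0.15}
```

The point is the shape of the decision, not the rule itself: production systems would rank offers with the same predictive models that drive e-commerce recommendations.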

Shoppers are more likely to download a retailer’s app (and give up their location) if it gives them access to those kinds of promotions or even the power to scan items and call up product reviews, find their size at other store locations…anything that makes the in-store experience as frictionless as the online one. That same rich data and instant product info can be leveraged by store associates who’ve never had access to it before; to identify loyal customers, incentivize undecided shoppers, upsell products and provide overall better service.


The possibilities for the Internet of Things in the retail industry are great. Retailers can use IoT technologies to collect real-time data on shopper behaviors, but a decision management system is necessary to orchestrate action (automatically), whether that’s directing an associate (via wearable) to assist a customer or provide relief in a busy area of the store; remotely changing prices or tailoring an in-store display; or having warehouse workers reposition inventory elsewhere in the supply chain.

 


 

photo credit: arbyreed Cold Drinks via photopin (license)

Virtual Reality in Enterprise? Big Brands Say Yes

Sources say Augmented Reality will be big for enterprise; Virtual Reality, on the other hand, is expected to have mainly entertainment/gaming applications. The following use cases of wearable technology may seem like consumer use cases, but they’re not. They’re instances of major enterprise organizations evaluating new technologies and finding ways to incorporate wearable tech – not AR glasses but this time VR headsets – to propel their businesses forward. The following companies are not putting wearable devices into the hands of employees who then interact with customers; they’re putting devices – however atypical and modest – into consumers’ hands to build their brand and generate revenue.
Continue reading “Virtual Reality in Enterprise? Big Brands Say Yes”