Exoskeletons Get Real: The Ultimate Wearable Technology?

Exoskeletons are nothing new and far from science fiction; in fact, researchers began developing exoskeletons for military use as early as the 1960s. For the last decade, engineers have been exploring exoskeletons designed to augment human strength and other abilities for military, medical and industrial purposes. Over time, these devices have become less clunky and expensive (from over $100k to under $10k in many cases) as well as more specialized and powerful.

Arguably, exoskeletons, not AR/VR, are the current star of enterprise wearables. Today, real companies with the funds to do so are transforming workers’ productivity and safety by introducing even just a few exoskeletons on the job site or factory floor. Ford recently made the largest order of industrial exoskeletons to date, and ABI Research expects the market to grow from $67.29 million to $1.76 billion by 2028—exoskeletons have arrived.


Exoskeletons find their sweet spot

Combining the power of robots with the intelligence and adaptability of humans sounds great, but the execution – designing machines that conform to how we’re shaped and the way we move – is challenging. How do you make a machine both lightweight and flexible, adaptable to a wide variety of body shapes and sizes? Nevertheless, there are now tool-holding (partial) exoskeletons, exoskeletons for back support and sitting comfortably in mid-air, and even full-body, sci-fi-looking powered exosuits.

No post about exoskeletons can fail to mention the major driver behind the technology’s advancement: workplace injuries. Exoskeleton development has evolved from military and medical applications to industry, where wearable robotics are finding their sweet spot. Particularly ripe for exoskeleton technology are legacy industries like manufacturing, which are changing due to automation but still rely heavily on human input; industries like construction and shipbuilding, in which productivity hasn’t risen in decades; and industries where awkward positions, repetitive motion, and overexertion are common and employers shell out billions on workers’ compensation.

Assembling, building, moving… the heavy-duty operations of industry, in which workers wield heavy power tools, perform overhead tasks, and stand for long periods, breed accidents, injuries and long-term musculoskeletal stress. Workplace injuries cost the U.S. alone $21 billion in healthcare treatment and lost worker productivity. When you consider that a single rotator cuff surgery can take a worker out for up to seven months and cost employers up to $56,000, wearable robot suits don’t seem so crazy. And major industrial players and startups alike recognize the growing opportunity:


The market

Where once there were a handful of companies working on industrial exoskeletons, the exoskeleton market today has become quite crowded. Here are some of the companies developing and selling partial or full-body, powered or unpowered exoskeletons:

Tool-holding

Lockheed Martin

The large defense firm has long been interested in human augmentation, mainly in the military arena. In 2014, Lockheed introduced its first industrial exoskeleton product, FORTIS (currently $24,750). FORTIS, a passive (unpowered), lightweight exoskeleton that transfers loads from a standing or kneeling position to the ground, makes heavy tools like a giant power drill feel weightless to the operator. Lockheed also sells the FORTIS Tool Arm ($7,149), which reduces muscle fatigue to allow the use of heavy hand tools for long shifts.

Bioservo

Bioservo’s first commercial product, the SEM Glove, contained sensors that detected the wearer’s actions and activated motorized support when needed to grasp objects. Based on the company’s patented SEM (Soft Extra Muscle) technology, Ironhand ($9,250) is Bioservo’s newest product and a successor to the SEM Glove. It supports grip-intensive tasks while collecting data to categorize risky use cases, and it can be worn under a normal working glove. Bioservo bills Ironhand as “the world’s first soft robotic muscle strengthening system.” The company has signed contracts with General Motors, Airbus, NASA, and others.


Support 

noonee

With the Chairless Chair ($4,360) by noonee, employees can create a comfortable, more productive workspace at any time. The lower-body exoskeleton is designed to prevent back pain for workers who spend a large part of the day standing by essentially allowing the wearer to lock in and sit in mid-air while doing her work. The Chairless Chair debuted on several manufacturing lines and is now in use globally by over 100 companies. 

Laevo 

The Laevo (approx. $2,000) is a passive back-support exoskeleton for workers who have to frequently bend forward and lift objects. It works by transferring force from the upper body through the straps and to the thighs, thereby reducing pressure on the user’s spine and back by up to 40%. Laevo describes wearing its exoskeleton as “just like” putting on “a coat”—it adapts to your posture so the wearer has a lot of freedom of movement.

StrongArm Technologies

StrongArm’s FLx ErgoSkeleton ($298) is a data-driven upper-body exoskeleton with sensors that monitor posture and movement, providing feedback to ensure the wearer conforms to OSHA safe lifting guidelines. The solution promotes good posture and safe lifting by encouraging the user to bend at the knees and pivot instead of twist. The V-22 ErgoSkeleton ($629) adds cords to the FLx model; these loop over the shoulders and attach to a worker’s hands to restrict arm movements in such a way as to automate proper lifting. The passive exoskeleton shifts weight from the weak areas of the body to the user’s legs and core.

SuitX

SuitX has three models of industrial exoskeletons – backX ($4,000), legX ($6,000) and shoulderX ($4,000) – individual modules that when worn together form the full-body MAX exoskeleton. With backX helping lift heavy loads, legX supporting extended crouching, and shoulderX alleviating overhead work, the full MAX system allows wearers to perform lower back-, leg- and shoulder-intensive tasks with less risk of injury.


Full-body

Sarcos Robotics (Raytheon)

Not yet commercially available, Guardian XO is a robust, powered exosuit that’s said to enable the wearer to lift up to 200 pounds without exertion or strain. The XO features “scaled dexterous end effectors” and force feedback, allowing highly precise tasks with heavy tools or components. Sarcos says the Guardian XO and Guardian XO Max are “coming soon,” and the company recently secured its second development contract with the U.S. Air Force. Sarcos has also formed X-TAG, an industry-focused Exoskeleton Technical Advisory Group, along with executives from Bechtel, BMW, and more.

Comau

MATE (Muscular Aiding Tech Exoskeleton) by Comau is a spring-based exoskeleton designed to ease strain on the shoulder muscles and provide lightweight yet effective postural support during manual and repetitive tasks. Designed in partnership with Össur and IUVO, a spin-off of The BioRobotics Institute, along with input from factory workers, MATE will be available in December 2018.


New entrants

LG

Household name LG is about to unveil the CLOi SuitBot, which looks like a pair of robotic pants and supports mobility by enhancing the power of the user’s legs. The exoskeleton can work alongside LG’s other service robots as part of a more advanced smart workforce scheme, and it uses AI to learn and evolve over time by analyzing biometric and environmental data. LG hasn’t revealed a price.

Ottobock 

Ottobock is a German artificial limb manufacturer whose close competitor Össur helped Comau design MATE. Paexo is Ottobock’s new project, an upper-body exoskeleton that relieves the strain of repetitive overhead assembly work. Paexo has been tested on 30 Volkswagen plant workers and the automaker is considering using Paexo in series production.


The future of manual labor begins now: Use cases

Betting on the promise of wearable robotics to increase productivity and reduce injuries, a number of construction, manufacturing and logistics companies have begun testing and even deploying exoskeletons. Here are some of the more recent use cases:

Lowe’s

Lowe’s employees can spend up to 90% of their day lifting and moving bags of cement, buckets of paint, etc. So, last spring the home improvement retailer teamed up with Virginia Tech to develop a lift-assist exosuit that would make the workday easier. The result: A kind of harness-meets-backpack with carbon-fiber rods running down the back and thighs. The rods flex and straighten when the user bends or stands, absorbing energy that’s then delivered to the worker when needed. During a 3-month pilot, test subjects wore enjoyment-sensing headsets in addition to providing verbal feedback about the exosuits.

The promised benefits are myriad for Lowe’s: Improved customer service (store staff can fetch items for customers), reduced costs (fewer injuries, reduced insurance premiums), and even better recruitment.

Ford 

In 2017, four employees at a Ford plant in Missouri tried out the EksoVest by Ekso Bionics, an unpowered, adjustable exoskeleton vest that can help workers do things like install carbon cans on cars suspended above them at a rate of 70 cars/hour. The United Automobile Workers Union actually paid for the trial to see if exoskeletons could really reduce common injuries among autoworkers.

Ford has been interested in wearable robotics since 2011, particularly for preventing shoulder injuries, which take the longest to recover from. The ROI is there: If one $5,000 EksoVest lasts three years, the cost comes out to 12 cents/hour/employee. That’s around the same price as a pair of disposable gloves and far less than the cost of even just one shoulder injury.
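That per-hour figure depends heavily on the vest’s assumed service life and how many hours it is actually worn. As a rough, hedged illustration (the $5,000 price and three-year life come from the article, but the shift pattern and working days below are assumptions, and different assumptions give very different hourly costs), here is a minimal amortization sketch:

```python
# Back-of-the-envelope amortization of an exoskeleton vest.
# Illustrative only: the $5,000 price and 3-year life come from the article;
# the shifts covered, hours per shift, and working days are assumptions.

def hourly_cost(unit_price, service_years, shifts_per_day, hours_per_shift,
                working_days_per_year=250):
    """Spread the purchase price over the total hours the vest is worn."""
    total_hours = service_years * working_days_per_year * shifts_per_day * hours_per_shift
    return unit_price / total_hours

# One $5,000 vest over three years, shared across two 8-hour shifts per day (assumed):
print(f"${hourly_cost(5000, 3, 2, 8):.2f} per hour of use")
```

Plug in your own utilization and lifetime figures to see how quickly the amortized cost falls as a shared vest sees more hours of use.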

Just last month, following 16 months of testing, Ford went into deployment mode, ordering 75 EksoVests for employees all over the world. This is the largest order of industrial exoskeletons ever placed and the first step in Ford’s plans to launch exoskeletons in factories worldwide.

The EksoVest provides up to 15 pounds of lift assistance and support (per arm) during the overhead tasks Ford assembly line workers perform millions of times a year. Additionally, Ford is testing a motion-tracking bodysuit and camera solution at one of its plants in Spain, with the goal of making data-driven modifications to workstations and vehicle production processes that reduce physical stress.

Boeing

Boeing has been experimenting with exoskeletons for as long as Ford has, looking to address the problems automation can’t solve. Wiring a Boeing 777, for instance – a task so complex only a highly skilled human can perform it – is a perfect opportunity for an exoskeleton. What attracts Boeing to exoskeletons are not only rising insurance premiums but also the possibility of improving the lives of its technicians, who train for years to do their jobs and whose absence or retirement would be a hit to the aerospace giant’s productivity.

Though the technology is still in the experimental phase at Boeing, the company has been running pilots to match the right exoskeleton to the right type of work and studying years of safety data to see where injuries are most likely to occur. Boeing mechanics in South Carolina have actually gone through training on the EksoVest, as Boeing hopes to roll out the tech to more workers in 2019. Apparently, Boeing employees love the exoskeletons.


Challenges still ahead

For every new type of PPE (Personal Protective Equipment) there is a process of adoption, and it’s no different with exoskeletons. The wearable robotics space is evolving fast; prices will continue to fall and the exoskeletons themselves will become lighter and more powerful over the next three to five years, but it takes a lot of testing! A good sign is the interest of ASTM International, a standards-setting body that has created a special committee of 90 organizations focused on exoskeletons and exosuits. Just as walking in areas of a job site without the proper PPE is forbidden, one day workers on construction sites and in warehouses and manufacturing plants will be forbidden to operate tools without the appropriate exoskeleton.

 

The Enterprise Wearable Technology Summit (EWTS) is an annual conference dedicated to the use of wearable technology for business and industrial applications. As the leading event for enterprise wearables, EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. The 5th annual EWTS will be held October 9-10, 2018 at The Fairmont in Austin, TX. For more details, please visit the conference website.


Augmented World Expo (AWE), the world’s #1 AR+VR conference and expo, comes to Munich, Germany on October 18-19, 2018. CXOs, designers, developers, futurists, analysts, investors and top press will gather at the MOC Exhibition Center to learn, inspire, partner and experience first-hand the most exciting industry of our times. Tickets now available at www.aweeu.com.

 

Image source: Digital Trends

Let Your Customers and Workers Choose the Right XR Use Case for You

Here’s a common misconception: The more robotics and Artificial Intelligence (AI) advance, the more expendable human beings become in the workplace.

Although Forrester Research predicts automation will displace 24.7 million jobs by 2027, it’s irrational to fear that robots will ultimately replace all human workers. For as robotics and AI improve, so do technologies for empowering human workers. I’m talking about wearable technologies like augmented and virtual reality headsets as well as wearable robotics (exoskeletons) that enable humans to work longer, quickly train for new jobs, and perform in sync with automation. You could even argue that as automation progresses, human workers will become more indispensable to enterprises—while robots may assume the dangerous and repetitive aspects of work, unmanned technology won’t be able to address every productivity issue or match distinctly human capabilities like dexterity and imagination.

When it comes to embracing disruptive technology, successful organizations take a “user is king” approach, identifying pain points in the business directly from the source, i.e. the workers or customers who are expected to use or benefit from the technology. Whether it’s getting a group together for a brainstorming session, including members of the workforce in the proof of concept stage, or simply encouraging a company culture where employees feel comfortable sharing their ideas with leadership, there is no one better than the user herself to determine where and how to digitally transform.


“Treat employees like they make a difference, and they will.” – SAS CEO Jim Goodnight


Two companies have gone beyond merely asking for user input: KLM Royal Dutch Airlines established a physical hub to foster workers’ original ideas for using emerging technologies, while Lowe’s went directly to the customer, applying “young” immersive tech to age-old home improvement shopping challenges. Essentially, KLM and Lowe’s are letting their employees and customers come up with the use cases in which they’re investing.

KLM Royal Dutch Airlines

In 2016 at its Amsterdam Airport Schiphol East base, KLM Royal Dutch Airlines opened its Digital Studio, a creative space where workers from all areas of the airline’s business are encouraged to come and innovate. Here, employees can put forward ideas on how to use digital technologies like AR and blockchain in their work, and see their ideas fast-tracked into development and then, hopefully, into practice.

The Digital Studio, which currently has room for 200 workers, is based upon Dave West’s Scrum Studio concept of an environment where high-performing teams, physically separated from the main business, can fast-track projects. It’s very hard to change large legacy companies like KLM from within: The larger the organization, the higher the chances of disruptive technologies ending up in pilot purgatory and innovation suffocating in red tape between divisions and levels of management.

Though most of the current projects at KLM’s Digital Studio are still in the experimental stage, a handful have turned into practice. The studio has embraced KLM employees of all different backgrounds and roles, who may not have otherwise had the opportunity to take their transformative ideas further. Take Chris Koomen, who was stationed in KLM’s engineering and maintenance division: Chris had an idea for using VR, so he joined the Digital Studio and has been a part of integrating VR for training aircraft crew. Another idea pitched by a KLM mechanic involves using AR in aircraft and engine maintenance.

Every four weeks, the Digital Studio hosts a demo of its current projects for interested observers. The lesson here: don’t hide emerging tech in a lab unless you’re going to let the user in. Show employees what’s out there, give them resources, and let those who perform the job every day tell you how to transform the business.


“The customer experience is the next competitive battleground.” – Jerry Gregoire, former VP & CIO of Dell


Lowe’s

Despite the impression one might get from HGTV, building things is not easy for the non-professional. Planning a home improvement project, shopping for building materials, executing the project…what’s most difficult for the average consumer, even a hardcore DIY-er, is visualizing the final product. But it seems a solution has finally appeared in the form of XR (AR, VR, MR), and all the major home improvement brands recognize the potential. There are now apps for virtually measuring your surroundings and picturing all kinds of design options and home products in your real space. And it’s not just the Lowe’s and Home Depots of the world—architects and engineers have seized upon VR to help clients visualize new structures, real estate agents are giving virtual home tours, and even Gulfstream Aerospace employs XR so its clients know exactly what their custom jets will look like when delivered.

Lowe’s has been conspicuously innovative in making the benefits of XR available to its customers. For the last four years, powerful new immersive technology design and shopping tools have been brewing in Lowe’s Innovation Labs. Josh Shabtai, the Labs’ Director of Productions and Operations, says he looks for problems that keep resurfacing. Since the introduction of Holoroom How-To in 2014, Lowe’s Innovation Labs has rolled out an impressive suite of mobile apps and pilot projects to gauge customers’ comfort level with XR, including Lowe’s Vision, In-Store Navigation, and View in Your Space.

Lowe’s is trying to solve the classic pain points of home improvement shopping by giving customers the ability to see with the eyes of a contractor or interior designer, determine whether products fit in their space, virtually tile a bathroom, operate a power tool, and more. By focusing on customer problems, Lowe’s has made some of the strongest cases for consumer AR and VR to date. The retailer’s steady flow of practical immersive experiences even landed it at the top of a list of most innovative companies in AR/VR by Fast Company!


With each employee-generated idea, KLM not only gains a potentially transformative technology solution but also primes its workers for the change to digital—there’s no need to convince employees to use solutions they helped conceive of. And with each application, Lowe’s refines the XR tools that future consumers will use to visualize spaces and learn new skills, ideally positioning itself to scale when the time comes, build customer loyalty, and future-proof its business against online competition.

 


 

Image source: Lowe’s via Road to VR

Over 20 Use Cases of Smart Glasses, VR Headsets, and Smartwatches at Airports

If going through airport security is a flyer’s biggest pain, then capacity is the airport manager’s living nightmare: Airports around the world today are hard-pressed to process more passengers and cargo than their terminals were originally designed to manage, and projected air traffic growth indicates no coming relief. Most American airports were built between the 1950s and 1970s. Take Chicago O’Hare International Airport: By 1975, O’Hare was the world’s busiest airport, handling 37 million passengers a year. In 2017, more than double that number – 79.8 million people – traveled through O’Hare, along with 1.9 million tons of cargo!

Capacity issues have led to a multibillion-dollar infrastructure crisis in the airport industry, not to mention low expectations among airlines and airline passengers (airports’ two main customers). It’s not enough for the industry to work on quickly processing travelers and avoiding delays; improvements and solutions are needed for the end-to-end travel journey as well, including the terminal experience and flying between destinations. The pressure is on for airports to invest in new technologies that improve the efficiency of airport processes and reduce service disruptions, thereby allowing passengers to spend less time in queues and more time enjoying airport facilities.


Ideas on the ground and on board:

The airport industry first began toying with wearable technology with the release of the original Google Glass in 2013. Early on, a number of airlines trialed smart glasses at the boarding gate and offered digital boarding passes for consumer smartwatches. More recently, the use of wearable augmented and virtual reality devices by airport and airline technicians to train, perform maintenance, and receive remote support has gained traction. Additionally, the growing popularity of AR and VR in architecture, engineering and construction has implications for the future of airport renovations and new airport design. Other ideas floating around look to a future in which travelers regularly use wearables and even lightweight smart glasses to receive real-time flight notifications, directions to their gate, and pre-flight shopping and dining promotions.

In IATA’s 2017 Global Passenger Survey, 85% of those surveyed indicated they would be willing to give up more personal data in exchange for faster processing and more personalized service at the airport. As consumers become increasingly receptive to sharing wearable-generated biometric data and are exposed to augmented reality via smartphones, ideas like replacing traditional travel documents with personal wearables and implementing AR wayfinding in airports seem less and less far-fetched.

The history of wearable technologies in the airport industry:

From supporting airport ground operations with AR to in-flight VR entertainment, the airport industry has experimented with wearable technologies throughout the travel experience. In fact, airports and airlines gave us some of the earliest – and incredibly imaginative – use cases of Google Glass Explorer Edition, arguably the device that set enterprise wearables in motion. Let’s look back:


Early trials:

Virgin Atlantic’s 2014 trial at London Heathrow Airport – in collaboration with SITA – included both Google Glass and the Sony SmartWatch 2. Staff at the airline’s premium entrance at Heathrow used the devices to view individual passenger information and real-time travel updates. This allowed agents to greet first-class passengers by name, process them quickly for their flights, and provide the most up-to-date travel information. The following year, Virgin partnered with Sony to equip its Heathrow engineers with the Sony SmartWatch 3 and Sony’s SmartEyeglass to test out real-time job notifications and live video streaming to remote expert technicians.

Around the same time, Vueling, Iberia, and Air Berlin launched smartwatch boarding passes for early Pebble and Samsung smartwatches. EasyJet and British Airways followed with Apple Watch apps allowing travelers to receive real-time flight updates and board their planes with just a flick of the wrist.

Japan Airlines made another early attempt to prove Google Glass in maintenance, with airline personnel wearing Glass on the tarmac at Honolulu Airport so that experienced staff at headquarters could inspect planes remotely. Airports got into the game as well, including Copenhagen Airport, where duty managers used Google Glass to document issues and answer travelers’ questions on the spot, and London City Airport, which considered how Glass might be leveraged in its operations. Allegiant Systems, a software company, also developed a proof of concept in which airline staff used Vuzix smart glasses to create a more personalized passenger experience. Scenarios included using the glasses at security, at the gate, and at the door to the aircraft to identify passengers (facial recognition tech) and to view preferences of frequent First-Class fliers in the air.

While these trials made an early splash for wearables, most did not amount to full-blown adoption. This was especially true in the case of smartwatches. Smart glasses did, however, enable workers to keep eye contact and better engage with customers.


Later use cases:

By 2017, the idea of using smart glasses to improve airport processes no longer seemed so futuristic. That year, SITA worked with Helsinki Airport to explore visualizing airport operations with the Microsoft HoloLens. Using the feed from its Day of Operations software (already in use by Helsinki Airport), SITA reproduced the airport operational control center (AOCC) in mixed reality. This made for a new way of visualizing and analyzing the airport’s complex operational data (aircraft movement, retail analytics, etc.) to make decisions. It also allowed remote viewing of the AOCC in real time.

Along with delays, heavy commercial passenger and cargo traffic can produce unexpected changes in an airport’s operations that put the airport’s facilities to the test. Cincinnati/Northern Kentucky International Airport (CVG), which sees 6.7 million passengers a year, turned to wearable technology when quality metrics revealed that the state of the airport’s restrooms had a great impact on traveler satisfaction. In what became a successful use case, CVG installed counting sensors in its restrooms and gave housekeeping staff Samsung Gear S3 smartwatches running Hipaax’s TaskWatch platform. The sensor data helped to better direct staff resources: instead of following a standard cleaning schedule, housekeepers were notified in real time via smartwatch when a nearby restroom required attention.
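The dispatching logic behind a system like this can be quite simple in principle: count events from a sensor and push a task when a threshold is crossed. Below is a minimal sketch of that idea; the threshold, counter behavior, and notification stub are assumptions for illustration, not the actual Hipaax TaskWatch implementation:

```python
# Threshold-based restroom alerting, sketched in the spirit of the CVG example.
# All values and function names here are illustrative assumptions.

VISIT_THRESHOLD = 150  # assumed number of visitors before a cleaning is triggered

visit_counts = {}  # restroom_id -> visitors counted since the last cleaning

def record_visit(restroom_id):
    """Handle one door-counter event; dispatch a cleaning task at the threshold."""
    visit_counts[restroom_id] = visit_counts.get(restroom_id, 0) + 1
    if visit_counts[restroom_id] >= VISIT_THRESHOLD:
        notify_nearest_housekeeper(restroom_id)
        visit_counts[restroom_id] = 0  # reset once a task has been dispatched

def notify_nearest_housekeeper(restroom_id):
    # Placeholder: a real deployment would push the task to a housekeeper's smartwatch.
    print(f"Cleaning task dispatched for restroom {restroom_id}")
```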

Out from behind the scenes in 2017, AR and VR began to make more public appearances in the airport industry. Heathrow Airport worked with Ads Reality to create an augmented reality app for entertaining and distracting children – some of them first-time travelers – during the long wait to board a flight. As an added benefit, tracking the triggering of the AR markers through the airport’s five terminals also tracked foot traffic, revealing busy areas where customer experience could be improved. Qantas Airways was actually the first to introduce VR headsets, partnering with Samsung in 2015 to bring the devices to select first-class cabins and lounges for travelers to virtually experience some of Australia’s greatest attractions (like the Great Barrier Reef). The airline has since released a multi-platform mobile app showing off Australia’s beautiful scenery, with the goal of inspiring consumers to book with Qantas.

Using VR as a sales tool has been popular at other airlines, too, including Lufthansa and KLM Royal Dutch Airlines, which offer VR experiences of destinations and the aircraft itself to encourage seat upgrades. The KLM Flight Upgrader is a VR experience enabling people on budget flights to “pretend” to fly KLM, complete with an in-flight movie, your favorite newspaper, and a virtual meal served by a caring crew. Singapore Airlines, Etihad Airways and Finnair have also experimented with VR to show off their airplanes, cabin classes, and travel destinations. Very recently, Air New Zealand announced a partnership with Magic Leap and Framestore to develop MR content highlighting New Zealand as a travel destination. The airline has also trialed HoloLens for displaying key passenger information like preferred meal choices and emotional state to flight crew, and Google Pixel Buds Bluetooth earphones to help employees with live translation onboard and in the airport terminal.


Most recent

Late 2017 saw larger and more ambitious trials of wearable technologies at airports. Changi Airport, one of the busiest in Asia, announced plans to pilot 600 pairs of smart glasses among its staff to improve the accuracy, efficiency and safety of cargo and luggage handling. Using their cameras to read visual markers and labels on luggage and containers, the glasses project information like loading instructions on top of the user’s real-world view, shortening loading time by as much as 15 minutes. This will create a competitive advantage for Changi’s airline customers, while video streaming will allow real-time monitoring of ramp handling operations.

Hamad International Airport signed a Memorandum of Understanding with SITA, providing a framework to trial biometrics together for seamless identity management across all key passenger touch points at the Doha airport, along with robotics, blockchain, AR and VR.

Though Copenhagen Airport was actually the first to provide an AR wayfinding tool back in 2013, Gatwick Airport installed 2,000 beacons to enable the same in 2017. At Gatwick, through which 45 million people travel every year, passengers can use their smartphones to view AR directions to wherever they need to go. Helping people navigate the airport prevents minor disruptions resulting in late departures and missed flights. It’s also the perfect use case for consumer AR glasses, allowing you to travel heads-up to your gate with your hands just on your luggage.
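Under the hood, beacon-based wayfinding often starts with something as simple as identifying the strongest nearby Bluetooth beacon and looking up its surveyed location. The sketch below shows that basic idea; the beacon IDs, locations, and signal values are invented for illustration and do not describe Gatwick’s actual system:

```python
# Nearest-beacon lookup: a simplified starting point for indoor wayfinding.
# Beacon IDs, locations, and RSSI values are invented examples.

BEACON_LOCATIONS = {
    "beacon-17": "North Terminal, Gate 12 corridor",
    "beacon-42": "South Terminal, security exit",
}

def nearest_beacon(rssi_readings):
    """Return the ID of the beacon with the strongest (least negative) signal."""
    return max(rssi_readings, key=rssi_readings.get)

readings = {"beacon-17": -68, "beacon-42": -81}  # RSSI values in dBm from a phone scan
print("You are near:", BEACON_LOCATIONS[nearest_beacon(readings)])
```

Real deployments refine this with readings from multiple beacons and map-matching, but the nearest-beacon lookup conveys the core idea.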

SkyLights, maker of immersive, cinematic in-flight entertainment (IFE), has content partnerships with the likes of 20th Century Fox, DreamWorks, and BBC. Last year, Air France and Corsair trialed SkyLights’ Bravo Evo VR headset in some business class cabins. In the spring of this year, Emirates and Etihad announced their own trials of the new Allosky headset in select first- and business-class lounges. Japan Airlines and JetFly have also tested the headset, which can store up to 40 high-def films including five VR titles. Such VR entertainment could transform the cabin experience.

In the last few months, both Philippine Airlines and Lufthansa have revealed they’re using VR for training. Lufthansa is just the latest in the aviation world to consider VR for pilot training. The German airline already uses VR to teach flight attendants how to search the aircraft for foreign objects and is now seeking to keep up with the growing attrition rate among its 10,500 pilots. Philippine Airlines is applying the technology to cabin crew training, which, unlike flight simulators, has evolved very little over the years. The first batch of cabin crew trainees to use VR is now being deployed to select aircraft.


Future

Whereas the use cases for wearable technologies in industry – on the construction site, in the factory, etc. – are clear, consumer-facing industries like retail, financial services, and travel are less certain about how to go digital. There’s no shortage of experimentation: In the last five years alone, the airport industry has turned to wearables to make boarding more convenient, improve the in-flight experience, better understand airport operations in order to correlate events and manage staff, speed up flight inspection and turnaround, entice consumers to upgrade their travel, distract those waiting for flights, and more.

Wearable and immersive tech is accelerating across the industry, most recently popping up in air traffic control, and even carving out new revenue streams as in the case of First Airlines, the world’s first virtual airline, based in Ikebukuro. Actual consumer-facing use cases, however, are not really sticking, but one thing has been consistent from trial to trial ever since gate agents for Virgin Atlantic first put on Google Glass: feedback is largely positive—consumers generally support technology that will speed up and simplify the airport experience. IATA’s Global Passenger Survey confirmed this last year. Passengers may not be aware of wearable notifications flying across airport hubs, but they do notice when airline employees look them in the eye, know the answer to all their questions, and predict their beverage choice before the cart reaches their row.

EHS 2.0 with Digital Advancement: How General Electric is Digitizing Safety

Improving worker safety is a mission that never seems to end. Hazards in the workplace are always evolving, as are the gear, tools and methods developed to mitigate risks. Our understanding of safety in the workplace is also evolving: For instance, though it’s hard to quantify, we know that safety has a great impact on productivity. Nevertheless, according to Nationwide, 51% of businesses don’t have an Environmental, Health and Safety (EHS) specialist on staff while 38% don’t offer any formal safety training.

Though wearable technologies, including body-worn sensors, heads-up displays and robotic suits, are being touted as promising safety solutions for industrial workers, it was only two years ago that the U.S. Bureau of Labor Statistics reported the highest number of work-related deaths in nearly a decade. The rise of fatal injuries on the job, however, does not mean that wearables aren’t ready for primetime or that companies aren’t using them. It means organizations are not yet leveraging these technologies to their full capacity as part of a larger, connected and proactive system for safety in the workplace.

There are, in fact, effective wearable safety devices today. According to Sam Murley, EHS Digital Acceleration Leader at GE, General Electric is exploring and deploying them in nearly 40 pilot and deployment programs around the globe: “These are things that can save lives today, the same way insulated gloves and hard hats do…These technologies don’t live in labs; they’re ready to go. The obstacles lie in communicating the value when in place and identifying new stakeholders to help drive broader adoption.” Plenty of GE-league companies are, at the very least, piloting wearable solutions—enough so that for the first time the EWTS 2018 program will devote an entire afternoon track to safety and training case studies. The devices range from simple sensor-embedded bracelets to VR headsets and partial exoskeletons, and cases are springing up across all sectors: In addition to Sam, speakers from retail giant Walmart and multinational brewing company Molson Coors will share first-hand experiences of using wearables to increase safety in their organizations.

A number of factors could explain why wearable safety tech isn’t exactly making waves in enterprise: lack of awareness (a lot of the focus is around AR/VR), the challenges of choosing the right use case and gaining internal support and funding, the complexity of Big Data (translating raw wearable data into actionable safety insights), and even generational differences (Millennial business owners are leading the adoption of connected technologies for safety). While there is a lot of buzz around augmented and virtual reality devices for heads-up information, training and remote support (all of which influence the user’s safety), wearables that track employees’ physical condition and blend into their work attire are less glamorous and less obvious when it comes to showing ROI. Take something like location tracking: a simple GPS-tracking band coupled with geofencing could help keep employees out of known hazard zones, but how do you quantify that in terms of cost savings? More exciting tech like exoskeletons poses the same challenge: if you have 10 fewer injuries than last year after giving exoskeletons to a group of welders, what is the ROI?
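The geofencing check itself is technically trivial; the hard part is putting a dollar value on the incidents it prevents. As a rough sketch of the mechanism (the zone coordinates, radius, and alerting below are assumptions, not any particular vendor’s product):

```python
# Minimal geofence check: flag a worker whose GPS fix falls inside a known hazard zone.
# Zone location, radius, and the alert action are illustrative assumptions.
import math

HAZARD_ZONE = {"lat": 29.7604, "lon": -95.3698, "radius_m": 50}  # assumed zone

def distance_m(lat1, lon1, lat2, lon2):
    """Great-circle (haversine) distance between two lat/lon points, in meters."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def check_position(worker_id, lat, lon):
    """Warn if a worker's GPS fix lands inside the geofenced hazard zone."""
    d = distance_m(lat, lon, HAZARD_ZONE["lat"], HAZARD_ZONE["lon"])
    if d <= HAZARD_ZONE["radius_m"]:
        print(f"ALERT: {worker_id} is inside a restricted zone ({d:.0f} m from its center)")

check_position("worker-007", 29.76041, -95.36982)  # a fix a few meters from the zone center
```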

When asked to give advice to EHS managers just beginning to look at emerging technologies, Sam Murley said, “Know what problems you’re trying to solve and leverage what has been done in the past.” Taking that advice, here are a few recent initiatives at GE that provide not only example use cases but also best practices and a look into the future of wearable and other emerging technologies in EHS:

“In the very near future…we’ll completely digitize the way risks are managed…Workers will have a digital toolkit of wearables at their disposal as required PPE [personal protective equipment] as well as optional tools they’ll use to augment some of their work. As long as it doesn’t over-innovate the user and has data value, EHS in organizations could potentially get to zero.” – Sam Murley, GE


Working with and wearing robots:

Robots are increasingly taking over dangerous and repetitive tasks in the workplace. At GE, the choice between deploying a companion robot with a human worker and augmenting the worker with an exoskeleton comes down to “how hazardous the task is and how long you need the human brain involved in the process.” In the case of the dangerous and dirty job of inspecting a dark chemical storage tank, GE has been testing a 4-foot-long, snake-like robot made by Sarcos Robotics. Equipped with magnetic tracks, ‘Guardian S’ can slither up and down the walls of the storage tank and across the debris- and grime-covered floor, using embedded sensors in its head and tail to perform the inspection and share information with workers outside the tank. There’s no need to stop the operation or have rescue services on standby.

If you’re wondering what happens to the workers relieved of this hazardous task by Guardian S: they become the operators and decision makers or are otherwise reassigned to less dangerous jobs. GE’s interest in robotics is not about replacing humans but rather augmenting them, allowing workers to complete tasks in hazardous, inaccessible, and unstable environments without putting themselves at risk. Not only does Guardian S keep human workers safe; it’s also better and faster at its job. The human-managed technology can even be customized with features like magnets, boom cameras, and ultrasonic thickness sensors to perform tasks in a variety of work environments, from power-generation facilities to oil sites and wind turbines.

Sarcos Robotics also makes a pair of track-mounted robotic arms to help users lift heavy objects and is working on a load-bearing exoskeleton to enhance human strength. GE is very interested in wearable robotics to improve and simplify EHS and increase productivity across its operations. Along with other big companies like Delta and BMW, GE has joined Sarcos’ new Exoskeleton Technical Advisory Group (X-TAG), created to advance exoskeletons in industry. The technology has enormous potential: robotic suits will match human intelligence and improvisation with machine strength and precision. Workers’ physical performance and wellbeing will improve; less manpower will be required to do the same amount of work; and workers’ compensation, healthcare and downtime costs will decrease.


A proactive stance on safety with AI & wearables:

When asked what makes a killer application of new technology at GE, Sam Murley replied, “When you have edge-to-edge systems that can protect the worker directly and push data from the worker and environment back to a system to intervene…Those are killer platforms and there are a few out there that we’re using right now.” GE began piloting such a platform in 2016—specifically, two injury prevention systems by StrongArm Technologies that combine wearables, data analytics and machine learning (AI).

GE workers at several sites worldwide wore ErgoSkeletons (like a cross between a smart harness belt and a backpack) while lifting and carrying heavy loads, performing repetitive tasks, and during highly complex procedures. These passive exoskeletons work by redistributing weight from a central point of the user’s body across stronger areas of the body, or by supporting the arms and legs during overhead work, thereby preventing back, shoulder, arm, and leg injuries while increasing product quality. The exoskeletons can be worn with or without StrongArm’s FUSE ergonomic sensor, which tracks the user’s ergonomic movement through the company’s data analytics software and provides live coaching via haptics for safer posture and physical technique.

In addition to getting workers to perform better and use their full body (relieving strain on the arms and lower back), the solution generates real-time data that can give insights into EHS at GE. With AI, GE managers can isolate problematic ergonomic areas and make preventative changes to the work environment as well as figure out which workers need intervention and training.
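Conceptually, the live-coaching loop reduces to comparing each posture sample against safety thresholds and cueing the wearer when a limit is exceeded. The sketch below illustrates that pattern with invented thresholds and field names; it is not StrongArm’s actual algorithm:

```python
# Illustrative ergonomic check: compare a posture sample against assumed limits
# and trigger a haptic cue when a lift looks risky. Thresholds and field names
# are assumptions, not StrongArm's real scoring model.

MAX_SAFE_BACK_FLEXION_DEG = 45      # assumed limit for forward bending
MAX_SAFE_TWIST_VELOCITY_DEG_S = 60  # assumed limit for torso twisting speed

def assess_lift(sample):
    """Return True and cue the wearer if a posture sample exceeds a limit."""
    risky = (sample["back_flexion_deg"] > MAX_SAFE_BACK_FLEXION_DEG
             or sample["twist_velocity_deg_s"] > MAX_SAFE_TWIST_VELOCITY_DEG_S)
    if risky:
        trigger_haptic_alert()
    return risky

def trigger_haptic_alert():
    # Placeholder for the wearable's vibration motor.
    print("Haptic cue: bend at the knees and pivot instead of twisting")

assess_lift({"back_flexion_deg": 62, "twist_velocity_deg_s": 15})  # a risky bend
```

Aggregating flags like these over time is what lets managers spot problem areas and target training, as described above.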


According to IBM and Cisco, 2.5 quintillion bytes of data are created every day, and most of it is never captured, analyzed or used. Wearable technology can provide gigs and gigs of safety-related data but if that data lives in a vacuum, it’s wasted: “I think the most successful technology gives you immediate feedback while measuring some activity in the human body or environment and tying it back into a decision-making platform.” – Sam Murley, GE   

GE is taking a well-rounded digital approach to EHS, using wearable and other emerging technologies to digitize safety. Beyond robotic enhancements and ergonomic sensors, heads-up displays, VR headsets, lone worker management devices, hazard-sensing bands, and even drones are presenting EHS pros with new ways to protect and empower workers, make training more effective, reduce injury and costs, and enable data-driven decision making on both a micro and macro level.

*For more expert insight on how GE is finding solutions, setting up pilots and working through deployment issues, read our full interview with EHS Digital Acceleration Leader and EWTS 2018 presenter Sam Murley here.

 


 

*Image source: Sarcos Robotics

Making Your Next Flight Safer and Smoother with Wearable AR+VR

From building the actual plane to the in-flight experience, wearable XR (AR, VR, MR) devices have a role to play in multiple professions within the commercial aviation industry. Employees whose jobs affect every aspect of a trip, from aircraft maintenance workers to flight crew, can make use of wearable XR technologies to ensure the end goal: a safe and satisfied traveler. Find out how XR might be used on the ground and in the air when you go on your next business trip or vacation:


On the Ground: AR for Assembly

Both Airbus and Boeing employ augmented reality (AR) glasses in the aircraft assembly process. Airbus workers follow plans directly in their field of view, superimposed on the plane’s interior during cabin installation. They use the same solution to check the accuracy and quality of their work (image recognition technology and artificial intelligence at work), while Boeing employees use smart glasses to view a heads-up, hands-free roadmap for wire harness assembly over their real-world view. In each case, AR functions to form a stronger connection for the user between textual or diagrammatic instructions and the real working environment.

Using AR glasses with software by Upskill helped Boeing save tens of millions of dollars, but it’s not all about money: by helping employees work faster without error, aircraft manufacturers can deliver defect-free planes to customers quicker. Airlines and other buyers thus receive faster-built, higher-quality aircraft and parts that break down less often. Aircraft and parts engineers can also use AR and VR devices to collaborate on new designs from anywhere in the world, sharing and testing ideas and even simulating the assembly or installation process to foresee issues. New XR platforms are only making this collaboration easier.


VR for Training

After assembly comes maintenance: It can take up to eight years to train and license an aviation maintenance professional. This includes aircraft OEM mechanics and airline technicians who perform safety checks, prepare aircraft components for flight, make repairs, and more. While accessing real aviation equipment for hands-on training is costly and difficult, in VR trainees can practice skills in a realistic, accident-proof immersive environment with virtual parts and tools. For instance, a mechanic wearing a VR headset could walk inside an engine and examine its parts as well as simulate different repair scenarios. With advanced audio and haptics (like a haptic suit), the trainee could even hear the noise and feel the motion of the engine, better preparing him for the real thing.

A recent study at the University of Maryland found that people actually learn and retain information better through immersive experiences compared to using a computer or tablet. Enterprises are also finding VR to be superior to reading a manual, watching videos, or taking a lecture-style class. While not an example of full immersion, Japan Airlines used Microsoft’s HoloLens to improve training for its engine mechanics—in place of physical handouts, trainees learned all the engine components by working on a virtual engine in mixed reality.

Learning by doing with AR is effective and cost-saving for training, as well. Aviation maintenance workers can learn on the job without risk of error by using heads-up, hands-free smart glasses to view fool-proof text and visual aids over their work. The technology can even validate each step of an inspection or repair to prevent errors. Static instructions can become interactive, with virtual arrows and labels appearing on top of real-life aircraft equipment, showing the user where parts and tools should go. The result: faster training without sacrificing accuracy or quality, which means quicker maintenance, fewer flight delays, and happier travelers.

Once the engine has been overhauled, the plane is ready for service. Expensive and logistically challenging, pilot training is another opportunity for VR. In recent years, the burden of paying for flight school has fallen onto pilots themselves. The $60,000-$80,000 price tag explains why flight school enrollment has fallen in the U.S., leading to a growing shortage of trained pilots not all that unlike the troubling shortage of skilled workers in other industries. CAE forecasts that over 255,000 pilots will be needed in the global commercial aviation industry by 2027, yet fewer than half that number have even begun training. Some carriers and manufacturers are making efforts by sponsoring aspiring aviators or expanding their flight training services, but the cost and time required are still too great.

For industries with large, complex and expensive equipment like aviation, VR offers the closest thing to hands-on training. Virtual reality, capable of simulating almost every aspect of flying, feels more real than many current flight simulators (essentially stripped airplane cockpits with screens for windows) and is adaptable to all kinds of scenarios. Rookie pilots can walk around the cockpit, interact with the plane’s controls, and even practice an emergency landing, with tactile feedback to increase the sense of realness and help build muscle memory. VR is already finding its way into flight training programs: Airbus, for one, has been able to reduce training time and train more people in limited space by using VR to supplement training in real aircraft, while Future Visual created a simulation for Oculus that takes student pilots through the entire pre-flight process. And VR isn’t just for ground crew and pilots; cabin crew and even airport staff training could incorporate immersive tech, as well.


In the Air: AR for Guidance

The length of runway required for a standard aircraft to get off the ground can be calculated, but what if there are unexpected failures? What if the engines aren’t working to full capacity or the takeoff field is wet? Will the aircraft still reach the required speed for takeoff? According to Boeing, 13% of fatal aircraft accidents occur during takeoff. In fact, pilot errors, not maintenance failures, are responsible for the vast majority of all aviation accidents. This isn’t surprising considering it’s largely left to the pilot’s subjective opinion to determine a response when something goes wrong.
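For a feel of the baseline calculation, a textbook constant-acceleration estimate puts the ground roll at roughly v²/(2a), where v is the rotation speed and a is the average acceleration; real takeoff performance calculations also account for weight, wind, runway slope and surface, temperature, and engine-out cases. A toy example with illustrative numbers:

```python
# Idealized takeoff-roll estimate: distance to reach rotation speed at constant
# acceleration, s = v^2 / (2a). Numbers below are illustrative, not type-specific data.

def takeoff_roll_m(rotation_speed_ms, accel_ms2):
    """Ground run (meters) to reach rotation speed, assuming constant acceleration."""
    return rotation_speed_ms ** 2 / (2 * accel_ms2)

# Rotating at ~75 m/s (about 145 knots) with an average acceleration of 2.0 m/s^2:
print(f"About {takeoff_roll_m(75, 2.0):.0f} m of runway needed (idealized)")
```

A wet runway or an under-performing engine lowers the achievable acceleration, which is exactly why the margin questions raised above matter.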

The problem lies in how information is presented to the pilot inside the cockpit. It’s hard to focus on flying when you have to read and quickly analyze the text on a bunch of small instruments and screens all around you. AR technology can display this information in a more intuitive format. For instance, with smart glasses, information like pre-flight checklists, step-by-step instructions, current weather and air traffic information, even a 3D graphic of the takeoff path can appear overlaid in a pilot’s vision before takeoff. Aero Glass actually has a solution that displays flight path and instrument data to small-aircraft pilots wearing smart glasses. The same cockpit information a pilot might get using physical controls and touch screens can be retrieved instead by voice command; and when a snap decision needs to be made during a flight, AI technology can pick out the most relevant information to display to the pilot.


XR in Flight Service?

The benefits of integrating AR glasses and VR headsets into aircraft assembly and technician training are tangible today, but at this point airlines have merely proposed ideas for using XR in the air without seriously investing. This is probably due to the consumer-facing nature of the in-flight experience. Providing flight attendants with smart glasses to interact with passengers or offering VR headsets as in-flight entertainment are not critical use cases like the need to quickly train thousands of new pilots. Moreover, the timeline for mainstream consumer use of AR and VR is still unclear.

XR hasn’t yet transformed the experience of flying, but some airlines are considering it. Air New Zealand, for example, had its crew members try out HoloLens to expedite and provide more tailored customer service during the flight. To cater to individual passengers, flight attendants might access their flight details (to help make connections), food allergies (to personalize meals), even their emotional state (facial recognition tech). Air France trialed VR headsets for immersive in-flight entertainment, and though not in the air, Lufthansa has used VR to sell upgrades to premium class right at the gate. Who knows? Maybe one day those safety instructions in your seat pocket will be replaced by a virtual reality video. In the meantime, rest assured that XR technologies are improving aviation operations behind the scenes, from the hangar to the cockpit.

 


Driving Ahead: Car Companies Using XR to Adapt in a Post-Uber World

My last blog post built upon Uber’s wrecking ball-style entrance into the cab industry. Less obvious is Uber’s impact in the automotive sector, where the app is creating waves for car manufacturers. Ride-sharing is just one of the trends forcing the auto industry to transform. In fact, some industry observers believe automotive is about to have its most dramatic revolution since Henry Ford’s time.


State of the Auto Industry

Changing Attitudes Towards Vehicle Ownership and Declining Sales

Private car ownership is becoming less and less necessary, practical and desirable in many cities around the world thanks to the rising costs of urban living, civic measures to discourage car use, worsening traffic and lack of parking, and always-available services like Uber and Lyft.

More car ownership trends: As cars have become more reliable, people are holding onto them for longer or opting for used cars. Delayed by student debt and economic uncertainty, young professionals aren’t moving to the suburbs like their parents did; and younger Americans simply prefer ordering a car via app to owning one—all reasons why vehicle sales declined in 2017 for the first time in years. This downward trend will likely continue; for while ride-hailing makes owning a car unnecessary, in a future with self-driving vehicles people won’t even need to know how to drive.


Ride-sharing and the New Car Buyers

Uber has forever changed how we get around, but why is this problematic for automakers? It’s not like ride-hailing is making cars obsolete. The issue is Uber’s impact on consumer behavior. Automobile manufacturers have been marketing new vehicle designs and features to customer types that are pulling away from car buying (for now). Take the new driver: Learning to drive has traditionally been a rite of passage for suburban teens, but far fewer millennials have driver’s licenses today compared to older generations. So, what will the future of car ownership look like?

In the future, ride-sharing companies and contractors – less discriminating than traditional car buyers – may very well be the auto industry’s top clientele, and vehicles may become increasingly homogenized as a result. Though fleets of robocabs are still far off, car culture is changing: personal cars don’t have the same social status they used to, and ride-sharing vehicles are invading city streets. Automotive companies must adapt to the social change brought by new mobility services.


The Race to Get Connected and Achieve Autonomy

On top of the classic goals of reducing costs, improving fuel efficiency, increasing sales, etc.; auto companies today are competing to redefine consumers’ relationship with cars and invent the future of driving. They’re designing ever-more futuristic vehicles – battery-powered, self-configuring, able to track the driver’s health and predict maintenance – and investing in the technology to build them: Cloud infrastructure software and analytics, artificial intelligence, mapping systems, plus the talent and expertise to go with these and other bleeding-edge technologies.


Autonomous vehicles may eventually boost private car ownership; but while companies race to develop the first commercially viable self-driving car platform, today’s drivers want better, smarter dealership and driving experiences. As the level of technological convenience and control in their lives increases, consumers expect more of every product and service offered to them. And though ride-sharing and the promise of self-driving vehicles in the next five years threaten to upend the entire model of car ownership, automakers cannot afford to neglect regular drivers. They need to continue to make and sell new cars, delivering semi-autonomous and connected driving upgrades and revamping the car buying process to lure people into dealerships and keep them in the brand.


Getting Ahead with XR: Ford, Volkswagen and Porsche

The automotive sector has arguably been the most aggressive of any industry in adopting wearable tech. Auto companies have had the most success implementing exoskeletons, and they’re exploring Augmented and Virtual Reality in multiple areas of the automotive business. Read how Ford, Volkswagen and Porsche are using XR to advance their operations, improve the customer experience and bolster their brands amidst unprecedented change in the auto industry:


Ford

In addition to providing assembly line workers with upper body exoskeletons to reduce the physical toll of repetitive overhead tasks, Ford has been working to develop VR platforms for both its customers and designers.

Last year, after an initial pilot phase at its Design Studio in Cologne, the auto giant expanded its use of Microsoft’s HoloLens. The technology enables Ford designers and engineers to work together more effectively on confidential designs and quickly model out changes to vehicles, viewing those changes on top of a real car instead of relying on the time-consuming and expensive clay model approach. Ford hasn’t entirely abandoned clay models, but with Mixed Reality designers don’t have to build out a new clay prototype after every design decision; they can simply augment the 3D model.

At Ford, Mixed Reality is proving to be a boon to innovation, collaboration, and time to market—improvements that will aid the American auto brand’s efforts to reimagine vehicles, deliver a better in-vehicle experience, and differentiate itself through design. Beyond vehicle design, Ford envisions consumers using AR/VR headsets at home to customize cars and create their own virtual test drive experiences; and Ford dealers using state-of-the-art holographic display cars to more effectively utilize showroom space.

(^Elizabeth Baron, Technical Specialist in VR and Advanced Visualization at Ford, will speak at EWTS 2018.)


Volkswagen

In Fall 2017, Volkswagen established a Digital Realities team encompassing 12 of its brands across 120 sites around the world, along with a Digital Reality Hub to enable long-distance collaboration among team members. The German automaker had been experimenting with HoloLens at its Virtual Engineering Lab in Wolfsburg, projecting designs onto a scale model of a VW Golf and exploring how to apply the technology to technical development.

From these efforts came the Digital Reality Hub, which combines multiple group VR applications and tools into one platform, allowing designers and engineers all over the world to work on the same project simultaneously, exchange and test ideas, and even participate in virtual workshops. In addition to new vehicle models, real locations like factory production lines can be modeled in the virtual environment to trial optimization measures without the need for site visits.

It cannot be overstated how much XR impacts productivity or how critical an efficient network among Volkswagen’s global brands will be to the company’s success in the next phase of the auto industry. Most recently, VW teamed up with VR studio Innoactive to create more than 30 VR training scenarios for the HTC Vive Pro. The automaker plans to train 10,000 employees in production and logistics this year using Virtual Reality.


Porsche

In November, Porsche introduced the “Tech Live Look” Augmented Reality solution for dealerships, which consists of Atheer’s AiR Enterprise software platform running on smart glasses. Wearing the glasses, an L.A.-based service technician can connect with Porsche’s technical support team over 2,000 miles away in Atlanta and receive remote expert help in identifying and resolving technical issues. The remote expert can take screenshots of the tech’s view or project instructions into her field of view while she works—far more efficient than an email or phone call.

In a July 2017 pilot program across eight dealerships, the “see-what-I-see” technology helped decrease service resolution time by up to 40%. Not only is this the kind of quick turnaround service consumers are coming to expect, but when the solution launches this year it will be a real differentiator for the luxury car brand. Again, as the technology inside vehicles gets more advanced and as companies like Porsche transition from the mentality of car as a product to vehicle as an experience, the capabilities offered by XR – better communication, productivity, visualization, decision making, problem solving and customer experience – become more significant.

(^Heather Turney, Culture and Innovation Manager at Porsche, will speak at EWTS 2018.)


With all the disruption caused by new alternatives to vehicle ownership, new energy options, 3D printing of auto parts, AI, self-driving tech, etc., it’s more important than ever for automakers to optimize operations, automate assembly lines, engage consumers, and prepare the workforce for more complex manufacturing and IT-heavy jobs. One major step is to adopt XR as a standard tool for design, training, production and customer service. After all, how can you expect to build the future if your factory and workforce are still in the past? How can you invent the future if it takes days and weeks to collaborate and review designs? And how can you sell the future if consumers aren’t excited about it?

 

The 5th Annual Enterprise Wearable Technology Summit 2018, the leading event for enterprise wearables, will take place October 9-10, 2018 at The Fairmont in Austin, TX. EWTS is where enterprises go to innovate with the latest in wearable tech, including heads-up displays, AR/VR/MR, body- and wrist-worn devices, and even exoskeletons. For details, early confirmed speakers and preliminary agenda, please stay tuned to the conference website.


Augmented World Expo (AWE), the world’s largest conference and expo dedicated to Augmented and Virtual Reality, is taking place May 30-June 1, 2018 in Santa Clara, CA. Now in its 9th year, AWE USA is the destination for CXOs, designers, developers, creative agencies, futurists, analysts, investors and top press to learn, inspire, partner and experience first-hand the most exciting industry of our times.

photo credit: Pittou2 Salon de L’auto Tesla Model S via photopin (license)

How Wearables are Contributing to the IoT

The Internet of Things is a network of (Internet-) connected objects that have been made “smart” with embedded sensors. These devices collect data about the physical world around us, about processes and the health of both people and machinery—data which is then interpreted and shared with other machines and people via cloud-based software platforms. Accenture calls it “a universe of linked devices, services and people.”

Cloud computing, edge processing, big data analytics, artificial intelligence—these are the technologies that meaningfully connect all our billions of things, forming a complete IoT solution. An IoT solution is a system of connected devices and technologies used to do something like predictive maintenance, where the failure of a piece of equipment can be predicted and the right person automatically sent to repair it before a breakdown occurs.

In enterprise, IoT solutions can boost productivity, streamline business operations, cut operating costs, generate new revenue streams, and deliver better customer experiences. How? Here is Microsoft’s action plan for transforming an organization with IoT:

  • Build things: IoT begins with your things, with strategically adding sensors to devices
  • Control things: Deploy IoT solutions that automatically control, monitor and manage your things (your assets), allowing you to capture real-time data about them
  • Analyze: Apply advanced analytics to convert that raw IoT data into new business insights
    • Traditionally, the tremendous amount of data from machines and legacy systems in the enterprise has been siloed, existing unfiltered in separate stacks. Data analytics can connect individual sets of data, helping companies derive actionable insights, close gaps in efficiency, and even develop new business models, products and services.
  • Act: Transform those insights into action, to real changes and improvements in your organization
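
To make the Build, Control, Analyze and Act steps above concrete, here is a minimal, hypothetical sketch in Python of that loop for a single asset type. The names (SensorReading, analyze, act) and the vibration threshold are illustrative assumptions, not part of Microsoft’s or any vendor’s platform.

```python
# Hypothetical sketch of the Build -> Control -> Analyze -> Act loop.
# Names and the vibration threshold are illustrative, not a real IoT API.
from dataclasses import dataclass

MAX_VIBRATION_MM_S = 7.1  # assumed alarm threshold for this example


@dataclass
class SensorReading:          # "Build": a thing made smart with a sensor
    asset_id: str
    vibration_mm_s: float     # vibration velocity reported by the sensor


def analyze(readings):
    """'Analyze': flag assets whose vibration exceeds the threshold."""
    return [r.asset_id for r in readings if r.vibration_mm_s > MAX_VIBRATION_MM_S]


def act(at_risk_assets):
    """'Act': turn the insight into a maintenance task for a technician."""
    for asset_id in at_risk_assets:
        print(f"Work order created: inspect {asset_id} before failure occurs")


# "Control": real-time data captured from monitored things
readings = [SensorReading("pump-07", 3.2), SensorReading("pump-12", 9.8)]
act(analyze(readings))   # -> Work order created: inspect pump-12 ...
```

In a real deployment the analytics would run in a cloud or edge service, and the “act” step would create a work order or push a notification to a technician’s wearable rather than print to a console.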

So, where do wearables come into play? In Build and Act. Wearables are hands-free tools for collecting data about people (workers, customers) and their surroundings (environmental factors, machines), and for sending information and insights gleaned from sensor data to human actors (workers on the ground). Wearables make IoT insights mobile, putting information in the hands of the doers – or before their eyes via smart glasses – in real time, on the spot, and at the source of the opportunity.

The Internet of Things makes concepts like the connected workplace/workforce, smart supply chain, remote monitoring, and predictive maintenance a reality. Syncing data from various sources to optimize and streamline equipment performance and human activities has applications in many sectors—on production lines, in the field, and on the jobsite. But let’s begin above the clouds, looking at a few of the ways IoT technologies will make Aviation and Aerospace smarter:


In every organization, there is (hu)man, nature and machine: The workforce, the working environment, and the equipment and tools used to carry out the company’s services. In the aviation industry, you have workers in factories building and servicing aircraft, in airports getting passengers and luggage onto planes, and in the air piloting and taking care of travelers.

Take a Boeing 747 and outfit every single component of the plane (engines, flaps, landing gear) with sensors providing real-time performance data. If there were a problem mid-flight – say with one of the engines – that information would be immediately relayed to ground crew (possibly via a wearable device) so that when the plane lands, airport engineers can be waiting with the right parts and tools to deal with the issue. One of those tools might be smart Augmented Reality glasses, with which the engineers pull up schematics and other information for the specific engine in question right in their field of view, overlaid on top of the machine as they inspect and repair it. Should they need help troubleshooting or require information about a part, the service technicians could use the same glasses to get immediate assistance from a remote expert or parts supplier. (In this scenario, smart glasses were probably also used to guide factory workers during the initial assembly of the plane, helping to reduce manufacturing errors that lead to issues in the air.)
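
To illustrate the hand-off described above, here is a small, hypothetical sketch of how an out-of-limit engine reading might be packaged and pushed to a ground crew member’s wearable. The message fields, flight number and notify_wearable() function are invented for the example and do not reflect any airline’s or device vendor’s actual API.

```python
# Illustrative only: packaging an in-flight engine alert for a ground crew
# member's wearable. All identifiers and fields are hypothetical.
import json
from datetime import datetime, timezone


def build_engine_alert(flight, engine_id, egt_c, egt_limit_c):
    """Package an out-of-limit exhaust gas temperature reading as an alert."""
    return {
        "flight": flight,
        "engine": engine_id,
        "issue": "EGT above limit",
        "reading_c": egt_c,
        "limit_c": egt_limit_c,
        "time_utc": datetime.now(timezone.utc).isoformat(),
        "action": "Stage borescope kit and spare igniters at arrival gate",
    }


def notify_wearable(device_id, alert):
    # Stand-in for pushing the alert to a smartwatch or smart glasses display.
    print(f"-> {device_id}: {json.dumps(alert, indent=2)}")


alert = build_engine_alert("BA117", "ENG2", egt_c=985, egt_limit_c=950)
notify_wearable("ground-crew-glasses-04", alert)
```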

Airlines like Virgin Atlantic are implementing similar efficiency-boosting IoT solutions to enable faster plane turnaround. Some are also tracking maintenance equipment so their engineers always know where a machine or tool they need is located, which also helps with faster turnaround. And faster turnaround means more reliable, on-time flights, which means improved customer satisfaction.

Locating objects in real time seems pretty simple and obvious, right? We think of the aviation industry as incredibly high-tech, so the inefficiencies companies in this sector deal with can be surprising. But the tracking/monitoring doesn’t have to stop with repair tools or the plane’s functioning during a flight—there are things in the airport and aviation data from different systems that can be sensor-ized and integrated to improve aircraft performance, airline operations and the passenger experience.

As described above, using sensors to continuously monitor aircraft components allows for the detection of maintenance problems in the air that can be addressed immediately upon landing. Real-time tracking reduces the need for unscheduled maintenance and its associated costs, as maintenance systems can be regulated and airline carriers can properly plan for downtime. But a complete IoT solution enables even more visibility and proactivity: Data analytics can comb through sensor information, hunting for anomalies in engineering systems, and incorporate data from outside systems (ex. air traffic control, route restrictions, weather, fuel use, on-the-ground operations, etc.), giving a better overall picture of a flight beyond a single part that needs replacement or repair.
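
As a toy illustration of that anomaly hunting, the sketch below flags readings that deviate sharply from a parameter’s recent history. The sample values and the three-sigma threshold are made up; a production system would fuse many more data sources (weather, routes, fuel burn) than this.

```python
# Minimal anomaly check: flag readings far outside a parameter's recent range.
# Data and threshold are invented for illustration.
from statistics import mean, stdev


def find_anomalies(history, new_readings, z_threshold=3.0):
    """Return readings more than z_threshold standard deviations from the mean."""
    mu, sigma = mean(history), stdev(history)
    return [x for x in new_readings if sigma and abs(x - mu) / sigma > z_threshold]


# Recent oil pressure readings (psi) vs. the latest samples from a flight
history = [54.8, 55.1, 54.9, 55.3, 55.0, 54.7, 55.2]
latest = [55.1, 54.9, 48.2]   # the last value is suspiciously low

print(find_anomalies(history, latest))   # -> [48.2]
```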

In addition to real-time aircraft information, connecting sensors throughout an airport – as London City Airport has done – can help ensure flights take off and land on time. Sensors can be used to track passenger flow and behavior through the airport, to track luggage from the terminal to the plane, and to monitor employee activity. IoT combines and assesses all this data, sending insights to the right people on the ground or in the air who can take action and prevent delays. The right person might be the airport staff member closest to a customer problem at check-in or to a gate where additional employees are needed to speed up boarding; or it might be a customer—delivering the most up-to-date flight information and navigation assistance to help travelers get through security and to their gates on time via their mobile devices.

The end goal is seamless plane turnaround, good customer service, and, of course, safety. Wearable devices can convey critical aircraft performance and flight information to airline and airport employees in the air and on the ground. They do this in a hands-free and often heads-up manner. Whether viewed through smart glasses, on smartwatches or even smart uniforms, the information is highly accessible (glanceable), allowing for immediate comprehension, action and response.

An IoT wearable in aviation can be as “ordinary” as the uniform worn by EasyJet cabin crew and ground staff—essentially a jacket “made smart” with sensors, lights, scrolling tickers and built-in microphones to provide visual guidance and basic flight info to passengers and to enable direct communication with pilots and fellow crew members. The mics can also be used to get expert assistance in diagnosing technical issues, while an air quality sensor and barometer turn employees into tools for monitoring the work environment. A range of wearable form factors equipped with various sensors can collect biometric and environmental data to keep aviation workers safe—one example would be using a fatigue-detecting device to ensure pilots are alert and flying at their best. So in addition to providing convenient information, wearables also contribute to the Internet of Things in aviation by collecting data on people and places.


No matter your industry, the first step for an enterprise looking to make its operations smarter is to identify pain points. Start with the problem, not the technology. With IoT technologies, we have a greater ability to collect data on every asset and process and “connect the dots,” helping to automate tasks, make factors that impact operations more visible, optimize employee management, provide just-in-time and even preventative information, and more. So, what is the problem? Where in your business do you waste the most time, lose the most money, experience the most errors? What is the source of inefficiency and what information do you lack? Then start applying technologies, using the Internet of Things to see, learn, act and improve.  

photo credit: Pai Shih Jet Engine via photopin (license)

Just in Time: AR/VR Spark a Digital Renaissance in Aviation and Aerospace

About 20 years ago, Boeing, the world’s largest aerospace company, identified the need for a hands-free, heads-up technology in its operations. Flash forward to 2014, when a device fitting this vision (Google Glass) finally appeared on the scene. Today, the aviation and aerospace industries are experiencing a digital renaissance, and the timing is critical for several reasons:

Demand is high

Demand is being driven by two factors: 1) Rapidly aging fleets that need to be replaced or maintained at great cost; and 2) New, more technologically advanced aircraft needed to stay competitive. (Boeing, for one, has a backlog of some 5,000 planes it is under contract to build.) Next-generation aircraft boast features like advanced avionics, noise reduction capabilities, improved interior cabin designs, and greater fuel efficiency. Aviation and aerospace companies are under pressure to ramp up production to replace customers’ older fleets and supply them with state-of-the-art vehicles. And, of course, as demand for new aircraft rises so too does the need to operate and maintain those crafts.

A talent gap is creating a need for fast, low-cost training

As in pretty much all manufacturing sectors, the aviation and aerospace industries are dealing with a skilled labor crunch as experienced workers retire and leave the workforce, taking their careers’ worth of knowledge with them. By some estimates, the aerospace industry will need to attract and train nearly 700,000 new maintenance technicians alone by the year 2035. More jobs are being created and more baby boomers are retiring than new workers can fill or replace. Aerospace manufacturers and suppliers are therefore looking for innovative technologies to maximize the productivity of their existing workforces and quickly onboard new workers.

The stakes are high: Operations are complex, downtime is costly, safety is crucial, and the market is competitive

Building aircraft (commercial airplanes, military jets, spacecraft, etc.) and the engines and propulsion units that drive them involves extremely complex processes in which thousands of moving parts are assembled in precise order, carefully inspected, and maintained for years. Speed is desirable to meet demand and for competitive advantage, yet there can be no compromise or negligence when it comes to accuracy and safety—after all, we’re talking about aircraft that transport hundreds of passengers across oceans or even dodge enemy missiles at over 1,000 mph. Boeing, Airbus, Lockheed Martin and other large firms are all vying to sell to the U.S. Department of Defense, NASA and large airlines (the aviation, aerospace and defense industries’ biggest U.S. customers), so errors and downtime are, of course, expensive and bad for business, and can also greatly affect human lives.


To accelerate production, close the talent gap, reduce errors, limit downtime, and improve safety, the leading aviation and aerospace companies are employing wearable technology, especially smart (Augmented Reality) glasses. In general, smart glasses are good for complex industrial processes that are very hands-on, time-consuming, error-prone, and loaded with information—processes like wiring an electrical system or installing the cabin of an airplane. AR glasses and VR headsets are proving useful in aircraft assembly, quality and safety inspection, field maintenance and repair, and training. The technology is providing aviation and aerospace workers with instant, hands-free access to critical information, and reducing training requirements for technicians and operators alike. Here’s how some of the aerospace giants are applying wearable tech in their operations:

Airbus

In 2015, the French aerospace company teamed up with Accenture on a proof of concept in which technicians at Airbus’ Toulouse plant used industrial-grade smart glasses to reduce the complexity of the cabin furnishing process on the A330 final assembly line, decreasing the time required to complete the task and improving accuracy.

Sans smart glasses, operators would have to go by complex drawings to mark the position of seats and other fittings on the cabin floor. With Augmented Reality, a task that required several people over several days can be completed by a single worker in a matter of hours, with millimeter precision and zero errors.

Airbus went ahead with this application: Technicians today use Vuzix smart glasses to bring up individual cabin plans, customization information and other AR items over their view of the cabin marking zone. The solution also validates each mark that is made, checking for accuracy and quality. The aerospace giant is looking to expand its use of smart glasses to other aircraft assembly lines (ex. in mounting flight equipment on the No. 2 A330neo) and other Airbus divisions.

Boeing

Every Boeing plane contains thousands of wires that connect its different electrical systems. Workers construct large portions of this wiring – “wire harnesses” – at a time, a seemingly monumental task demanding intense concentration. For years, they worked off PDF-based assembly instructions on laptops to locate the right wires and connect them in the right sequence. This meant constantly shifting one’s hands and attention between the harness being wired and the “roadmap” on the computer screen.

In 2016, Boeing carried out a Google Glass pilot with Upskill (then APX Labs), in which the company saw a 25% improvement in performance in wire harness assembly. Today, the company is using smart glasses powered by Upskill’s Skylight platform to deliver heads-up, hands-free instructions to wire harness workers in real time, helping them work faster with an error rate of nearly zero. Technicians use gesture and voice commands to view the assembly roadmap for each order in their smart glasses display, access instructional videos, and receive remote expert assistance.

Boeing believes the technology could be used anywhere its workers rely on paper instructions, helping the company deliver planes faster. AR/VR are also significantly cutting training times and assisting with product development. For instance, HoloLens is proving useful in the development of Starliner, the crew capsule Boeing is building to transport astronauts to the ISS.

Boeing’s Brian Laughlin will lead a thought-provoking closing brainstorm on Day One of EWTS Fall 2017

GE Aviation

General Electric is using Augmented Reality and other IoT technologies in multiple areas of its far-ranging operations. At GE Aviation, mechanics recently tested a solution consisting of Upskill’s AR platform on Glass Enterprise Edition and a connected (WiFi-enabled) torque wrench.

The pilot involved 15 mechanics at GE Aviation’s Cincinnati manufacturing facility, each receiving step-by-step instructions and guiding visuals via Glass during routine engine assembly and maintenance tasks. At any step requiring the use of the smart wrench, the Skylight solution ensured the worker tightened the bolt properly, automatically verifying and recording every torqued nut in real time.
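
For a sense of how such a verification step might work in principle, here is a small, hypothetical sketch that checks a connected wrench’s reported torque against a specification and records the result. The bolt IDs, spec values and function names are invented and are not Upskill Skylight code.

```python
# Hypothetical illustration of torque verification with a connected wrench.
# Spec values and names are invented; not an actual Skylight or GE API.
from datetime import datetime, timezone

TORQUE_SPEC_NM = {"bolt-B12": (45.0, 50.0)}   # assumed min/max torque per bolt


def verify_torque(bolt_id, measured_nm, log):
    """Check the wrench's reading against spec and record the result."""
    low, high = TORQUE_SPEC_NM[bolt_id]
    ok = low <= measured_nm <= high
    log.append({
        "bolt": bolt_id,
        "torque_nm": measured_nm,
        "in_spec": ok,
        "time_utc": datetime.now(timezone.utc).isoformat(),
    })
    # The result would be shown in the mechanic's heads-up display.
    return "Step complete" if ok else f"Re-torque {bolt_id}: {low}-{high} Nm required"


records = []
print(verify_torque("bolt-B12", 47.3, records))  # -> Step complete
print(verify_torque("bolt-B12", 41.0, records))  # -> Re-torque bolt-B12 ...
```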

GE Aviation mechanics normally use paper- or computer-based instructions for tasks, and have to walk away from the job whenever they need to document their work. With smart glasses, workers were 8-12% more efficient, able to follow instructions in their line of sight and automatically document steps thanks to the device’s built-in camera. And reducing errors in assembly and maintenance saves GE and its customers millions of dollars.

Lockheed Martin

In early 2015, it came out that Lockheed Martin was trialing the Epson Moverio BT-200 glasses with partner NGRAIN to provide real-time visuals to its engineers during assembly of the company’s F-35 fighter jets and to ensure every component is installed in the right place. Previously, only a team of experienced technicians could do the job; with Augmented Reality, an engineer with little training can follow renderings with part numbers and ordered instructions, displayed as overlay images through his or her smart glasses right on the plane being built.

In the trial, Lockheed engineers were able to work 30% faster and with 96% accuracy. Those workers were learning by doing on the job as opposed to training in a classroom environment, which amounted to less time and cost for training. And although increased accuracy means fewer repairs, the AR solution could be used to speed up the repair process, too, from days to just hours, with one engineer annotating another’s field of view. At the time, however, Lockheed acknowledged that getting the technology onto actual (secured) military bases would be difficult.

Lockheed is also interested in Virtual Reality, seeing AR/VR as key to lowering acquisition costs (all costs from the design/construction phase of a ship to when the vessel is decommissioned). The company is applying VR to the design of radar systems for navy ships. The challenge lies in integrating the radar system with a ship’s other systems, which requires very precise installation. VR can help identify errors and issues during the design stage and prevent expensive corrections.

Using HTC Vive headsets, engineers can virtually walk through digital mock-ups of a ship’s control rooms and assess things like accessibility to equipment and lighting. Lockheed is also using Microsoft’s HoloLens to assist young naval engineers with maintenance tasks at sea—much more effective than a dense manual.

*Learn more about this application from Richard Rabbitz of Lockheed Martin Rotary Mission Systems (RMS) at EWTS Fall ‘17

Lockheed is allegedly saving $10 million a year from its use of AR/VR in the production line of its space assets as well, by using devices like the Oculus Rift to evaluate human factors and catch engineering mistakes early. For the Orion Multi-Purpose Crew Vehicle and GPS III satellite system, Lockheed ran virtual simulations in which a team of engineers rehearsed assembling the vehicles in order to identify issues and improvements. A network platform allows engineers from all over to participate, saving the time and money of traveling.

Last but not least, Lockheed Martin is also actively developing and testing commercial industrial exoskeletons. Keith Maxwell, the Senior Product Manager of Exoskeleton Technologies at Lockheed, attested to this at the Spring 2017 EWTS. The FORTIS exoskeleton is an unpowered, lightweight suit, the arm of which – the FORTIS Tool Arm – is available as a separate product for operating heavy power tools with less risk of muscle fatigue and injury.


While Augmented Reality has been around for decades in the form of pilots’ HMDs, only now has the technology advanced enough to become a standard tool of engineers, mechanics and aircraft operators across aviation and aerospace operations. In a high-tech industry like aerospace, AR/VR are critical for keeping up production during a mass talent exodus from the workforce. Workers won’t need years of experience to build a plane if they have on-demand access to instructions, reference materials, tutorials and expert help in their field of view.

 

The Fall Enterprise Wearable Technology Summit 2017, taking place October 18-19, 2017 in Boston, MA, is the leading event for wearable technology in enterprise. It is also the only true enterprise event in the wearables space, with the speakers and audience members hailing from top enterprise organizations across the industry spectrum. Consisting of real-world case studies, engaging workshops, and expert-led panel discussions on such topics as enterprise applications for Augmented and Virtual Reality, head-mounted displays, and body-worn devices, plus key challenges, best practices, and more, EWTS is the best opportunity for you to hear and learn from those organizations who have successfully utilized wearables in their operations.

Top 3 Applications for Wearable Technology and Augmented Reality in the Automotive Industry

Written by Special Guest Blogger Randy Nunez, Tech Trend Lead, Extended Reality IT Enterprise Technology – Research, Ford Motor Company

As wearables become more pervasive in the consumer market, their use in the enterprise will also expand.  While wearable solutions such as fitness bands and smartwatches can be useful, I think that smart eyewear will have a greater impact in the business environment than for consumers.  Providing information on demand in a hands-free format is a powerful capability that smart eyewear brings to the workplace.

Outlined here are what I consider the top three use cases for wearable technology/AR in the automotive industry.  In this case the target audience is the employee or contractor within the organization, so these use cases could apply to other industries as well.

  1. Guided instructions

Adding digital or virtual content while in the real world to provide step-by-step instructions for procedures or workflows is a key use case.  This information could be as simple as text, images, or videos in monocular eyewear that is ‘glanceable’.  In certain environments like the plant floor or a warehouse facility, having a less immersive solution, sometimes called assisted reality, enables the information to be provided while the wearer maintains awareness of the environment around them.  A more immersive solution, typically for more stationary activities, can use binocular eyewear and augmented or mixed reality.  This can provide digital information overlaid on a real-world object or an ‘underlay’ that offers an x-ray-like view into parts and subsystems within a fully assembled product.  Some use case examples include parts picking, inspections, assembly/disassembly and repairs.

  2. See-What-I-See/Remote expert

Using smart eyewear in conjunction with video/audio collaboration software can connect local users with remote experts to provide real-time guidance.  One advantage of smart eyewear is its hands-free nature, which allows the local user to continue to work.  Some systems allow screen annotations to provide better visual cues for both parties. This can range from low-tech (marking up an image) to high-tech (a 3D annotation anchored in space that stays locked in position even as you change your view).  It has the potential to reduce travel and its associated expenses and decrease the time to resolve issues.  Some use case examples include facility or program launch/decommissioning and dealership service support.

  3. Design visualization

Visual 3D representation of a vehicle design can include physical prototypes made of clay or wood.  These can be expensive and time-consuming to create and modify.  Virtual reality can be used as an effective immersion tool, but augmented reality can add yet another dimension of realism to the process.  A vehicle buck (physical mock-up) could be overlaid with digital content, allowing the comparison of various designs and enabling real-time changes in attributes (color, size, etc.).  Digital ‘comments’ from a reviewer could be recorded in audio and text formats and anchored in the exact 3D location for later reference.  Use cases include the design process itself as well as design reviews.

While smart eyewear/AR technologies are still nascent, there is tremendous potential to change the way we work.  I also believe there are use cases for both monocular and binocular smart eyewear as well as the spectrum of augmented reality, from assisted to mixed.

 

*Randy will be speaking on a panel discussion around the applications for smart glasses and other head-worn devices in enterprise on Thursday, May 11, 2017 at the Spring Enterprise Wearable Technology Summit 2017 in San Diego, CA.

3 Great Use Cases of Wearable Tech for EHS

According to the most recent data from the International Labor Organization, every 15 seconds a worker dies from a work-related accident or disease. On top of 2.3 million deaths per year from occupational accidents, over 313 million workers suffer non-fatal work injuries. The great human cost also has an economic impact: For employers, on-the-job accidents cost billions of dollars annually due to production downtime and workers’ compensation fees.

Can technology help prevent work-related accidents and diseases? Many workplace injuries are preventable through real-time monitoring of workers. After all, connected workers – aware of (and sensed by) their environment through IoT technologies – are inherently safer.

Wearable technology can greatly improve workplace safety. For example,

  • Smart bands and sensors embedded in clothing and gear can be used to monitor workers’ health and wellbeing by tracking such factors as heartrate, respiration, heat stress, fatigue and exposure. Notifications can be sent to workers’ wearable devices when critical levels are reached.
  • Machine and environmental sensors can provide contextual information to field workers to help keep them informed and aware of their surroundings; and wearable GPS tracking can ensure they keep out of hazardous areas.
  • Smart glasses and other HUDs allow employees to access work instructions and manuals in the field, in addition to enabling remote guidance. This aids their productivity and makes them safer, since accuracy (doing a job correctly) and safety go hand-in-hand.
  • Camera-equipped wearables can also be used to document a job or incident for later review. Such data can be utilized for safety training and to identify safety issues in the work environment.

In addition to providing real-time safety information and alerts to workers, wearable devices make for a safer workplace simply by the way in which they are used, i.e. hands-free. There are some great real-world use cases of wearable technology for environmental health and safety. Read on to learn how three major enterprises are using wearables of different form factors to augment their safety efforts:

North Star Bluescope Steel

This steel producer is working with IBM on developing a cognitive platform that taps into IBM Watson Internet of Things technology to keep employees safe in dangerous environments.

The IBM Employee Wellness and Safety Solution gathers and analyzes sensor data collected from smart helmets and wristbands to provide real-time alerts to workers and their managers. If a worker’s physical wellbeing is compromised or safety procedures aren’t being followed, preventative measures can be taken.

North Star is using the solution to combat heat stress, collecting data from a variety of sensors installed to continuously monitor a worker’s skin temperature, heart rate, galvanic skin response and activity level, along with the temperature and humidity of the work environment. If temperatures rise to unsafe levels, the technology provides safety guidelines to each employee based upon his or her individual metrics. For instance, the solution might advise an at-risk worker to take a 10-minute break in the shade.

With the IBM Employee Wellness and Safety Solution, data flows from the worker to the IBM Watson IoT platform and then to a supervisor for intervention/prevention. Watson can detect hazardous combinations in the wearable sensor data, like high skin temperature plus a raised heart rate and lack of movement (indicating heat stress), and notify the appropriate person to take action. This same platform could be used to prevent excessive exposure to radiation, noise, toxic gases and more.
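
A toy version of that multi-signal rule might look like the sketch below. The thresholds are illustrative guesses, not IBM’s actual model, but they show how combining skin temperature, heart rate and movement data yields a more reliable signal than any one reading alone.

```python
# Toy multi-signal heat stress rule. Thresholds are illustrative assumptions,
# not IBM Watson IoT's actual model.
def heat_stress_suspected(skin_temp_c, heart_rate_bpm, movement_index):
    """movement_index: 0.0 (still) to 1.0 (very active), from an accelerometer."""
    return skin_temp_c > 38.0 and heart_rate_bpm > 110 and movement_index < 0.2


def route_alert(worker_id, reading):
    # Combine the signals and notify a supervisor only when all point to risk.
    if heat_stress_suspected(**reading):
        print(f"Alert supervisor: {worker_id} may be experiencing heat stress")
    else:
        print(f"{worker_id}: readings normal")


route_alert("worker-31", {"skin_temp_c": 38.6, "heart_rate_bpm": 124, "movement_index": 0.05})
route_alert("worker-17", {"skin_temp_c": 36.9, "heart_rate_bpm": 92, "movement_index": 0.6})
```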

John Deere

John Deere, best known as a manufacturer of agricultural equipment and machinery, is using Virtual Reality headsets to evaluate and assess the “assembly feasibility” of new machine designs. Performing ergonomic evaluations in VR improves the safety of production employees by revealing the biomechanics of putting a proposed machine together. High risk processes can be identified and corrected before they pose a problem for the assembler on the shop floor.

In one of these VR reviews at John Deere, an operator puts on a headset and becomes completely immersed in a virtual production environment. Reviewers can see what the operator sees and determine whether a potential design is safe to manufacture. They can see all the safety aspects that would go into assembling the product, including how the worker’s posture would be affected, whether there is a chance of physical injury, what kinds of tools would be required, etc.

John Deere believes VR-aided design evaluations can result in less fatigue, fewer accidents, and greater productivity for its manufacturing team, and the method has already proven effective in reducing injuries at the company. Learn more about this use case at EWTS 2017, where Janelle Haines, Ergonomic Analyst and Biomedical Engineer at John Deere, will participate in an interactive workshop on “Leveraging Virtual Reality in the Enterprise.”

National Grid

The electricity and gas utility company is exploring wearable tech for lone worker health and safety. National Grid believes wearables can have multiple advantages in the workplace, including improving safety as well as speeding up repairs and reducing costs. The ngLabs team is responsible for looking at the latest technologies; in one of its first projects, the team is focusing on the lone worker:

The project uses interactive wristbands developed by Microsoft to monitor the health, safety and wellbeing of workers who operate alone or remotely. The smart bands track location, measure vital statistics like heart rate, and enable remote/lone workers to send a signal to colleagues when they’ve arrived on site or checked out without having to make a call or fill out paperwork. Information is captured quickly, making it easier to spot problems and send alerts if something goes wrong.

Hear more about this use case in San Diego this May—David Goldsby, Technology Innovation Manager at National Grid, will present a case study on “Digital Disruption and Consumerization in Utilities” at EWTS ’17.

 

About EWTS 2017:

The 3rd annual Enterprise Wearable Technology Summit 2017, taking place May 10-12, 2017 in San Diego, California, is the leading event for wearable technology in enterprise. It is also the only true enterprise event in the wearables space, with the speakers and audience members hailing from top enterprise organizations across the industry spectrum. Consisting of real-world case studies, engaging workshops, and expert-led panel discussions on such topics as enterprise applications for Augmented and Virtual Reality, head-mounted displays, and body-worn devices, plus key challenges, best practices, and more, EWTS is the best opportunity for you to hear and learn from those organizations who have successfully utilized wearables in their operations.