Technological and cultural shifts that result in enhancements in manufacturing tend to increase complexity in products and processes. In turn, this complexity increases requirements in manufacturing and puts added pressure on organizations to squeeze out inefficiencies and lower costs where and when feasible.
This trend is acute in aerospace, where complexity, quality and safety requirements mean a large portion of final assembly must be done by humans. Corporations like AREA member Boeing are finding ways to improve assembly workflows by making tasks easier and faster to perform with fewer errors.
Sourced through Scoop.it from: thearea.org
At ARise ’15, Paul Davies of Boeing presented a wing assembly study conducted in collaboration with Iowa State University, showing dramatic differences in performance when complex tasks are performed using 2D work instructions versus Augmented Reality.
A Study in Efficiency
In the study, participants were divided into three groups and asked to assemble parts of a wing, a task requiring over 50 steps and nearly 30 different parts. Each group performed the task using one of three modes of work instruction:
A desktop computer screen displaying a work instruction PDF file. The computer was immobile and sat in the corner of the room away from the assembly area.
A mobile tablet displaying a work instruction PDF file, which participants could carry with them.
A mobile tablet displaying Augmented Reality software showing the work instructions as guided steps with graphical overlays. A four-camera infrared tracking system provided high-precision motion tracking for accurate alignment of the AR models with the real world.
Subjects assembled the wing twice. During the first attempt, observers measured first time quality (see below); the wing was then disassembled and participants reassembled it so researchers could measure the effect of each instruction mode on the learning curve.
Participants’ movements and activities were recorded using four webcams positioned around the work cell. In addition, participants wore a plastic helmet fitted with reflective tracker balls, allowing optical tracking of head position and orientation so that researchers could visualize how tasks were performed. Tracker balls were also attached to the tablet (in both AR and non-AR modes).