Framework for the Simulation of an Aircraft Final Assembly Line

The EU ACCLAIM project, part of the European Clean Sky 2 activities, aims to improve the aircraft assembly process. The SIMFAL H2020 Clean Sky 2 project is part of the ACCLAIM project; its objective is to analyse, plan and optimize automated assembly tasks of cabin and cargo interior parts, with human workers and machines (lightweight robots and AGVs) coexisting. The SIMFAL framework integrates VR and AR systems to help the user work in limited spaces in collaboration with automation systems. The VR system helps the user visualize different assembly processes in an immersive environment and evaluate them in terms of time and ergonomics to choose the best one. The output of this system will feed the AR system, which will use the best process to guide the user through the assembly tasks, showing context-sensitive information about the tasks and the environment through a help assistant. This paper focuses on the first results of the implementation of a VR simulation of aircraft assembly tasks, based on the SIMFAL framework design. The proof of concept is tested with a simulation of six real scenarios and ergonomics experiments already carried out in Airbus facilities.


Introduction
Currently, the high demand and quality standards of products make the manufacturing process more complicated for workers and industries. Corporations need to respond to more complex and demanding processes in order to be efficient and economically competitive. To cope with these requirements, the manufacturing industry needs to adapt its processes and manufacture products at minimal cost and time-to-market while balancing cost, time, quality and flexibility. As a result, the demand on workers to learn more operations simultaneously and to learn new products more often increases [1,2].
As manufacturing processes get more complex and demanding, workers are less capable of reacting to sudden changes; this leads to negative effects associated with manual or statically automated work, for instance, quick operator fatigue, impaired decision making, degradation of manual skills, lower accuracy and longer task execution times, among others [3].
The increasing use of automated systems has helped to improve the productivity of industries as well as the health and safety of workers. Therefore, a future aircraft factory would be incomplete if automation technologies were not considered. This future is oriented towards more flexible and adaptable systems, allowing shorter cycles, environmental friendliness, energy efficiency and higher integration; these are the Clean Sky 2 project goals [4].
Nowadays, many aircraft manufacturing processes are still done mainly by hand; moreover, many assembly operations are currently performed under non-ergonomic conditions, and the process chains are complex and insufficiently transparent [5].
In this paper, we present the first advances of the SIMFAL (SIMulation of an Aircraft Final Assembly Line) project, part of the ACCLAIM (Automated Cabin and Cargo Lining and hAtrack Installation Method) H2020 Clean Sky 2 project, to automate, coordinate and improve the aircraft assembly process of cabin and cargo parts with Human-Robot Coexistence (HRC) (Fig. 1). This paper is organized as follows: Section 2 gives an overview of the research in the area of VR/AR simulation of assembly tasks. Section 3 describes the SIMFAL framework, which combines Virtual Reality (VR) and Augmented Reality (AR) systems to enhance the assembly process for aircraft factories. The development of the VR simulation environment to test different assembly scenarios and HRC levels is explained in Section 4, while the VR simulations and the ergonomics assessment tool are shown in Sections 5 and 6, respectively. Finally, Section 7 presents the conclusions and future work.

Related Work
One area that has leveraged the development of VR/AR simulation technologies is industrial assembly. We review the research in this area with VR and AR technologies, divided according to the agents involved in the assembly process (human-driven or HRC).
As part of the VR human-driven assembly research, [6] focuses on the development of a simulation in Unity for riveting the wing of a plane, using the Oculus Rift and Microsoft Kinect II; it also performs an ergonomics assessment using the REBA and RULA protocols. In [7], a semi-immersive virtual environment (VE) to assemble an airplane turbine engine was developed. In [8], a VR workshop for assembly planning with VR assistance was proposed.
In the field of Human-Robot (HR) collaborative assembly in VR, [9] addresses the advantages, disadvantages and open questions regarding the use of VR for the simulation of HRC in production engineering, and explains the development of a virtual collaborative robot manipulator in Unity that can be controlled via the HTC Vive. In [10], the coupling of a real robot manipulator with the virtual one was achieved. BeWare of the Robot v1.0 and v2.0 [11,12] present a VR training system developed using the Unity engine, Oculus Rift DK2 and Microsoft Kinect, where assembly tasks are performed by a robot and human collaboration is needed to remove the backing film of a piece and lay fabric into a mold. In [13], a VR teleoperation interface, built in Unity, for a ROS-based ground contact robot was developed.
The AR research in the area of human-driven assembly focuses on giving the operator instructions to assemble parts and detecting whether the parts are correctly assembled. [14] presents a simulation where an operator may assemble virtual parts over a real automobile engine; in [15], simple tasks such as picking, positioning, screwing and tightening small parts are achieved and tested; an AR training application to assemble a gully trap was developed in [16]; in [17], the concept of process value visualization in AR of relevant information of an assembly line is introduced. Finally, in [18], an algorithm to generate AR instructions based on the product and process semantics was developed and tested.
In the area of HRC in AR, [1] developed an Android system with Unity to build a toy car by following on-screen instructions with different automation levels.

The SIMFAL Framework
The main objective of SIMFAL is to analyse, plan and optimize assembly tasks of cabin and cargo interior parts with HRC. To do that, a VR/AR system to test alternative process scenarios is being developed. As can be observed in Fig. 2, the framework will enable the comparison (in terms of productivity, ergonomics, and other criteria) between the classical procedure and the simulations. The system covers the following parts:
- VR Environment: A training simulator of the aircraft environment that allows the worker to virtually assemble the interior parts of the aircraft with the collaboration of a robot in different automation modes. Thus, different assembly plans (100% of the work done by the worker, 100% done by the machine, or co-operation; and different process sequences) can be tested and evaluated in order to improve the process in terms of productivity and the ergonomic conditions of the worker. The best assembly plan will feed an AR system. Note that this VR environment for assembly planning is already implemented.
- AR Assistant: This application assists the worker while assembling the aircraft parts by displaying instructions and context-sensitive information about the environment and the automated devices in real time. The system will be able to track the parts and show visual cues to help the worker position them.
- Automation System: Comprises all the automated devices (such as lightweight robots, AGVs and exoskeletons) used to aid the assembly processes.
- Communication Middleware: So far, the VR application is a standalone Unity simulation of the aircraft environment, controlled by the worker through the HTC Vive controllers and voice commands.
For the AR application, an OPC-UA server will manage the data from all the automated devices (ROS nodes) to give the worker information about their position, rotation and status. The worker will communicate with the application through voice commands and hand gestures. To enable voice command recognition, the Microsoft Cortana voice recognition system will be integrated into the VR and AR systems.

The VR Simulation of Cabin and Cargo Assembly Tasks
The first milestone of the SIMFAL project was the VR simulation environment developed using the Unity engine. This section describes the workflow for importing the models into Unity, the configuration of the robot manipulators and the AGV, and the help assistant.

Workflow for Model Preparation
To create the simulation environment, the models needed were the aircraft parts (Fig. 3(a)) and the robots (Fig. 3(b)). Most of these models are created for industrial manufacturing using CAD data exchange formats, such as the STEP file format. The presence of non-visible shapes, for instance, the internal parts of a robot, has an impact on real-time visualization and interaction. To overcome this problem, these parts were deleted using the FreeCAD software. Afterwards, to avoid converting the STEP file to an intermediate format, which is time-consuming and may involve multiple formats and further steps [20], a Unity plugin for importing it directly was developed using the Open CASCADE library. This plugin makes it possible to control the level of detail and pose of the model, and to decide whether the conversion should create separate objects in Unity for each subassembly or a single one for the entire model.
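The separate-objects-versus-single-object decision can be illustrated with a minimal sketch. The data model below (node names, the `visible` flag, the `collect` helper) is hypothetical and stands in for the actual Open CASCADE assembly tree handled by the plugin:

```python
# Hypothetical sketch: a STEP assembly as a tree of named shapes. The importer
# can either emit one Unity object per visible leaf shape, or merge the whole
# model into a single object.

class Node:
    def __init__(self, name, children=None, visible=True):
        self.name = name
        self.children = children or []
        self.visible = visible  # non-visible internals are dropped, as in the FreeCAD step

def collect(node, separate):
    """Return the names of the objects to create in Unity."""
    if not node.visible:              # prune internal, never-seen geometry
        return []
    if not separate:                  # merge the entire model into one object
        return [node.name]
    if not node.children:             # leaf shape -> its own object
        return [node.name]
    out = []
    for child in node.children:
        out.extend(collect(child, separate))
    return out

robot = Node("robot", [
    Node("base"),
    Node("gearbox", visible=False),   # internal part, deleted before import
    Node("arm", [Node("link1"), Node("link2")]),
])

print(collect(robot, separate=True))   # one object per visible leaf shape
print(collect(robot, separate=False))  # single merged object
```

Separate objects keep subassemblies individually selectable (useful for attaching scripts to robot links), while a merged object reduces draw calls for static scenery.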

Navigation System for the AGV
For the virtual AGV, a controller was developed to take care of the movement and the ability to attach the racks. For movement control, we leveraged Unity's built-in pathfinding system so that the AGV intelligently moves through the cabin to reach different target places without colliding with the objects in the scene.
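Unity's NavMesh internals are not exposed, but the underlying idea of collision-free goal seeking can be sketched with A* search on an occupancy grid (the grid, coordinates and heuristic below are illustrative, not the actual NavMesh implementation):

```python
import heapq

def astar(grid, start, goal):
    """A* on a 4-connected occupancy grid (0 = free, 1 = obstacle)."""
    rows, cols = len(grid), len(grid[0])
    frontier = [(0, start)]
    came, cost = {start: None}, {start: 0}
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:
            break
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nxt = (cur[0] + dr, cur[1] + dc)
            if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt[0]][nxt[1]] == 0:
                new = cost[cur] + 1
                if nxt not in cost or new < cost[nxt]:
                    cost[nxt] = new
                    h = abs(goal[0] - nxt[0]) + abs(goal[1] - nxt[1])  # Manhattan heuristic
                    heapq.heappush(frontier, (new + h, nxt))
                    came[nxt] = cur
    path, node = [], goal       # walk back from the goal to reconstruct the path
    while node is not None:
        path.append(node)
        node = came[node]
    return path[::-1]

# Toy cabin layout: a wall with a single passage at column 2.
cabin = [[0, 0, 0, 0],
         [1, 1, 0, 1],
         [0, 0, 0, 0]]
print(astar(cabin, (0, 0), (2, 0)))  # routes around the wall through (1, 2)
```

Unity's NavMesh performs the equivalent search on a navigation mesh baked from the scene geometry, so the AGV controller only has to supply a destination.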

Robot Manipulators
After importing the models of the automation systems, they are arranged as hierarchical objects, and then a script with inverse kinematics algorithms is added to each one to simulate its real movement.
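The inverse kinematics scripts attached to the manipulators are Unity/C# code, but the core computation can be sketched for the simplest case, a planar 2-link arm with an analytic elbow-down solution (link lengths and target below are illustrative; the actual robots have more degrees of freedom):

```python
import math

def ik_2link(x, y, l1, l2):
    """Analytic inverse kinematics for a planar 2-link arm (elbow-down branch)."""
    d2 = x * x + y * y
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)  # law of cosines for the elbow
    if not -1.0 <= c2 <= 1.0:
        raise ValueError("target out of reach")
    t2 = math.acos(c2)
    t1 = math.atan2(y, x) - math.atan2(l2 * math.sin(t2), l1 + l2 * math.cos(t2))
    return t1, t2

def fk_2link(t1, t2, l1, l2):
    """Forward kinematics, used here to verify the IK solution."""
    x = l1 * math.cos(t1) + l2 * math.cos(t1 + t2)
    y = l1 * math.sin(t1) + l2 * math.sin(t1 + t2)
    return x, y

t1, t2 = ik_2link(1.0, 1.0, 1.0, 1.0)
print(fk_2link(t1, t2, 1.0, 1.0))  # recovers the target, approximately (1.0, 1.0)
```

In the simulation, the joint angles produced by such a solver drive the rotations of the hierarchical objects, so the virtual robot moves as the real one would.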

Help Assistant
A help assistant (Fig. 4) was developed to guide a worker through the assembly process, informing the worker about the following topics: the assembly task to perform, warnings/alerts, the automated devices' status, ergonomics alerts and other system status (for instance, HoloLens battery or network connection). It incorporates a message manager with a queue to temporarily store the messages of the tasks and devices, and provides an emergency interface to be called in case of malfunction, impossibility to accomplish a task, and similar situations. This assistant was adapted to the visualization system used (HTC Vive or Microsoft HoloLens), as they have different specifications, for instance, the field of view, and it is displayed on request through a voice command whenever the worker needs it.
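The message manager behaviour can be sketched as a priority queue, so that emergencies preempt routine task and status messages. The message kinds and priority ordering below are illustrative assumptions, not the actual SIMFAL implementation:

```python
import heapq
import itertools

# Hypothetical priority ordering: lower number = shown first.
PRIORITY = {"emergency": 0, "warning": 1, "task": 2, "status": 3}

class MessageManager:
    """Temporarily stores assistant messages; higher-priority ones surface first."""
    def __init__(self):
        self._heap = []
        self._seq = itertools.count()  # preserves FIFO order within a priority level

    def post(self, kind, text):
        heapq.heappush(self._heap, (PRIORITY[kind], next(self._seq), kind, text))

    def next_message(self):
        """Pop the next message to display, or None if the queue is empty."""
        if not self._heap:
            return None
        _, _, kind, text = heapq.heappop(self._heap)
        return kind, text

mgr = MessageManager()
mgr.post("task", "Place sidewall panel 3")
mgr.post("status", "HoloLens battery at 40%")
mgr.post("warning", "AGV approaching your position")
print(mgr.next_message())  # the warning jumps ahead of the queued task and status
```

An emergency interface can then simply post a message of the highest-priority kind, which is guaranteed to be the next one shown.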

VR Scenarios
Six assembly scenarios (chosen because of the ergonomics problems they currently present) have been simulated: the fully-automated assembly of panels, hatracks and linings, and the human-robot assembly of the same parts. Three automated devices are used: (1) the KUKA LBR iiwa 14 lightweight robot to help in assembling the sidewall panels and linings, (2) an empowering arm (a serial manipulator with one passive degree of freedom and three actuated degrees of freedom) to help in assembling the hatracks, and (3) the HStar AMP-S AGV.
In the fully-automated scenarios, the AGV moves the lightweight robot and the first shelf with the sidewall panels to the cabin, where each sidewall panel is placed; when the shelf is empty, the AGV fetches the next one until all the sidewall panels are placed. The process for placing the linings in the cargo is the same. To place the hatracks, the empowering arm is used.
The human-robot assembly scenarios are similar to the fully-automated ones but with HRC. This means that, for the sidewall panels and linings, the lightweight robot grabs and places them while the human pushes them to make the final adjustments. For the hatrack assembly, the human controls the empowering arm for lifting and placing.

Ergonomics Assessment
Ergonomics is one of the goals of the SIMFAL project because, currently, operators need to assemble parts in small spaces while adopting non-ergonomic postures (Fig. 5). Some parts are heavy and therefore difficult to lift, which may result in injuries or even accidents. The main idea of the ergonomics assessment is to detect how these tasks are currently performed, how they would be performed with automation in VR, compare the scenarios and improve the process.

Ergonomics Assessment Tool
To achieve the ergonomics assessment, a tool for automatic analysis and visualization of motion data was developed to integrate data captured from Noitom's Perception Neuron MoCap system into the VR platform used in the SIMFAL project. This ergonomics tool incorporates the Ovako Working posture Analysis System (OWAS), which evaluates physical stress during a job task [22], and allows visualizing the worker's movements and the risk of the current posture by calculating it instantly and color-encoding the avatar, as shown in Fig. 6. OWAS establishes 252 posture combinations according to the position of the back (4 possibilities), arms (3 possibilities), legs (7 possibilities) and load magnitude (3 possibilities); each combination is codified and assigned one of four risk categories (Fig. 7), from category 1 (no corrective actions needed) to category 4 (corrective actions needed immediately).
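The instantaneous risk computation amounts to a table lookup on the 4-digit posture code. The sketch below illustrates the mechanism only: the lookup table contains a few hypothetical example entries, not the published 252-combination OWAS table, and the color mapping is an assumption about the avatar encoding:

```python
# Illustrative sketch: an OWAS posture is a code (back, arms, legs, load).
# EXAMPLE_RISK holds a few made-up entries for demonstration; real category
# values must be taken from the published OWAS method.
EXAMPLE_RISK = {
    (1, 1, 2, 1): 1,  # e.g. straight back, arms down, standing, light load
    (2, 1, 6, 1): 2,  # e.g. bent back while kneeling (as in dado panel assembly)
    (4, 3, 6, 2): 4,  # e.g. bent and twisted back, both arms up, kneeling
}

# Assumed avatar color encoding per risk category.
COLOR = {1: "green", 2: "yellow", 3: "orange", 4: "red"}

def risk(back, arms, legs, load):
    """Return (risk category, avatar color) for a posture code."""
    category = EXAMPLE_RISK.get((back, arms, legs, load))
    if category is None:
        raise KeyError("posture code not in this example subset")
    return category, COLOR[category]

print(risk(2, 1, 6, 1))  # kneeling with a bent back -> category 2, yellow avatar
```

Because the lookup is constant-time, the tool can re-evaluate the posture and recolor the avatar on every incoming MoCap frame.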

Airbus Experiments
A study of traditional assembly procedures for cabin interior parts in a FAL was carried out during a two-day stay at the Airbus plant in Hamburg. For this study, an A320 aircraft was available, and the assembly work of 5 workers with different characteristics (Table 1) was captured with Noitom's Perception Neuron MoCap system. The 18-neuron configuration of the MoCap was selected. It includes: 1 neuron for the head, 4 neurons for each arm (shoulder, arm, forearm and wrist), 2 neurons for the back, 3 neurons for each leg (thigh, shinbone and foot) and 1 neuron for a tool (not used). Each neuron contains a gyroscope and an accelerometer, with a response framerate of 120 fps. The motion data was captured at 10 frames per second, saved to a JavaScript Object Notation (JSON) file and then loaded into the ergonomics tool for analysis.

Preliminary results using the OWAS method are shown in Table 2. For each worker, the left column shows the number of repetitions of each task and the right column the associated color-encoded risk. For the dado panels, the risk for all workers is 2, as the task requires the workers to kneel. The sidewall panel has different risks because the workers had different assembly styles, the worst being a bent and rotated back. In the case of the hatrack, although it is the most complex and heaviest element to assemble, the risk is low because the task is done by three workers. A way to improve these results would be to also take other parameters into account, for example, the number of workers involved in the process. To assemble the PSU the workers are standing, so the risk is low. The ceiling panel has a greater risk for shorter workers, as they have to use a ladder and keep both arms up.
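Going from the 120 fps sensor rate to the 10 fps capture stored in JSON amounts to keeping every twelfth frame. The sketch below assumes a hypothetical frame layout (`fps` field plus a list of timestamped frames); the actual Perception Neuron export format differs:

```python
import json

# Hypothetical capture: one second of motion at the sensors' 120 fps rate.
raw = json.dumps({
    "fps": 120,
    "frames": [{"t": i / 120.0, "neurons": {}} for i in range(120)],
})

def downsample(capture_json, target_fps):
    """Keep every n-th frame to reduce the capture rate to target_fps."""
    data = json.loads(capture_json)
    step = data["fps"] // target_fps   # 120 // 10 = 12: keep one frame in twelve
    return data["frames"][::step]

frames = downsample(raw, 10)
print(len(frames))  # prints 10: one second of motion at 10 fps
```

At 10 fps the per-frame OWAS lookup still samples each held posture many times over, while keeping the JSON files and the analysis load an order of magnitude smaller.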

Conclusions and Future Work
This paper presents the first results of the SIMFAL project, whose objective is to analyse, plan and optimize automated assembly tasks of cabin and cargo interior parts, with human workers and automation systems coexisting in a limited space. A novel framework combining VR and AR with automation systems (lightweight robots and AGVs) has been proposed.
A VR simulation environment for cabin and cargo assembly tasks was developed in Unity; it allows testing and comparing multiple assembly processes that may have different levels of automation (fully-automated or half-automated). The HTC Vive and Perception Neuron MoCap systems are used to interact with the VR environment and to capture the motion data for ergonomics assessment, respectively.
A visual tool for the ergonomics assessment of the worker's postures, based on OWAS, was developed and explained.
In the near future, an AR system based on the Microsoft HoloLens will be developed. This system will aid the worker through the assembly process by using the help assistant and showing contextual cues on the screen. It will also be able to recognize and track the cabin and cargo assembly parts in real time according to their CAD models.
All the development made so far has been designed following modular principles, so most of the VR modules can be reused in the AR system as well.