Net-recovery guidance technology for shipborne UAVs based on multi-source information fusion

To address the low pose-estimation accuracy and poor anti-jamming performance in the net recovery of small shipborne UAVs, a multi-sensor navigation scheme based on vision/IMU/GPS fusion is proposed. First, an integrated multi-sensor fusion navigation scheme for net-recovery guidance is designed: GPS alone is used for long-distance guidance, while vision/IMU/GPS fusion guidance is used in the approach phase. Visual and IMU information are fused by an extended Kalman filter to obtain attitude information, and GPS and visual information are fused by a linearly constrained least-squares method to obtain optimal position information. The optimal pose obtained in this way improves the guidance accuracy and the success rate of net recovery. Simulation results show that the guidance deviation is reduced by 30% compared with unfused data.


Introduction
Because of their low cost, ease of use, and ease of maintenance, small shipborne UAVs are increasingly popular. However, this kind of UAV has disadvantages such as light weight, small inertia, low flight speed, and poor stability. When the ship is affected by wind, waves, and other complex and harsh recovery conditions, conventional runway-style recovery is difficult and prone to accidents. Net recovery is therefore an attractive alternative, but it requires the UAV's guidance law to have high precision and strong anti-interference ability.
In the approach and recovery phases of net recovery, the accuracy of the UAV's position and attitude navigation is critical. If the position or attitude deviates, the UAV may miss the recovery net, creating a collision risk, or even stray far from the recovery device, causing a recovery accident. Single-sensor landing navigation has been studied extensively; each sensor has its own advantages but also significant shortcomings. The IMU has good dynamic performance, but its accumulated errors make it unsuitable for long-term high-precision navigation; GPS has a large working range but its signal is easily jammed; visual information is more accurate but easily affected by the surrounding environment. Integrated navigation technology has therefore attracted attention in recent years. However, multi-sensor fusion guidance is rarely used in the net recovery of small shipborne UAVs. In this paper, visual and inertial information are fused by an extended Kalman filter to obtain the UAV's attitude, and GPS and visual information are fused by a least-squares method to obtain its position, so as to guide the net-recovery process more accurately.

Vision/GPS/IMU multi-sensor guidance scheme
During net recovery of a UAV, the navigation accuracy requirements differ between stages. According to whether the UAV has detected the target and its distance from the recovery net, a staged multi-sensor fusion guidance scheme is adopted in this paper. The net-recovery process of the shipborne UAV is shown in the figure below. First, GPS guides the UAV to the landing area. When the vision sensor detects the ship and its recovery device, the UAV enters the approach stage, where the requirements on navigation accuracy and guidance-law accuracy become higher, and vision/GPS/IMU sensor fusion is used to obtain more accurate navigation data. GPS-vision fusion provides accurate position information so that the UAV can precisely reach the recovery device, while vision-IMU fusion provides more accurate attitude information, improving attitude-control accuracy during terminal guidance so that the UAV enters the recovery net safely.

Attitude data fusion based on extended Kalman filter
The IMU provides acceleration and angular-velocity measurements. Fusing visual information with IMU information compensates for the failure of camera data during rapid motion, and also reduces the data drift inherent in the IMU hardware during normal operation. Vision-IMU information fusion can be divided into loose coupling and tight coupling. In tight coupling, the raw measurements of all sensors are processed jointly to produce a single attitude estimate; the resulting data are highly accurate, but the algorithm is complex. In loose coupling, each sensor independently computes an attitude estimate through its own attitude solution and data processing, and the outputs of the sensors are then fused; this approach is simple, easy to implement, and highly extensible. In this paper, a loosely coupled method is used: the six-degree-of-freedom pose estimates obtained from vision and from the IMU are fused by an extended Kalman filter. The algorithm flow chart is shown in the figure below.

IMU error model
The IMU consists of an accelerometer and a gyroscope: the accelerometer measures the specific force (acceleration) of the body, and the gyroscope measures its angular velocity. In use, the IMU is affected by noise, and the sensor itself drifts. The measurement model of the IMU is therefore

$$\tilde{a} = a + b_a + \varepsilon_a, \qquad \tilde{\omega} = \omega + b_\omega + \varepsilon_\omega$$

where $b_a$ and $b_\omega$ are the accelerometer and gyroscope biases, $\varepsilon_a$ is the specific-force measurement error of the accelerometer, and $\varepsilon_\omega$ is the angular-rate measurement error of the gyroscope.
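As a minimal sketch, the measurement model above can be simulated as follows; the bias and noise magnitudes are illustrative assumptions, not calibrated values from this paper:

```python
import numpy as np

def simulate_imu(true_accel, true_gyro, accel_bias, gyro_bias,
                 accel_noise_std=0.02, gyro_noise_std=0.001, rng=None):
    """IMU measurement model: measured = true + constant bias + white noise."""
    if rng is None:
        rng = np.random.default_rng(0)
    accel_meas = true_accel + accel_bias + rng.normal(0.0, accel_noise_std, 3)
    gyro_meas = true_gyro + gyro_bias + rng.normal(0.0, gyro_noise_std, 3)
    return accel_meas, gyro_meas

# Example: steady flight with small constant biases (values are illustrative)
a_m, w_m = simulate_imu(np.array([0.0, 0.0, -9.81]), np.zeros(3),
                        accel_bias=np.array([0.05, -0.02, 0.01]),
                        gyro_bias=np.array([1e-3, -5e-4, 2e-4]))
```

The bias terms are what the filter's state vector later estimates; the white-noise terms enter the filter as process noise.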

Fusion algorithm based on extended Kalman filter
The Kalman filter applies to linear Gaussian systems, but many real systems are nonlinear; the extended Kalman filter (EKF) extends it to the nonlinear case by linearizing about the current estimate. Its core recursion consists of five steps:

1. State prediction: $\hat{x}_{k|k-1} = f(\hat{x}_{k-1})$
2. Covariance prediction: $P_{k|k-1} = \Phi_k P_{k-1} \Phi_k^{T} + Q_k$
3. Kalman gain: $K_k = P_{k|k-1} H_k^{T} \left( H_k P_{k|k-1} H_k^{T} + R_k \right)^{-1}$
4. State update: $\hat{x}_k = \hat{x}_{k|k-1} + K_k \left( z_k - h(\hat{x}_{k|k-1}) \right)$
5. Covariance update: $P_k = (I - K_k H_k) P_{k|k-1}$

The EKF thus divides into two processes: prediction and update. In the vision-inertial fusion system, the prediction process is driven by the inertial navigation equations, and the pose obtained from vision serves as the observation that drives the update process.
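As an illustration, the five steps above can be sketched in Python on a simplified linear toy model; the matrices, dimensions, and noise values below are illustrative assumptions, not the paper's actual flight model:

```python
import numpy as np

def ekf_step(x, P, F, Q, H, R, z):
    """One predict/update cycle of the (here linearized) Kalman filter."""
    # 1. State prediction
    x_pred = F @ x
    # 2. Covariance prediction
    P_pred = F @ P @ F.T + Q
    # 3. Kalman gain
    S = H @ P_pred @ H.T + R               # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)
    # 4. State update with the innovation z - H x_pred
    x_new = x_pred + K @ (z - H @ x_pred)
    # 5. Covariance update
    P_new = (np.eye(len(x)) - K @ H) @ P_pred
    return x_new, P_new

# Toy 1-D example: state = [position, velocity]; the "vision" sensor
# observes position only, while the motion model plays the INS role.
dt = 0.1
F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-velocity transition
Q = 1e-4 * np.eye(2)                    # process noise
H = np.array([[1.0, 0.0]])              # observe position only
R = np.array([[0.01]])                  # vision measurement noise
x, P = np.array([0.0, 1.0]), np.eye(2)
x, P = ekf_step(x, P, F, Q, H, R, z=np.array([0.12]))
```

In the full system of this paper, $f$ and $h$ are nonlinear and $\Phi_k$, $H_k$ are their Jacobians, but the structure of the recursion is the same.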
Suppose the state equation of the system is $x_k = f(x_{k-1}) + w_k$ with observation $z_k = h(x_k) + v_k$; linearizing these equations yields the state transition matrix $\Phi_k$ and the state observation matrix $H_k$. The state variables of the INS include position, attitude, velocity, accelerometer bias, and gyroscope bias. The nominal state of the inertial system is therefore

$$X = \begin{bmatrix} p^{T} & q^{T} & v^{T} & b_a^{T} & b_\omega^{T} \end{bmatrix}^{T}$$

and the error state is

$$\delta X = \begin{bmatrix} \delta p^{T} & \delta\theta^{T} & \delta v^{T} & \delta b_a^{T} & \delta b_\omega^{T} \end{bmatrix}^{T}$$

where the process-noise matrix $Q$ collects the accelerometer and gyroscope noise covariances. With the state transition matrix and error covariance matrix obtained above, the prediction process of the Kalman filter can be completed.
The pose estimated by monocular vision is used as the observation to update the pose estimate:

$$z = \begin{bmatrix} p_c^{w} \\ q_c^{w} \end{bmatrix}$$

where $p_c^{w}$ and $q_c^{w}$ are the camera position and orientation quaternion in the world frame. The gain matrix $K_k = P_{k|k-1} H_k^{T} \left( H_k P_{k|k-1} H_k^{T} + R_k \right)^{-1}$ is computed first, and then the state vector and covariance matrix of the system are updated:

$$\hat{x}_k = \hat{x}_{k|k-1} + K_k \left( z_k - H_k \hat{x}_{k|k-1} \right), \qquad P_k = (I - K_k H_k) P_{k|k-1}$$

Guidance data fusion based on linearly constrained least squares
The position fusion of GPS and vision also adopts loose coupling, using the least-squares method to fuse the data of the two sensors; least squares minimizes the error and finds the best functional fit to the data, so the fused data are highly accurate. With two sensors, the measurement equation of the system is

$$x_i(t) = s(t) + n_i(t), \quad i = 1, 2$$

and the fused output is $z(t) = w^{T} x(t)$, where $w$ is the weight vector; the best fusion is obtained by solving for the optimal $w$. The optimal weight coefficients are those that minimize $E[(z(t) - s(t))^{2}]$.
Suppose $w$ satisfies $w^{T} a = 1$ with $a = [1, 1]^{T}$, and $n(t)$ is zero-mean Gaussian white noise. Then from

$$E[(z(t) - s(t))^{2}] = E[(w^{T} n(t))^{2}] = w^{T} E[n(t) n(t)^{T}] w = w^{T} E[x(t) x(t)^{T}] w - E[s(t)^{2}]$$

it can be deduced that minimizing $E[(z(t) - s(t))^{2}]$ is equivalent to minimizing $w^{T} E[x(t) x(t)^{T}] w$ subject to $w^{T} a = 1$. In practice, however, the covariance of $x(t)$ is not easy to obtain, so the sample mean of $x(t) x(t)^{T}$ is used instead:

$$\hat{R} = \frac{1}{N} \sum_{t=1}^{N} x(t) x(t)^{T}, \qquad w = \frac{\hat{R}^{-1} a}{a^{T} \hat{R}^{-1} a}$$

As the number of measurements tends to infinity, this linearly constrained least-squares fusion solution converges to the minimum-variance fusion solution with probability 1.
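The weight computation above can be sketched as follows; the two-channel setup, the test signal, and the noise levels (GPS noisier than vision) are illustrative assumptions, not data from this paper:

```python
import numpy as np

def fusion_weights(samples):
    """Linearly constrained least-squares weights w = R^-1 a / (a^T R^-1 a),
    with R approximated by the sample mean of x(t) x(t)^T and a = [1, ..., 1]^T."""
    R_hat = samples.T @ samples / len(samples)   # sample mean of x(t) x(t)^T
    a = np.ones(samples.shape[1])
    R_inv_a = np.linalg.solve(R_hat, a)
    return R_inv_a / (a @ R_inv_a)               # enforces w^T a = 1

rng = np.random.default_rng(1)
s = np.sin(0.05 * np.arange(2000))               # common true signal s(t)
gps = s + rng.normal(0.0, 0.5, 2000)             # noisier position channel
vision = s + rng.normal(0.0, 0.1, 2000)          # more accurate position channel
x = np.column_stack([gps, vision])
w = fusion_weights(x)
fused = x @ w
```

The solution weights each channel roughly in inverse proportion to its noise variance, so the more accurate channel dominates the fused position estimate.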

Simulation verification
Because GPS can only measure the UAV's position and cannot provide its attitude, GPS plus IMU is taken as one set of navigation data, and pure visual guidance as another, for comparison against the fused data. The glide recovery phase toward the ship is simulated. Figures 3 to 5 show the UAV's height above the target net point during recovery guidance, its deviation from the X axis of the commanded guidance trajectory, and its deviation from the Y axis. The figures show that after data fusion the deviation is reduced by 30% and the navigation accuracy is improved.

Conclusion
Precise navigation of small shipborne UAVs is the key to net recovery technology. Because single-sensor guidance yields low recovery accuracy, this paper designed a multi-sensor fusion navigation and guidance scheme, using extended Kalman filtering and least squares to fuse multi-sensor data and exploit the advantages of each sensor to obtain more accurate navigation information. Finally, simulation verifies that this method improves the accuracy of the navigation data and has practical engineering value.