Design of control system and motion analysis of vision capture for small hexapod robot

Aiming at the difficulties in motion control and performance analysis of hexapod robots, an STM32 embedded processor is used to drive an 18-channel servo control board that controls the robot's motion. The CAMShift algorithm is designed to track the centroid of the moving robot and to calculate its speed. Straight-line and steering gait experiments and performance analyses were carried out on a self-developed prototype. The results show that the design integrates machine-vision tracking with the motion control of the hexapod robot and realizes both the control and the motion-analysis functions.

1 Design of motion control system

1.1 Hardware design
The small hexapod robot studied in this paper has 18 active degrees of freedom; the design is shown in Figure 1. When the robot moves on the platform, a large amount of data is transmitted [7]. The control system sends gait information over the serial TXD line to the servo control board so that the servos execute the set actions. The image captured by the robot's front camera is transmitted to the computer in real time through an FPV 5.8 GHz high-power video link. Through an HC-05 Bluetooth module, the STM32F103RCT6 communicates with the host-computer program running on the computer: the motion and environment information acquired by the STM32F103RCT6 main control board is uploaded to the host computer, while buttons on the host computer, or a connected XBOX controller, switch the robot between the motion modes set by the host computer. The overall block diagram of the control system is shown in Figure 2. The STM32F103RCT6 main control board receives the following sensor information: (1) real-time attitude data from the MPU6050 gyroscope; (2) obstacle distances from 2 cm to 5 m measured by three HC-SR04 ultrasonic ranging modules facing front, front-left and front-right; (3) ambient temperature and humidity measured by a DHT11 module.
The mechanism is designed with reference to the real size and proportions of the white-fronted tall spider, and the components are made of nylon and aluminium alloy. Gait optimization is carried out using the MPU6050 gyroscope, the HC-SR04 ultrasonic ranging modules and the other sensors: when a large obstacle is detected, the robot determines an optimal path autonomously, and when it encounters a small obstacle, it continues straight ahead. The physical robot is shown in Figure 3.

1.2 Software design
The host-computer program is written in Visual Studio, and the communication protocol between the host computer and the lower computer is defined by the operator. The baud rate is 115,200. The lower computer sends data to the host computer twice per second, 25 bytes per frame, all transmitted as characters; when the host computer detects two consecutive '#' characters, it starts reading the data. The host computer sends action commands to the lower computer as single bytes, also transmitted as characters: "g" for going forward, "l" for turning left, "r" for turning right and "s" for stopping. The left-turn, right-turn and forward gaits are realized through the tripod (triangular) and pentapod gaits.
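The host-side handling of this protocol can be sketched as follows. This is a minimal illustration, not the authors' code: the helper names are hypothetical, and the exact frame layout after the "##" header is assumed.

```python
FRAME_LEN = 25          # bytes per uplink frame, sent as characters
COMMANDS = {"forward": "g", "left": "l", "right": "r", "stop": "s"}

def extract_frame(buffer):
    """Return the first 25-character frame following a '##' header,
    or None if no complete frame has arrived yet (layout assumed)."""
    start = buffer.find("##")
    if start == -1:
        return None
    payload = buffer[start + 2:start + 2 + FRAME_LEN]
    return payload if len(payload) == FRAME_LEN else None

def command_byte(action):
    """Map a high-level action to the single-character downlink command."""
    return COMMANDS[action]
```

In practice the buffer would be filled from the HC-05 serial link; here only the framing logic is shown.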
XBOX controller and keyboard control are then implemented, so that the motion mode can be switched on the host computer and the robot moves according to the mode set there.

2.1 Image processing
In this paper, motion capture with marker points is adopted, and the marker points are colored [8]. Image processing can therefore be carried out using the hue channel H of the HSV color space [9]. After the RGB image is converted to HSV, the target is captured by the H channel alone: the three-dimensional RGB value is reduced to the one-dimensional H value, so the original three-dimensional RGB color space collapses to a single hue. With R, G, B normalized to [0, 1], max = max(R, G, B) and min = min(R, G, B), the conversion from RGB color space to HSV color space is:

V = max
S = (max − min) / max (S = 0 when max = 0)
H = 60° × (G − B) / (max − min), if max = R (plus 360° if the result is negative)
H = 60° × (B − R) / (max − min) + 120°, if max = G
H = 60° × (R − G) / (max − min) + 240°, if max = B          (1)

This formula is used throughout the motion-capture process of Section 2.2. The ranges are V: 0-1, S: 0-1, H: 0-360°.
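Equation (1) can be checked with the standard-library `colorsys` module, rescaling its hue output to the 0-360° range used above. The segmentation by hue alone can then be sketched as a simple threshold (function names here are illustrative, not from the original system):

```python
import colorsys

def rgb_to_hsv_degrees(r, g, b):
    """Convert 8-bit RGB to (H in degrees, S in [0,1], V in [0,1]),
    matching the ranges given for equation (1)."""
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    return h * 360.0, s, v

def hue_mask(pixels, h_low, h_high):
    """Keep only pixels whose hue falls inside [h_low, h_high] degrees --
    the one-dimensional segmentation by H described above."""
    return [(r, g, b) for (r, g, b) in pixels
            if h_low <= rgb_to_hsv_degrees(r, g, b)[0] <= h_high]
```

For example, a pure red marker (255, 0, 0) maps to H = 0° and a pure green marker (0, 255, 0) to H = 120°, so a hue window around 120° isolates green markers regardless of brightness.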

2.2 Motion capture
Machine vision technology gives the robot the function of human eyes [10]. Images are transmitted through the image acquisition system to the controller [11] and interpreted by the core controller [12]. During acquisition and transmission, images are often disturbed to varying degrees by external noise [13], which blurs them; image noise therefore needs to be removed after acquisition so that the important details of the image are preserved [14]. Figure 4 shows the CAMShift process. The ROI (region of interest) is a specially processed area selected in the image. Figure 5 shows the overall visual-capture procedure of the experimental program. It has eight steps, each applied identically to every frame. Steps: (1) Open the camera and acquire image data.
(2) Denoise the acquired image: a Gaussian filter is applied first, followed by a median filter. The trajectory is then drawn from the changing center coordinates of the capture frame. After excluding cases where the capture frame fails to enclose the marker points, it must be ensured that fluctuation of the center coordinates does not distort the trajectory display; in the actual experiments the trajectory smoothly follows the center of the object.
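The core of the CAMShift tracker used here is the mean-shift step: the capture frame is repeatedly moved to the centroid of the hue back-projection it covers. A minimal pure-Python sketch, assuming `prob` is a 2-D back-projection map with values in [0, 1] (in practice OpenCV's implementation would be used, and CAMShift additionally adapts the window size):

```python
def mean_shift(prob, window, max_iter=20):
    """Mean-shift step at the heart of CAMShift: repeatedly move a
    fixed-size window to the centroid of the back-projection mass
    it covers, until the shift drops below one pixel."""
    x, y, w, h = window
    rows, cols = len(prob), len(prob[0])
    for _ in range(max_iter):
        total = sx = sy = 0.0
        for j in range(y, min(y + h, rows)):
            for i in range(x, min(x + w, cols)):
                p = prob[j][i]
                total += p
                sx += p * i        # weighted column sum
                sy += p * j        # weighted row sum
        if total == 0:             # window has lost the target
            break
        dx = round(sx / total - (x + w / 2))
        dy = round(sy / total - (y + h / 2))
        x = max(0, min(x + dx, cols - w))
        y = max(0, min(y + dy, rows - h))
        if dx == 0 and dy == 0:    # converged
            break
    return x, y, w, h
```

The center of the returned window is the per-frame coordinate from which the trajectory above is drawn.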

3 Motion performance analysis
Based on the visual capture, the motion performance of the robot is analyzed, that is, its moving speed, straight-line walking ability and steering performance. The overall performance of the robot is listed in Table 1. The small hexapod robot is powered by an 11.1 V, 3C, 3300 mAh aeromodelling battery, which meets the power requirement; the average running time of the small hexapod robot increases from about 30 minutes to 50 minutes. According to the experimental data, the difference between the average straight-line walking speed and the average steering walking speed is small, about 0.44 cm/s. Figure 6 shows five selected groups of straight-line trajectories of the legged robot. Over 50 sets of acquired data, the maximum plane-coordinate offset of the robot is about 1.8-2.0 cm. When the capture frame is stationary, the floating deviation of the center plane coordinate is 0.1-0.3 cm, so the corrected maximum coordinate offset is about 1.5-1.8 cm.
Since the maximum plane offset of the center point during straight-line walking is small, the straight-line walking ability is good. After several groups of steering tests, the average radii required to turn 90° to the left and to the right are 58.7 cm and 53.6 cm respectively, a difference of about 5.1 cm, so the left and right steering abilities of the robot differ little.
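The speed and straight-line offset figures above follow directly from the captured center coordinates. A sketch of the two computations, assuming samples of the form (time in s, x in cm, y in cm) — the helper names are illustrative, not from the original program:

```python
import math

def average_speed(track):
    """Average speed (cm/s) over a list of (t, x, y) samples of the
    capture-frame center: total path length over elapsed time."""
    dist = sum(math.hypot(x2 - x1, y2 - y1)
               for (_, x1, y1), (_, x2, y2) in zip(track, track[1:]))
    return dist / (track[-1][0] - track[0][0])

def max_offset(track, axis=2):
    """Maximum deviation of one plane coordinate from its starting
    value -- the straight-line offset measure used above (axis=2 is y)."""
    start = track[0][axis]
    return max(abs(p[axis] - start) for p in track)
```

Comparing `average_speed` over straight-line runs and steering runs gives the 0.44 cm/s difference reported above.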

4 Capture performance analysis
In practice, the capture frame jitters to some degree, and so do its center coordinates. After testing at 50 different positions, Table 2 lists 10 selected sets of image-coordinate data and Table 3 lists 10 selected sets of plane-coordinate data. The jitter range of the image coordinates is 0.5-1 cm in both X and Y; the jitter range of the plane coordinates is 0.1-0.3 cm in both X and Y. The measured image coordinates of the captured object therefore carry an error of 0.5-1 cm, and the plane coordinates an error of 0.1-0.4 cm. Robot vision is a non-contact measurement technology [15]. The speed of the captured object within the camera's field of view affects the visual capture, so 30-40 tests at different speeds were carried out to determine that the limiting capture speed is approximately 10 m/s: capture is unreliable between 6 and 10 m/s and most stable from 0 to 6 m/s. The visual capture therefore suits the motion capture of the legged robot and achieves the intended effect.
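The jitter ranges in Tables 2 and 3 are peak-to-peak spreads of the center coordinates of a stationary target. A minimal sketch of that computation (function name illustrative):

```python
def jitter_range(samples):
    """Peak-to-peak jitter of capture-frame center coordinates.
    `samples` is a list of (x, y) readings of a stationary target;
    returns (x_range, y_range) in the same units (cm here)."""
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    return max(xs) - min(xs), max(ys) - min(ys)
```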

5 Conclusion
In this paper, an embedded processor is used to control the steering and the forward and backward motion of a hexapod robot efficiently, and the CAMShift algorithm realizes motion capture and performance analysis of the legged robot. The host computer displays the motion trajectory, the target coordinates, the displacement and velocity, the processing time of the program, and so on. Applying machine-vision tracking and acquisition technology to hexapod robots has far-reaching value for the application of machine vision to intelligent robot control.