Mobile Robot Positioning by using Low-Cost Visual Tracking System

This paper presents an application of a visual tracking system to mobile robot positioning. The proposed method is verified on a constructed low-cost tracking system consisting of a 2-DOF pan-tilt unit, a web camera, and a distance sensor. The motion of the pan-tilt joints is realized and controlled by an LQR controller running on a microcontroller. Without the need for camera calibration, the robot trajectory is tracked by a Kalman filter integrating distance information and joint positions. The experimental results demonstrate the validity of the proposed positioning technique, and the obtained mobile robot trajectory is benchmarked against laser rangefinder positioning. The implemented system can successfully track a mobile robot driving at 14 cm/s.


Introduction
Various techniques can be applied for mobile robot positioning [1]. For relative positioning, odometry or inertial sensors detect and estimate robot motion based on a kinematic or dynamic robot model. One drawback is that when the robot motion is unsmooth, especially on rough terrain, the obtained trajectory often contains uncertainty. Vision-based positioning is a possible solution to this problem.
Visual servo control has been developed over the last two decades [2]. The eye-in-hand configuration on a robotic system running on a high-speed controller [3] was implemented as image-based visual servoing (IBVS) [4][5]. As for moving object tracking, target tracking was addressed for robot manipulators in [6][7]. Later, solutions without the need for camera calibration were proposed; different methods were presented as camera-parameter-independent or uncalibrated-camera approaches [8][9]. A further improvement of motion control by estimating the depth of features in the image was proposed in [10]. For pan-tilt mechanisms, target tracking was also addressed, such as controller design for an AC-servomotor visual tracking system [11] and real-time position estimation of a constant-speed moving object [12].
In this paper, IBVS is applied to mobile robot tracking. The idea is to reduce the complexity of acquiring image data and depth information. The camera is used without calibration. Also, instead of estimating the depth of features in the image as in [10], the distance between the camera and the mobile robot is measured directly by a laser sensor mounted above the camera. As the robot usually drives at constant speed, our problem is similar to [12], except that the pan-tilt unit is constructed from low-cost hardware and the controller is implemented on an Arduino. This paper is organized as follows: in Section 2, the hardware and software of the target tracking system are introduced, and the motion control and the estimation of the robot trajectory are explained. Section 3 presents experimental results. Finally, conclusions are given.

Target Tracking System
In this work, a camera system is built and the state space model is identified. To validate the system, a standard LQR controller is selected for motion control.

Hardware components
The hardware components are shown in Fig. 1.

Software
The image processing is implemented using OpenCV and Code::Blocks. HSV color-based recognition performs background subtraction, producing a binary (black-and-white) image. The position of the object is then taken as the centroid of the foreground region.
The position of the object in the image frame is sent to the Arduino, and sensor data are obtained approximately every 80 ms. In parallel, the LQR controller runs on the Arduino.

Kinematic model
According to the Denavit-Hartenberg notation [13], the parameters of the link transformation matrices, following the frame assignment in Fig. 1, are listed in Table 2.
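The standard DH link transform used to build these matrices can be sketched as follows; the joint angles and link parameters below are placeholders, not the values of Table 2:

```python
# Classic Denavit-Hartenberg homogeneous transform from frame i-1 to frame i.
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Build the 4x4 link transform for DH parameters (theta, d, a, alpha)."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

# Chaining two transforms for a 2-DOF pan-tilt chain (illustrative values):
T01 = dh_transform(np.deg2rad(30), 0.10, 0.00, np.pi / 2)  # pan joint
T12 = dh_transform(np.deg2rad(10), 0.00, 0.05, 0.0)        # tilt joint
T02 = T01 @ T12
print(T02[:3, 3])   # origin of the tilt frame expressed in the base frame
```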

State space model
The dynamic equations of an n-DOF manipulator [13] are derived from the Lagrangian equation. The Lagrange-Euler dynamics, written in vector-matrix notation, are

τ(t) = M(q(t)) q̈(t) + V(q(t), q̇(t)) + G(q(t))     (6)

where q(t), q̇(t), and q̈(t) are the n × 1 vectors of generalized joint displacements, velocities, and accelerations, respectively; τ(t) is the n × 1 vector of generalized torques applied at the joints; M(q) is the n × n symmetric inertia matrix with elements M_ij; V(q, q̇) is the n × 1 nonlinear Coriolis and centrifugal force vector; and G(q) is the n × 1 gravity loading force vector. Eq. (6) is a nonlinear state equation because the term V(q, q̇) depends on both position and velocity.
The state space model of the pan-tilt unit is ẋ(t) = A x(t) + B u(t), where the state vector x(t) contains the joint positions and velocities and u(t) is the control input.

LQR controller
The feedback control law of the LQR controller is u(t) = -K x(t), which minimizes the performance index [14]

J = ∫₀^∞ (xᵀ Q x + uᵀ R u) dt.

When (A - BK) is a stable matrix, there exists a positive-definite matrix P that is a solution of the algebraic Riccati equation

Aᵀ P + P A - P B R⁻¹ Bᵀ P + Q = 0.

The minimization of J with respect to K gives K = R⁻¹ Bᵀ P, so the feedback gain K can be obtained by choosing the weighting matrices Q and R.

The difference between the object and the center of the image frame is the pixel error in the image. This error is used as a set point for the LQR controller, as shown in Fig. 2; when the tracked object stays at the center of the image, the set point is zero. Based on the proposed tracking method [15], the state variables are estimated from the motion of the joints and the distance to the ball measured by the laser mounted on the pan-tilt unit above the web camera. The laser frame rotates in the pan and tilt directions, as shown in Fig. 3. The reference frame is above ground level, and a ball is mounted on the mobile robot. The laser frame is above the reference frame at height h_l. The distance d is measured along y_L to the ball of radius r. Note that the ball is used to represent the robot position and to simplify the image processing.
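A minimal sketch of this gain computation, using an illustrative double-integrator joint model rather than the identified pan-tilt model (requires SciPy):

```python
# LQR design: solve the continuous algebraic Riccati equation for P,
# then compute the feedback gain K = R^-1 B^T P.
import numpy as np
from scipy.linalg import solve_continuous_are

A = np.array([[0.0, 1.0], [0.0, 0.0]])   # state: joint position, velocity
B = np.array([[0.0], [1.0]])
Q = np.eye(2)                            # state weighting (design choice)
R = np.array([[1.0]])                    # input weighting (design choice)

P = solve_continuous_are(A, B, Q, R)     # positive-definite Riccati solution
K = np.linalg.inv(R) @ B.T @ P           # feedback gain for u = -K x

print(K)                                 # ~ [[1.0, 1.732]] for this model
print(np.linalg.eigvals(A - B @ K))      # closed-loop poles, left half plane
```

On the Arduino the gain K is precomputed offline and only the multiplication u = -K x runs in the control loop.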

Trajectory estimation
The vector to the ball in the reference frame, P₁, can be calculated by using a composite transformation from frame 2 to frame 1. The vector P₂ points to the ball center in frame 2, and the transformation matrix between the base and the laser frame maps it into the base frame. According to Eq. (12)-(15), the coordinates of the robot are then obtained.

Experimental results
The Hokuyo URG-04LX-UG01 laser rangefinder, with ±30 mm accuracy, is placed at the same height as the ball center, and the robot position it tracks is referred to as the actual trajectory.
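The geometry of this composite transformation can be sketched as follows, assuming the pan joint rotates about the base z-axis and the tilt joint about the rotated x-axis, with the laser measuring along its y-axis; the paper's exact frame assignments (Fig. 3, Eq. (12)-(15)) may differ:

```python
# Ball-center position in the reference frame from joint angles and the
# laser distance d (measured to the ball surface along the laser y-axis).
import numpy as np

def robot_position(theta_pan, theta_tilt, d, r=0.09, h_l=0.0):
    """Map (pan, tilt, distance) to the ball-center coordinates."""
    cp, sp = np.cos(theta_pan), np.sin(theta_pan)
    ct, st = np.cos(theta_tilt), np.sin(theta_tilt)
    Rz = np.array([[cp, -sp, 0.0], [sp, cp, 0.0], [0.0, 0.0, 1.0]])  # pan
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, ct, -st], [0.0, st, ct]])  # tilt
    p_laser = np.array([0.0, d + r, 0.0])   # ball center in the laser frame
    return Rz @ Rx @ p_laser + np.array([0.0, 0.0, h_l])

# With zero joint angles the ball lies straight ahead along the y-axis:
print(robot_position(0.0, 0.0, 2.0, r=0.09, h_l=1.2))
```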
From the beginning, the robot drives straight, turns right at the corners, and returns to the starting point. Fig. 5 illustrates the robot trajectory obtained from the measured data, the estimated trajectory, and the actual trajectory. The robot starts at (0,0) and returns to this point. Due to the communication delay between the laptop and the Arduino, the noisy measurements, and the properties of the LQR controller, the estimated trajectory, although close to the actual trajectory, has some deviations. However, considering the low-cost components, the result is acceptable. Fig. 6 shows the noisy measured values obtained from the laser distance sensor and their filtered values. The range of measured distance in the experiment is from 2 to 4.5 meters. Fig. 7 shows the positions of the pan and tilt joints. Starting from zero degrees, the peak pan joint position is almost 37 degrees at the third corner, before turning back to the starting point. The maximum tilt joint position is 14 degrees when the robot drives far along the x-axis.
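The distance filtering shown in Fig. 6 can be sketched with a scalar random-walk Kalman filter; the noise variances q and r below are illustrative assumptions, not the values used in the experiment:

```python
# Scalar Kalman filter for the laser distance readings: the state is the
# distance itself, modeled as a random walk with process variance q and
# measurement variance r.
import numpy as np

def kalman_1d(measurements, q=1e-5, r=9e-4, x0=None):
    """Filter a sequence of noisy scalar measurements."""
    x = measurements[0] if x0 is None else x0
    p = 1.0                       # initial estimate variance
    out = []
    for z in measurements:
        p += q                    # predict step (random-walk model)
        k = p / (p + r)           # Kalman gain
        x += k * (z - x)          # update with the innovation
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)

# Demo on a constant 3 m distance with ~30 mm measurement noise:
rng = np.random.default_rng(0)
noisy = 3.0 + rng.normal(0.0, 0.03, 200)
filtered = kalman_1d(noisy)
print(filtered[-1])               # settles near 3.0
```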

Conclusions
A low-cost target tracking system for mobile robot positioning was proposed, constructed, and successfully implemented. The experimental results demonstrate the tracked trajectory of a mobile robot driving on the ground at a constant speed of 14 cm/s. The system keeps the tracked object at the center of the image frame at all times. This method can be applied to tracking ground as well as flying mobile robots. The maximum tracking range is adjustable through the camera height configuration. Future work is to improve the pan-tilt motion control performance and the object recognition for the mobile robot.
By model linearization, the transition matrix A and the input matrix B of the state space model are obtained.

Figure 2. The diagram of the implemented feedback control.

Figure 3. Relationship among the Base frame, the Laser frame, and the robot position.

Figure 4. Experiment setup.
The mobile robot positioning is tested by setting up the experiment as shown in Fig. 4. An 18 cm diameter ball is mounted on a line-tracking mobile robot driving at a constant speed of 14 cm/s. The test field is made of white paper with black tape forming a rectangle of 210 cm × 190 cm with rounded corners. The web camera is at a height of 120 cm, pointing downward to the ball, and the distance sensor is 3 cm above the camera.

Figure 6. Measured and filtered distance d.

Fig. 8 shows the horizontal pixel error e_h and Fig. 9 shows the vertical pixel error e_v between the object and the center of the image frame. In the image, the tracked object stays at the center in both directions, with standard deviations of 4.16 and 4.61 cm in the horizontal (pan) and vertical (tilt) directions, respectively.

An Intel Core i7-4710Q laptop (up to 3.50 GHz, 4 GB RAM) running Windows 8.1 64-bit is used for image processing. The system is mounted on a general camera stand for convenience of testing configuration. The cost of parts is listed in Table 1; the total cost is approximately 349 US dollars.

Figure 1. The constructed low-cost camera system for mobile robot positioning.

Table 1. Cost of the constructed camera system.