
Proceedings of the International Conference on Robotics, Vision, Information and Signal Processing ROVISP2007

Odometry Error Model for a Synchronous Drive Robot

Munir Zaman, John Illingworth
Faculty of Engineering and Physical Sciences, University of Surrey, Guildford, GU2 7XH, U.K.
Tel: +44 (0)1483 300800, Fax: +44 (0)1483 686031, E-mail: [m.zaman, j.illingworth]@surrey.ac.uk

Abstract

Wheel odometry is a common method for mobile robot relative localisation. However, the method is known to suffer from systematic errors. In this paper a comprehensive systematic odometry error model for a synchronous drive robot is proposed. The model addresses the systematic error for both rotational and translational motions. Results on real data show the potential for a significant reduction in the odometry error for translational motions.

Keywords: Odometry, Error Model, Synchronous Drive

Introduction

Wheel odometry, also known as dead reckoning, is a common method for relative localisation in a mobile robot. However, odometry is known to suffer from systematic errors. Initial research on correcting systematic odometry errors was based on modelling the kinematics of differential drive (DD) robots (e.g., [1][2]). The kinematics of a synchronous drive (SD) robot differ fundamentally from those of a DD robot, so odometry error models based on DD kinematics cannot be directly ported to SD robots. Although both types of robot provide two odometry data streams from two control motors, an SD robot has one control motor exclusively for effecting translation by rotating the wheels along the ground, while a separate control motor rotates the wheels around a vertical axis and synchronously rotates the upper carousel section, thereby changing the direction of forward motion with a corresponding change in heading. SD robots therefore have two control motors effecting rotation and translation independently of each other. In comparison, the two control motors of a DD robot effect only translation of the wheels, i.e. both odometric data streams relate to the travel of the wheels along the ground, and changes in heading are effected by differing the amount of travel between the wheels.

In 2001, Martinelli proposed an odometry error model for a synchronous drive robot [3], with further theoretical developments presented in [4]. However, the model was not validated using real data. More importantly, this and other works (e.g., [6]) failed to observe that the curvature of the robot trajectory during translational motion is a sinusoidal function of the heading. This omission was recognised by Doh et al. [5], who derived a kinematic model to explain it; that model was validated with a limited set of real data. In this paper a systematic error model, based on comprehensive empirical observations of the robot during translational and rotational motions, is proposed. This model was arrived at independently [7], and validates Doh's model for translational motion. Results from real data show a significant reduction in the systematic error.

The Synchronous Drive Robot

Figure 1 - A Synchronous Drive Robot

Figure 1 shows the synchronous drive robot used. The robot was built at the University of Surrey on a commercial synchronous drive base. Figure 2 is a schematic view of the kinematics, showing how rotation is effected in a synchronous drive robot in comparison to a differential drive robot. In a synchronous drive robot the orientation of the base and the positions of the wheels do not change; instead the three wheels rotate synchronously with the carousel, around a vertical axis. This can be compared to the kinematics of a differential drive robot, where rotation is effected by translating the wheels by differing amounts, or, for spot rotation, by translating the wheels an equal amount in opposite directions. Rotations in a differential drive robot therefore require a translation of the wheels along the ground. Translational motion for both SD and DD robots is effected by rotating the wheels along the ground.
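The contrast between the two drive types can be sketched as simple pose-update rules. This is an illustrative sketch only: the wheelbase, encoder increments and function names are invented, not taken from the paper.

```python
import math

def dd_pose_update(x, y, theta, ds_left, ds_right, wheelbase):
    """Differential drive: heading change comes from the *difference*
    in wheel travel; both encoders measure travel along the ground."""
    ds = 0.5 * (ds_left + ds_right)            # mean forward travel
    dtheta = (ds_right - ds_left) / wheelbase  # heading from travel difference
    theta_mid = theta + 0.5 * dtheta           # midpoint heading approximation
    return x + ds * math.cos(theta_mid), y + ds * math.sin(theta_mid), theta + dtheta

def sd_pose_update(x, y, theta, ds, dtheta_steer):
    """Synchronous drive: one motor drives travel ds, a separate motor
    rotates all wheels (and the carousel) by dtheta_steer, so rotation
    needs no wheel travel along the ground."""
    theta_new = theta + dtheta_steer           # spot rotation, no translation
    return x + ds * math.cos(theta_new), y + ds * math.sin(theta_new), theta_new

# Spot rotation of 90 degrees:
#   DD must translate the wheels in opposite directions; SD just steers.
x, y, th = dd_pose_update(0.0, 0.0, 0.0, -0.1571, 0.1571, 0.2)
print(round(th, 3))   # ~1.571 rad from 0.157 m of opposite wheel travel
x2, y2, th2 = sd_pose_update(0.0, 0.0, 0.0, 0.0, math.pi / 2)
print(round(th2, 3))  # the same rotation with zero wheel travel
```

The point of the sketch is the one the paper makes: the SD update changes heading with zero encoder travel, whereas the DD update cannot.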



(a) Synchronous Drive

(b) Differential Drive

Figure 2 - Effecting Spot Rotation in Synchronous and Differential Drive Robots

The pose of the robot is expressed as (x, y) for position and h for heading. The heading is represented by the sum of two variables: (i) h, the orientation of the carousel with respect to the base, and (ii) b, the orientation of the base with respect to the World. Ordinarily b is not expected to change unless there is a systematic error; however, this value does change during translational motion, affecting the heading. As the carousel rotates relative to the base, a datum is required at which the relative orientation of the carousel to the base defines h = 0. A pointer affixed to the carousel, pointing to a fixed marked position on the base, was used.

The Odometry Error Model

The odometry error model equations consist of two parts, one for rotation and the other for translation odometry (see Table 1).

Table 1 - Error Model Equations

ROTATION
  f_θ(θ_odo; S_θ) = S_θ θ_odo
  f_x(h; A_x, φ_x, K_x) = A_x sin(h + φ_x) + K_x
  f_y(h; A_y, φ_y, K_y) = A_y sin(h + φ_y) + K_y

TRANSLATION
  f_d(d_odo; S_d) = S_d d_odo
  f_b(h; A_b, φ_b, K_b) = A_b sin(h + φ_b) + K_b
  f_c(h; A_c, φ_c, K_c) = A_c sin(h + φ_c) + K_c

The two inputs to the model are rotation and translation odometry, denoted θ_odo and d_odo respectively. In addition, there are three error components for each of the two parts:

1. Conversion Error: the scaling error in the software transforming translation odometry to a translation in units of metres, or rotation odometry to a rotation of the carousel in units of degrees;
2. Position Error: the difference between the change in position estimated from odometry and the true displacement; and
3. Heading Error: the difference between the heading estimated from rotation odometry and the true change in heading.

Rotation Error Model

The model considers a scaling error (S_θ) to correct the odometry estimate of the rotation of the carousel to the true change. This parameter was estimated by comparing rotation odometry with the ground truth of the rotation of the carousel (see Figure 3(a)). Observations of the robot during spot-rotational motion show that the displacement describes a near circle. The displacement is modelled by the sinusoidal functions f_x and f_y. Their parameters are estimated by fitting ground-truth (x, y) displacements to f_x and f_y (see Figure 3(b)), with reference to the pose at h = 0. There is no observable change in the orientation of the base with respect to the World during rotational motion, so this aspect need not be modelled; the rotation itself is modelled by f_θ.

(a) Scaling Error (for f_θ)

(b) Position Error (for f_x, f_y)

Figure 3 - Estimating Rotational Error Model Parameters

Translation Error Model

Modelling the translational systematic error is more complicated. The robot path describes an arc whose curvature varies with h. There is also a scaling error between the true distance travelled and the translation odometry, modelled by f_d; the true distance traversed is the distance travelled along the arc. The theory behind the form of the functions modelling position and heading error for translational motion is as follows. A synchronous drive robot will tend to curve during translational motion, with the degree of curvature depending upon the orientation of the wheels (i.e., h). As the orientation of the wheels changes during rotational motion, the degree of curvature during any subsequent translational motion will be affected. The groundwork laid down in [5], which describes a force model based on the kinematics, is expanded and explained in further detail here, supported by real data. This is used to support the error model equations derived from empirical observations of the robot during translational motion.
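The sinusoidal model functions in Table 1 have the form A sin(h + φ) + K, which equals a sin(h) + b cos(h) + K with a = A cos φ and b = A sin φ, so the curve fitting used to estimate their parameters (Figures 3 and 5) can be posed as linear least squares. A minimal sketch with synthetic data; the parameter values and sample spacing are invented, not the paper's:

```python
import math

def solve3(G, r):
    """Gaussian elimination with partial pivoting for a 3x3 system G x = r."""
    M = [row[:] + [ri] for row, ri in zip(G, r)]
    for i in range(3):
        p = max(range(i, 3), key=lambda k: abs(M[k][i]))
        M[i], M[p] = M[p], M[i]
        for k in range(i + 1, 3):
            f = M[k][i] / M[i][i]
            M[k] = [mk - f * mi for mk, mi in zip(M[k], M[i])]
    x = [0.0] * 3
    for i in reversed(range(3)):
        x[i] = (M[i][3] - sum(M[i][j] * x[j] for j in range(i + 1, 3))) / M[i][i]
    return x

def fit_sinusoid(h_samples, y_samples):
    """Least-squares fit of y = A*sin(h + phi) + K by solving the 3x3
    normal equations for (a, b, K), then recovering A and phi."""
    n = len(h_samples)
    s = [math.sin(h) for h in h_samples]
    c = [math.cos(h) for h in h_samples]
    G = [[sum(si * si for si in s), sum(si * ci for si, ci in zip(s, c)), sum(s)],
         [sum(si * ci for si, ci in zip(s, c)), sum(ci * ci for ci in c), sum(c)],
         [sum(s), sum(c), float(n)]]
    r = [sum(yi * si for yi, si in zip(y_samples, s)),
         sum(yi * ci for yi, ci in zip(y_samples, c)),
         sum(y_samples)]
    a, b, K = solve3(G, r)
    # Same convention as the paper's superposition formula: tan(phi) = B/A.
    return math.hypot(a, b), math.atan2(b, a), K

# Synthetic ground truth with A = 0.02 m, phi = 0.6 rad, K = 0.005 m (invented):
hs = [i * 2 * math.pi / 36 for i in range(36)]
ys = [0.02 * math.sin(h + 0.6) + 0.005 for h in hs]
A, phi, K = fit_sinusoid(hs, ys)
print(round(A, 3), round(phi, 2), round(K, 3))  # → 0.02 0.6 0.005
```

Because the model is linear in (a, b, K), the fit needs no iterative optimiser and has no initial-guess sensitivity, which suits the small data sets implied by Figures 3 and 5.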

The following assumptions are made, without loss of generality on the form of the error model equations:

1. The contact points of the wheels on the ground form an equilateral triangle.
2. The centre of the triangle is both the origin and the centre of mass.

The forces causing the robot to curve are considered to be:

1. a centripetal drag force (F_c), and
2. a twist of the base from a torque force (F_t).

Centripetal Force (F_c). The centripetal force is simply the sum of the components of force due to wheel misalignment, denoted α_n where n = 1…3 (see Figure 4). The centripetal force is therefore

  F_c = 3F sin(α_n)

where F is the traction force (assumed equal for all wheels).

Figure 4 - Drive Wheel Torque Force

Torque Force (F_t). Now the torque around the centre of mass is considered. The twist of the base is caused by the sum of the torques from each wheel around the centre of mass. Applying the lever law, the torque, denoted F_t, is

  F_t = r_1 F_1 + r_2 F_2 + r_3 F_3

where r_1 = D sin(h + α_1), r_2 = D sin(2π/3 + h + α_2) and r_3 = D sin(h − 2π/3 + α_3). Now, assuming that the traction forces (F_i) are equal, applying the addition formula to expand the terms, and then converting cosine terms to sines, F_t can be expressed as the sum of five sine terms:

  F_t = FD{ sin(h + α_1)
          − (1/2) sin(h + α_2) + (√3/2) sin(h + α_2 + π/2)
          − (1/2) sin(h + α_3) − (√3/2) sin(h + α_3 + π/2) }

Now applying the superposition formula to each of the sine terms (i.e., C sin(θ + φ) = A sin θ + B cos θ, with C = √(A² + B²) and tan φ = B/A) and rearranging:

  F_t = FD{ sin h · Σ_{i=1…5} A_i + cos h · Σ_{i=1…5} B_i }

where A_i and B_i correspond to the A and B terms in the superposition formula. The superposition formula can be applied once more to convert the sum of a sine and a cosine into a single sine function, thus:

  F_t = FDC sin(h + φ)

where C = √((Σ A_i)² + (Σ B_i)²) and tan φ = (Σ B_i)/(Σ A_i).

(a) Base Twist (for f_b)

(b) Path Curvature (for f_c)

Figure 5 - Estimating Translational Error Model Parameters

The resultant sum of the forces F_c and F_t contributes to the systematic error in heading and to the curved trajectory. Note that F_c is not affected by the wheel orientation. From the torque equation, the base twist contributing to the heading error is modelled as a sinusoidal function (i.e., f_b); this is supported by real data, with the parameters of f_b estimated by curve fitting (see Figure 5(a)). The path curvature corresponds to the sum of the forces F_t + F_c, noting that F_c is a constant, so the model function f_c is also a sinusoidal function; its parameters are likewise estimated from real data by curve fitting (see Figure 5(b)).
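The key step in the derivation, that a sum of equal-frequency sinusoids always collapses to a single sinusoid C sin(h + φ), can be checked numerically. The misalignment angles, lever-arm length D and traction force F below are invented for illustration:

```python
import math

def collapse(terms):
    """terms: list of (C_i, phi_i). Returns (C, phi) such that
    sum C_i*sin(h + phi_i) = C*sin(h + phi), via A = C*cos(phi), B = C*sin(phi)."""
    A = sum(c * math.cos(p) for c, p in terms)
    B = sum(c * math.sin(p) for c, p in terms)
    return math.hypot(A, B), math.atan2(B, A)

# Torque lever arms r_i = D*sin(h + offset_i + alpha_i) with equal traction F,
# i.e. F_t = F*D * sum_i sin(h + offset_i + alpha_i) as in the derivation.
D, F = 0.15, 1.0
alphas = [0.02, -0.01, 0.015]                       # invented misalignments
offsets = [0.0, 2 * math.pi / 3, -2 * math.pi / 3]  # wheel placement angles
terms = [(F * D, o + a) for o, a in zip(offsets, alphas)]
C, phi = collapse(terms)

# Compare the direct sum against the single-sine form over a sweep of h:
for h in [0.0, 0.7, 2.1, 4.5]:
    direct = sum(c * math.sin(h + p) for c, p in terms)
    single = C * math.sin(h + phi)
    assert abs(direct - single) < 1e-12
print("sum of three sines == one sine, amplitude", round(C, 5))
```

Note that with exactly aligned wheels (all α_i = 0) the three terms cancel and C = 0; the residual amplitude here comes entirely from the misalignments, which is why F_t, and hence f_b and f_c, is a sinusoid of h.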
A Localisation Algorithm

A localisation algorithm based on the proposed model equations is described here.

1. Initialise. Define the robot pose P(k) and the robot-to-world transform T_r2w(k) at time k as

  P(k) = (x, y, h)ᵀ,

  T_r2w(k) = | cos b  −sin b  0  0 |
             | sin b   cos b  0  0 |
             |   0       0    1  1 |

where (x, y) is the position, h the heading, and b the heading offset (initialised to zero).

2. Input Data. Let θ̃ = f_θ(θ_odo) and d̃ = f_d(d_odo).

3. Compute dr. The state change vector dr = (δx, δy, δh, δb)ᵀ is derived separately for rotation and translation. The functions f(·) refer to the model functions.

(a) Compute dr from Rotation. Let

  dr[rotation] = ( f_x(h + θ̃) − f_x(h),  f_y(h + θ̃) − f_y(h),  θ̃,  0 )ᵀ.

(b) Compute dr from Translation. Let dr[translation] = T_tr d_t, where

  T_tr = | cos h  −sin h  0  0 |
         | sin h   cos h  0  0 |
         |   0       0    1  0 |
         |   0       0    0  1 |

  d_t = ( ρ sin(β/2),  ρ cos(β/2),  0,  d̃ f_b(h) )ᵀ

with β = d̃ f_c(h) and ρ = d̃ sinc(β/2).

4. Estimate Robot Pose and State. Update the robot pose P and the heading offset b:

  P(k+1) = P(k) + T_r2w(k) dr,
  b(k+1) = b(k) + δb.

Results

A total of six runs, ranging from 1.7 m to 3.5 m at various wheel orientations h, were conducted (see Table 2). Runs 1, 2, 3 and 4, where the odometry error is greatest, are at wheel orientations where the path curvature is greatest. The results show that the proposed model provides a significant reduction in the net position and heading error. For runs 5 and 6, where the curvature is lower, the proposed model provides at best a marginal improvement.

Table 2 - Residual Localisation Errors

  Run   h (deg)   Distance (m)   Odometry Error       Model Error
                                  (m)     (deg)       (m)     (deg)
   1      340        1.707       0.050     4.0       0.019     0.1
   2      340        3.522       0.270     9.7       0.045     1.7
   3      170        1.804       0.096     5.7       0.039     1.1
   4      170        3.519       0.309     9.0       0.096     0.0
   5       80        2.994       0.103     3.3       0.067     3.3
   6      260        3.490       0.042     3.0       0.072     2.2
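Step 3(b) and step 4 can be sketched as a single translation update. The model parameters below are invented placeholders (in practice they come from the curve fits), and the frame composition is a simplified reading of the algorithm, not a verified reimplementation:

```python
import math

def sinc(x):
    """Unnormalised sinc, sin(x)/x, with the limit 1 at zero."""
    return 1.0 if abs(x) < 1e-12 else math.sin(x) / x

# Placeholder model parameters (invented, NOT the paper's fitted values):
S_d = 1.01                                           # translation scale error
f_b = lambda h: 0.010 * math.sin(h + 0.3) + 0.001    # base twist per metre
f_c = lambda h: 0.050 * math.sin(h + 0.3) + 0.002    # path curvature per metre

def translation_update(x, y, h, b, d_odo):
    """One translation step: scale the odometry, model the curved path as a
    chord of length rho = d*sinc(beta/2) offset by beta/2 from the heading,
    and accumulate the base twist into the heading offset b."""
    d = S_d * d_odo              # conversion-error correction (f_d)
    beta = d * f_c(h)            # total turn accumulated along the arc
    rho = d * sinc(beta / 2.0)   # chord length of the arc
    db = d * f_b(h)              # base twist (heading offset change)
    heading = h + b              # true heading = carousel + base offset
    x += rho * math.cos(heading + beta / 2.0)
    y += rho * math.sin(heading + beta / 2.0)
    return x, y, h, b + db

x, y, h, b = translation_update(0.0, 0.0, math.pi / 4, 0.0, 2.0)
print(round(x, 3), round(y, 3), round(b, 4))
```

For short, gently curved runs the chord length is close to the arc length (sinc(β/2) ≈ 1), so the dominant corrections are the scale factor S_d and the heading terms from f_b and f_c.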



(a) Net Position Error

(b) Heading Error

Figure 6- Comparison of Pose Error from Odometry and Ground Truth

Figure 6 shows bar charts of the net position and heading error for raw odometry and for odometry after application of the proposed error model. Error bars are at three standard deviations (3-sigma).

Conclusions

A complete odometry error model addressing both rotational and translational systematic errors was presented. Results from real experiments show a significant reduction in the heading and position error, especially where the systematic error is greatest.

The path curvature of a synchronous drive robot during translational motion is unexpected. Kinematic modelling has shown that this behaviour is a systematic error, rather than the random, non-systematic error it might at first sight be assumed to be.

Rotational Motions. The sinusoidal functions for spot rotation can be explained by the vertical rotation axis of each wheel being slightly offset from the contact point of the wheel on the ground, causing a cam effect. However, this displacement is localised and does not accumulate, and it is relatively small in comparison to the errors from translational motion (especially at wheel orientations of greatest path curvature).

Translational Motions. The results show that the most dramatic reduction in heading error occurs where the systematic error is greatest, with little or no improvement when the trajectory of the robot is relatively straight. This can be explained by the sinusoidal nature of the functions f_b and f_c: the gradient of each function is greatest where the path curvature is least (i.e., around the region where the function crosses zero), so the model is sensitive to small errors in this region and is less able to correct for systematic odometry errors there. The results suggest that the overall net heading and position error can be minimised by conducting translational motions at wheel orientations where the curvature is greatest.

Acknowledgements

M. Zaman gratefully acknowledges the support of the UK Engineering and Physical Sciences Research Council (EPSRC).

References

[1] Borenstein J. and Feng L. 1996. Measurement and correction of systematic odometry errors in mobile robots, IEEE Transactions on Robotics and Automation, vol. 12, pp. 869-880.
[2] Bak M., Larsen T., Andersen N. A. and Ravn O. 1999. Auto-calibration of systematic odometry errors in mobile robots, Proceedings of the SPIE Conference on Mobile Robotics XIV, vol. 3838, pp. 252-263.
[3] Martinelli A. 2001. A possible strategy to evaluate the odometry of a mobile robot, Proceedings of the IEEE/RSJ International Conference on Intelligent Robots and Systems, pp. 1946-1951.
[4] Martinelli A. 2002. The odometry error of a mobile robot with a synchronous drive system, IEEE Transactions on Robotics and Automation, pp. 399-405.
[5] Doh N. L., Choset H. and Chung W. K. 2003. Accurate relative localization using odometry, Proceedings of the IEEE International Conference on Robotics and Automation, pp. 1606-1612.
[6] Kelly A. 2001. General solution for linearised systematic error propagation, Proceedings of the International Conference on Intelligent Robots and Systems (IROS 2001), pp. 1938-1945.
[7] Zaman M. 2001. Mobile robot localisation, MPhil to PhD Transfer Report, University of Surrey, UK.