Abstract—This paper describes an algorithm to estimate real-time vehicle velocity using image processing techniques and known camera calibration parameters. The presented algorithm involves several main steps. First, the moving object is extracted using a frame-differencing technique. Second, an object tracking method is applied and the speed is estimated based on the displacement of the object's centroid. Several assumptions are listed to simplify the transformation from the 3-D real-world scene to the 2-D images. The results obtained from the experiment have been compared to the estimated ground truth; the proposed algorithm achieves a velocity estimation accuracy of about 1.7 km/h.
Keywords—camera calibration; object tracking; velocity estimation; video image processing
I. INTRODUCTION
II. ASSUMPTIONS
Assumptions must be made, since the images only provide 2-D coordinates of what are actually 3-D coordinates in the real-world scene. In order to overcome this problem, some assumptions are made as follows:
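As background only (this formulation is not taken from the paper's own assumption list), the 2-D/3-D ambiguity described above is commonly resolved by restricting vehicle motion to the road plane, so that image points and ground-plane points are related by a single 3x3 homography H derived from the camera calibration:

s \begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
= H \begin{bmatrix} X \\ Y \\ 1 \end{bmatrix},
\qquad Z = 0,

where (x, y) are image coordinates, (X, Y) are coordinates on the road plane, and s is a projective scale factor. Under this planar-road simplification, pixel displacements can be mapped to metric displacements on the ground.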
Fig. 1 Video images from CCTV camera, from upper left: (a) captured event image in colour; (b) converted grey-scale image
The raw moments of the binary object region B are

u_{pq} = \sum_{x_b = 0}^{m-1} \sum_{y_b = 0}^{n-1} x_b^p \, y_b^q, \qquad (x_b, y_b) \in B,        (1)

and the object centroid follows as

x_{cb} = \frac{u_{10}}{u_{00}},        (2)

y_{cb} = \frac{u_{01}}{u_{00}}.        (3)
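The raw moments and the centroid defined above can be computed directly from a binary object mask. A minimal NumPy sketch follows; the function and variable names are illustrative, not from the paper:

```python
import numpy as np

def raw_moment(mask, p, q):
    """Raw image moment u_pq of binary mask B: sum of x^p * y^q over object pixels."""
    ys, xs = np.nonzero(mask)          # coordinates (x_b, y_b) of pixels in B
    return int(np.sum((xs ** p) * (ys ** q)))

def centroid(mask):
    """Object centroid (x_cb, y_cb) = (u10/u00, u01/u00)."""
    u00 = raw_moment(mask, 0, 0)
    if u00 == 0:                       # no object pixels detected
        return None
    return raw_moment(mask, 1, 0) / u00, raw_moment(mask, 0, 1) / u00

# Example: a 2x2 block of object pixels centred at (x, y) = (1.5, 2.5)
mask = np.zeros((5, 5), dtype=bool)
mask[2:4, 1:3] = True                  # rows (y) 2..3, columns (x) 1..2
print(centroid(mask))                  # -> (1.5, 2.5)
```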
Fig. 3 Image sequence from upper left: (a) object-shape extraction for centroid computation; (b) centroid marked in red within the rectangular ROI; (c) computed speed displayed in the image; (d) continuous tracking, with the green box showing the current frame relative to previous frames
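The speed computation illustrated in Fig. 3(c) follows from the centroid displacement between two frame indices. A hedged sketch, assuming a constant pixel-to-metre ground-plane scale obtained from the camera calibration (the scale value, frame rate, and names here are illustrative, not values from the paper):

```python
import math

def estimate_speed_kmh(c_start, c_end, frame_start, frame_end,
                       metres_per_pixel, fps):
    """Average speed from centroid displacement between two frames.

    c_start, c_end   : (x, y) centroids in pixels
    metres_per_pixel : ground-plane scale from calibration (assumed constant)
    fps              : video frame rate
    """
    dx = c_end[0] - c_start[0]
    dy = c_end[1] - c_start[1]
    pixels = math.hypot(dx, dy)                     # displacement in pixels
    seconds = (frame_end - frame_start) / fps       # elapsed time
    return pixels * metres_per_pixel / seconds * 3.6   # m/s -> km/h

# Illustrative values only: 70 frames at 25 fps, 0.05 m/pixel
print(round(estimate_speed_kmh((120, 80), (380, 90), 35, 105, 0.05, 25.0), 1))
# -> 16.7
```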
TABLE I
COMPARISON OF ESTIMATED GROUND TRUTH AND ESTIMATED RESULT

Frames                      Start frame (i)  End frame (i)  Total frames  Average speed (km/h)
Reference (Car 1), upper          35              105             70             16.9
Reference (Car 1)                105              175             70             22.5
Reference (Car 1), lower         175              245             70             21.8
Measured (Car 1)                 105              175             70             23.1
Reference (Car 2), upper        1925             2015             90             16.2
Reference (Car 2)               2015             2110             95             16.0
Reference (Car 2), lower        2110             2200             90             17.7
Measured (Car 2)                2015             2110             95             16.5
Reference (Car 3), upper        2670             2745             75             11.5
Reference (Car 3)               2745             2825             80             13.4
Reference (Car 3), lower        2825             2905             80             15.1
Measured (Car 3)                2745             2825             80             19.4
Reference (Car 4), upper        4455             4515             60             22.0
Reference (Car 4)               4515             4575             60             23.8
Reference (Car 4), lower        4575             4635             60             19.6
Measured (Car 4)                4515             4575             60             23.8
Reference (Car 5), upper        4725             4795             70             19.9
Reference (Car 5)               4795             4870             75             25.1
Reference (Car 5), lower        4870             4940             70             16.4
Measured (Car 5)                4795             4870             75             23.5
TABLE II
ERROR MEASUREMENTS OF FIVE TRACKED CARS

Tracked vehicle    Error (km/h)
Car 1                  0.6
Car 2                  0.5
Car 3                  6.0
Car 4                  0.0
Car 5                 -1.6
V. CONCLUSION
An algorithm to estimate the real-time velocity of a moving vehicle using known camera calibration parameters is discussed in this paper. The development of the proposed algorithm involves several major steps: detecting the moving object using a frame-differencing technique, tracking the detected moving object using an adapted point-based tracking algorithm, and computing the estimated velocity based on the displacement of the object's centroid. In order to simplify the algorithm, several assumptions have been made, as mentioned in Section II.
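The frame-differencing step summarized above reduces to thresholding the absolute difference of consecutive grey-scale frames. A minimal sketch; the threshold value is an assumption for illustration, not a value from the paper:

```python
import numpy as np

def frame_difference_mask(prev_grey, curr_grey, threshold=25):
    """Binary mask of moving pixels: |curr - prev| > threshold.

    Frames are 2-D uint8 grey-scale arrays; the threshold is illustrative.
    """
    diff = np.abs(curr_grey.astype(np.int16) - prev_grey.astype(np.int16))
    return diff > threshold

# Toy frames: a bright 2-pixel 'object' moves one column to the right
prev = np.zeros((4, 4), dtype=np.uint8)
curr = np.zeros((4, 4), dtype=np.uint8)
prev[1:3, 0] = 200
curr[1:3, 1] = 200
mask = frame_difference_mask(prev, curr)
print(int(mask.sum()))   # -> 4 changed pixels: 2 vacated + 2 newly occupied
```

In a full pipeline, the resulting mask would be cleaned up (e.g. by morphological filtering) before the centroid of Section III is computed on it.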
The experimental results demonstrate that the proposed algorithm is capable of estimating vehicle velocity from a single uncalibrated camera. The velocity estimation error has been evaluated by comparison to the estimated ground truth, and an accuracy of about 1.7 km/h for individual vehicle velocity has been achieved. However, the algorithm has not yet been validated against a reference system, such as GPS. The accuracy of the vehicle velocity estimation will be further assessed by comparison to such a reference system and to other related work, so that a more accurate uncertainty in the velocity measurement of any moving vehicle can be determined.
ACKNOWLEDGMENT
This research has been financially supported by the National Science Fund (Science Fund) of Malaysia under grant 01-0115-SF0095 and by the Ministry of Science, Technology and Innovation (MOSTI). The authors would like to express their highest gratitude to MOSTI.