Application of Image Processing Technology for Unloading Automation of Robot Combine Harvester

Hiroki KURITA, Michihisa IIDA, Masahiko SUGURI, Ryo UCHIDA, Huaping ZHU and Ryohei MASUDA

Manuscript received September 22, 2011. This work was supported by a Grant-in-Aid for Scientific Research (22380141). H. Kurita (corresponding author; phone: +81-75-753-6317; fax: +81-75-753-6167; e-mail: kurita@elam.kais.kyoto-u.ac.jp), M. Iida (iida@elam.kais.kyoto-u.ac.jp), M. Suguri (suguri@elam.kais.kyoto-u.ac.jp), R. Uchida (uchida@elam.kais.kyoto-u.ac.jp), H. Zhu (zhu.huaping@elam.kais.kyoto-u.ac.jp), and R. Masuda (masuda@elam.kais.kyoto-u.ac.jp) are with the Division of Environmental Science and Technology, Kyoto University, Kyoto, Japan.

978-1-4577-1524-2/11/$26.00 © 2011 IEEE
Abstract - Automation of agricultural work has been expected to solve several problems of Japan's agriculture and to help farmers, and it is now becoming a key to keeping national agricultural production sustainable. We are therefore trying to develop a robot combine harvester. Although a few prototypes can run and automatically harvest rice and wheat, the unloading work, in which grain is discharged through the harvester's auger into a grain container on a truck, has not yet been automated. In this paper, a method using machine vision is presented for positioning the harvester's spout at an appropriate point. Experimental results show that this method has sufficient accuracy.

I. INTRODUCTION
Research and development of agricultural machinery in Japan have contributed to the reduction of working hours and have established mechanized systems in agricultural production, especially in fields and rice paddies [1]. Their main purpose has been to make agricultural work shorter and lighter.
As for the present circumstances of Japan's agriculture, fewer and fewer workers are engaged in farming, and they are aging rapidly [2]. Moreover, opening up the country to trade is becoming more inevitable year by year: Japan's government is considering joining talks on the Trans-Pacific Partnership, a huge free-trade area around the Pacific Rim. It is therefore expected that Japan's agriculture will face a labor shortage and be exposed to harsher global competition. To keep Japan's agriculture sustainable under such difficult circumstances, the needed strategies are as follows: 1. accumulation of farmland, 2. development of automated agricultural systems, and 3. cultivation of new farmers.

To make agriculture more competitive, it is necessary to accumulate agricultural land and to cultivate the fields with automated machinery. This kind of agriculture will allow us to manage larger fields with a smaller workforce and thereby make agriculture more profitable and less exhausting, in other words, an attractive industry. Therefore the development of robotic agricultural machinery has recently become much more important in Japan than ever [3]-[5].
Additionally, it is meaningful to automate efficient working practices that rely on a farmer's skill and years of experience, because doing so makes entry into agriculture easier for potential and young farmers.
For these reasons we are now trying to develop a robot combine harvester, that is, an automated head-feeding combine harvester. A prototype robot head-feeding combine harvester is also being developed at the National Agricultural Research Center (Ibaraki, Japan) [6]; however, automation of the unloading process has not been studied.
When the grain tank of a combine harvester is full, the grain is unloaded into a container attached to a pickup truck and conveyed to a local rice processing facility. Generally two workers are engaged in this process: one operates the harvester and the other drives the truck. The operator controls the auger and unloads the grain, and then goes on harvesting, while the driver transports the grain to the rice processing facility, which is usually not close to the paddy. If this process is performed by a single worker, harvesting must be interrupted during transportation. A robot combine harvester could allow this work to be done by one worker alone without taking any more time than the work of two workers.
In addition, the operator's skill affects how fast the auger can be positioned at an appropriate point for unloading. Therefore unloading automation is an important function not only for the development of a robot combine harvester but also for reducing the agricultural workload and for achieving proper positioning independent of the operator's skill and experience.
The problem lies in how to find the container (or the truck) and how to position the auger at an appropriate point. We try to solve these challenges using image processing, because commercialized head-feeding harvesters are already equipped with a color camera and almost all mid-sized harvesters are controlled by microcomputers; automation by image processing therefore needs little hardware modification and hence costs less than other approaches.

In fields of computer science such as image-based sensing and virtual reality (VR), augmented reality (AR) has recently been attracting increasing attention [7]. VR is a technology that constructs a virtual world using computer graphics, while AR is an extension of VR that overlays virtual objects onto real images captured by a camera. Those who watch a monitor displaying such overlaid images observe a VR-supplemented world [8].
The greatest challenge in realizing AR applications is the precise registration of virtual images with the real image; at the same time, sufficiently fast image processing is essential for continuous and natural perception of the AR world [9]. ARToolKit was developed by Kato et al. to support the creation of AR applications and has sufficient accuracy and speed. It uses a square board as a marker, detects it, and calculates the relation between the camera and the marker for rendering virtual images [9]-[11]. This function can be utilized for fast and accurate detection of the grain container and proper positioning of the spout.
The objectives of this paper are, first, to present a method for automated positioning of the auger using image processing; second, to measure the accuracy of the positioning; and finally, to study the effectiveness and prospects of this method.

II. MATERIALS AND METHODS
A. Experimental Devices and Development Environment
The experimental devices are as follows:
(1) Head-feeding combine harvester, VY50 CLAM (Mitsubishi Agricultural Machinery Co., Ltd., Shimane, Japan)
(2) USB camera, UCAM-DLA200H (ELECOM Co., Ltd., Osaka, Japan)
(3) Total station, SET4100s (Sokkia Co., Ltd., Tokyo, Japan)
(4) Marker
Figures 1, 2, and 3 show the harvester and truck, the total station, and the marker, respectively.


Fig. 1. Combine harvester and truck with container.

Fig. 2. Total station.


Fig. 3. Marker (arrows indicate 400 mm × 400 mm).

The software used in this study was developed and run under the following environment:
(1) Visual C++ 2008 (Microsoft)
(2) OpenCV 2.1 (Intel)
(3) OpenGL 4.1 (Silicon Graphics)
(4) PC (AMD Turion 64 MT-32, 1.58 GHz, 1.43 GB RAM)

B. ARToolKit
ARToolKit is a program library for C/C++ developed by Kato et al. to create AR applications easily. It detects markers and gives the homogeneous transform matrices from each marker's coordinate system to the camera's coordinate system. Generally, AR applications utilize these matrices to overlay various kinds of three-dimensional virtual objects onto real images.
Our method also utilizes ARToolKit to detect the marker and to obtain the homogeneous transform matrix. This allows us to know the marker's position and orientation, and then to calculate the optimal attitude of the unloading auger so as to position the spout at an appropriate point.
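
As an illustration of how this detection step might look in code (a minimal sketch under our own assumptions, not the actual software of this study; the function structure, threshold value, and pattern handling are ours), the ARToolKit calls arDetectMarker and arGetTransMat can be combined as follows:

#include <AR/ar.h>

// Hedged sketch: find our registered marker in one camera frame and obtain
// the 3x4 homogeneous transform from marker coordinates to camera
// coordinates. patt_id would come from a prior arLoadPatt() call.
bool markerToCameraTransform(ARUint8 *frame, int patt_id,
                             double patt_trans[3][4])
{
    ARMarkerInfo *marker_info;
    int           marker_num;
    const int     thresh = 100;                 // binarization threshold
    double        patt_center[2] = {0.0, 0.0};  // marker origin at its center
    const double  patt_width = 400.0;           // marker size in mm (Fig. 3)

    if (arDetectMarker(frame, thresh, &marker_info, &marker_num) < 0)
        return false;                            // detection failed

    // Keep the candidate that matches our pattern with the best confidence.
    int best = -1;
    for (int i = 0; i < marker_num; i++) {
        if (marker_info[i].id == patt_id &&
            (best < 0 || marker_info[i].cf > marker_info[best].cf))
            best = i;
    }
    if (best < 0)
        return false;                            // marker not in this frame

    arGetTransMat(&marker_info[best], patt_center, patt_width, patt_trans);
    return true;
}

The returned 3x4 matrix corresponds to the transform from the marker frame to the camera frame used in Section II-C.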

C. Calculation of Attitude Angles
The unloading auger of the harvester used in this study is modeled as a two-degree-of-freedom manipulator (see Figures 4 and 5). Joint 1 rotates in the range −110° ≤ θ1 ≤ 90°; however, the unloading range in this study is limited to 0° ≤ θ1 ≤ 90° because actual unloading work is performed within this range. Joint 2 rotates in the range 0° ≤ θ2 ≤ 45°.

Fig. 4. Joint 1 and the range oI T
1
.

Fig. 5. Joint 2 and the range oI T
2
.
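
Before driving the auger, a control program would be expected to keep commanded angles inside these mechanical ranges. The following is a minimal sketch of such a guard (our own illustration, not the study's control code):

#include <algorithm>

// Joint limits from Section II-C, in degrees. Joint 1 is restricted to its
// unloading range [0, 90] even though the joint itself can reach -110.
const double kTheta1Min = 0.0, kTheta1Max = 90.0;
const double kTheta2Min = 0.0, kTheta2Max = 45.0;

// Clamp a commanded attitude to the usable unloading range.
inline void clampAttitude(double &theta1, double &theta2)
{
    theta1 = std::min(std::max(theta1, kTheta1Min), kTheta1Max);
    theta2 = std::min(std::max(theta2, kTheta2Min), kTheta2Max);
}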

Figure 6 indicates the link structure and coordinate systems. Link parameters are shown in Table I.

Fig. 6. Link structure of the auger.

Table I. Link parameters of the auger.

Link | Length [mm] | Twist angle [degree] | Distance [mm] | Attitude angle [degree]
1    | 0           | 0                    | 0             | θ1
2    | 0           | 90                   | l_a + l_d     | θ2
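
The paper does not state which kinematic convention these parameters follow. Assuming the standard Denavit-Hartenberg convention (a common choice, see e.g. [12]), a row (length a_i, twist α_i, distance d_i, attitude θ_i) of Table I would correspond to the homogeneous transform

$$ {}^{i-1}T_{i} = \begin{pmatrix} \cos\theta_i & -\sin\theta_i\cos\alpha_i & \sin\theta_i\sin\alpha_i & a_i\cos\theta_i \\ \sin\theta_i & \cos\theta_i\cos\alpha_i & -\cos\theta_i\sin\alpha_i & a_i\sin\theta_i \\ 0 & \sin\alpha_i & \cos\alpha_i & d_i \\ 0 & 0 & 0 & 1 \end{pmatrix}. $$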


In the following description, let Σn be the orthogonal coordinate system generated by the three axes Xn, Yn, and Zn. In addition, let us denote the homogeneous transform matrix from Σn to Σm by ${}^{m}T_{n}$. Then ${}^{0}T_{1}$ and ${}^{1}T_{2}$ are given by the link parameters in Table I as functions of θ1 and θ2 (equations (1) and (2)), and ${}^{2}T_{\mathrm{Camera}}$ is a constant matrix fixed by the camera mounting geometry (equation (3)).


${}^{\mathrm{Camera}}T_{\mathrm{Marker}}$ is obtained by a function of ARToolKit; therefore ${}^{0}T_{\mathrm{Marker}}$ can be derived as

$$ {}^{0}T_{\mathrm{Marker}} = {}^{0}T_{1}\,{}^{1}T_{2}\,{}^{2}T_{\mathrm{Camera}}\,{}^{\mathrm{Camera}}T_{\mathrm{Marker}}. \qquad (4) $$

If we abbreviate the vector of the target point on Σ_Marker as ${}^{\mathrm{Marker}}\boldsymbol{r}_{\mathrm{Target}}$, then ${}^{0}\boldsymbol{r}_{\mathrm{Target}}$ is described with ${}^{0}T_{\mathrm{Marker}}$ as

$$ {}^{0}\boldsymbol{r}_{\mathrm{Target}} = {}^{0}T_{\mathrm{Marker}}\,{}^{\mathrm{Marker}}\boldsymbol{r}_{\mathrm{Target}}, \qquad (5) $$

where ${}^{0}T_{1}$ and ${}^{1}T_{2}$ are given by the values of θ1 and θ2 corresponding to the present auger attitude, and ${}^{2}T_{\mathrm{Camera}}$ is determined by the camera's clamping angle α.

${}^{0}\boldsymbol{r}_{\mathrm{AugerTop}}$, the positional vector of the auger top on Σ0, is written with ${}^{2}\boldsymbol{r}_{\mathrm{AugerTop}}$, its expression on Σ2, as follows:

$$ {}^{0}\boldsymbol{r}_{\mathrm{AugerTop}} = {}^{0}T_{1}\,{}^{1}T_{2}\,{}^{2}\boldsymbol{r}_{\mathrm{AugerTop}}. \qquad (6) $$
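
To make the chain of transforms concrete, the following is a hedged C++ sketch of (4) and (5). It is our own illustration rather than the study's software; the matrix layout and names are assumptions, and the marker-to-camera transform would come from ARToolKit as in the Section II-B sketch:

#include <array>

using Mat4 = std::array<std::array<double, 4>, 4>;
using Vec4 = std::array<double, 4>;  // homogeneous point (x, y, z, 1)

// 4x4 homogeneous matrix product, used to chain the frame transforms in (4).
Mat4 mul(const Mat4 &A, const Mat4 &B)
{
    Mat4 C{};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                C[i][j] += A[i][k] * B[k][j];
    return C;
}

Vec4 apply(const Mat4 &A, const Vec4 &v)
{
    Vec4 w{};
    for (int i = 0; i < 4; ++i)
        for (int k = 0; k < 4; ++k)
            w[i] += A[i][k] * v[k];
    return w;
}

// Equations (4)-(5): express the target point, given on the marker frame,
// in the auger base frame Sigma_0. T01 and T12 follow from the present
// theta1 and theta2, T2Cam from the clamping angle alpha, and TCamMarker
// from ARToolKit (its 3x4 output padded with a bottom row 0 0 0 1).
Vec4 targetInBaseFrame(const Mat4 &T01, const Mat4 &T12, const Mat4 &T2Cam,
                       const Mat4 &TCamMarker, const Vec4 &markerTarget)
{
    Mat4 T0Marker = mul(mul(mul(T01, T12), T2Cam), TCamMarker);  // (4)
    return apply(T0Marker, markerTarget);                        // (5)
}

The same helpers evaluate (6) by chaining T01, T12, and the constant auger-top vector on Σ2.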

Finally, the geometric relation mentioned above can be illustrated as in Figure 7. Taking into account that ${}^{2}\boldsymbol{r}_{\mathrm{AugerTop}}$ is a constant vector fixed by the auger's link lengths (7), equation (6) expresses the auger-top position as a function of the attitude angles θ1 and θ2 (8).


Fig. 7. The geometric relation between the auger top, the target point, and the marker.

If we denote the vector from the auger's base to the target point on Σ0, which is calculated according to equation (5), by its components $(x_t,\, y_t,\, z_t)^{T}$, then θ1 and θ2 are determined by solving the inverse kinematics of the two-degree-of-freedom model (equations (9) and (10)); these are the attitude angles we need.
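
A hedged sketch of what (9) and (10) might look like in code follows. It is our own illustration under stated assumptions, not the paper's derivation: joint 1 yaws about the vertical axis Z0, so θ1 is taken from the horizontal direction of the target, and θ2 is approximated from the target's elevation while neglecting the link offsets of Table I:

#include <cmath>

// Hypothetical inverse kinematics for the two-DOF auger model.
// t[0..2] = target point in the base frame Sigma_0 (from equation (5)).
// Assumes joint 1 rotates about Z0 (yaw) and joint 2 raises the auger;
// link offsets are neglected, so this is only a first approximation.
void attitudeAngles(const double t[3], double &theta1, double &theta2)
{
    theta1 = std::atan2(t[1], t[0]);        // yaw toward the target, (9)-like
    double horiz = std::hypot(t[0], t[1]);  // horizontal distance to target
    theta2 = std::atan2(t[2], horiz);       // elevation angle, (10)-like
}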

D. Prerequisites and Procedure of the Experiment
Our experiment makes some assumptions: first, that the harvester parks in a paddy alongside the truck, which is located on the farm road along the paddy; second, that the marker on the roof of the truck lies within the field of view of the camera attached to the auger; and third, that the spout can physically reach the grain container.
Software was developed for detecting the marker and calculating the attitude angles of the auger. The experiment is performed as follows:
1) Start the software with the auger in a state in which the camera can capture the whole marker.
2) Input the attitude angles into our combine control program and move the auger accordingly.
3) Measure the positions of the spout and the marker with the total station.
Steps 1)-3) are repeated five times. Four relative positions between the harvester and the pickup truck are tested; for each position, steps 1)-3) are repeated.

E. Accuracy Specification
In unloading work it is most important not to spill any grain. Figure 8 shows the dimensions of the grain container and the width of the grain stream at container level. Note that the dimensions of the pickup truck and those of the grain container are usually almost the same.

Fig. 8. The dimensions of the container (1.4 m × 2 m) and the width of the grain stream (about 0.35 m at maximum; more than 0.3 m of side room is needed).

To unload grain without any loss, there must be sufficient room between the grain stream and both side edges of the container. Judging from actual unloading work, about 0.3 m of room adequately meets this requirement (no loss), and the width of the grain stream is about 0.35 m at a maximum. Therefore the spout should be positioned within a square of about 0.45 m (1.4 m − 2 × 0.3 m − 0.35 m = 0.45 m), and we set ±0.2 m (i.e., a 0.4 m square) as the acceptable error for our auto-positioning system.
Figure 9 illustrates the top-down view of the marker and the grain container. We set the target point at (x, y) = (1.5, 0) on the coordinate system shown in the figure.

Fig. 9. Marker and grain container.

III. RESULTS AND DISCUSSION
Figure 10 shows the results of the experiment for all trials. Each value indicates the position of the spout after positioning. Values are measured with the total station and represented on the coordinate system shown in Fig. 9.
The square-shaped point denotes the target point (1.5, 0) that is set in the software, and the dashed-line square represents the acceptable error range, (1.5 ± 0.2, 0 ± 0.2). Figure 10 shows that the spout is positioned inside the range for all trials; some trials result in the same measured value. Table II shows the maximum error, the mean error, and the root-mean-square error (RMSE) from the true value (1.5, 0).


Fig. 10. Results of the experiment.

Table II. Errors from the true value.

     | x [m] | y [m]
Max  | 0.180 | 0.168
Mean | 0.082 | 0.081
RMSE | 0.101 | 0.098
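
For clarity, and as our addition using the standard definition rather than anything stated in the source, the RMSE row is the per-axis root-mean-square deviation of the N measured spout positions (x_i, y_i) from the target (1.5, 0):

$$ \mathrm{RMSE}_x = \sqrt{\tfrac{1}{N}\sum_{i=1}^{N}(x_i - 1.5)^2}, \qquad \mathrm{RMSE}_y = \sqrt{\tfrac{1}{N}\sum_{i=1}^{N} y_i^{\,2}}. $$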

RMSE is about 0.1 m in both the x and y directions. The maximum error (0.180, 0.168) is also less than 0.2 m (the acceptable error range) in both directions.
The main possible sources of these errors are 1) the resolution of the camera, 2) the accuracy of positioning, 3) flexure of the marker, and 4) the measurement precision of the camera's clamping angle and the link lengths. Source 1) in particular is thought to be critical. Figure 11 shows the accuracy of positioning, i.e., the difference between the spout position calculated from the input angles and that calculated from the output angles.

Fig. 11. The difference between the spout position calculated by input angles and that calculated by output angles.

Although 3) and 4) can be sources of error, they are not as important as 1): we tried two types of marker, a plywood board (relatively flexible) and an aluminum board (relatively flat), and they made little difference. We also measured the camera's clamping angle and the link lengths several times, and the values were almost the same each time. Moreover, the precision of these measurements is a few centimeters and a few degrees, which geometrically results in only a few centimeters of error at the spout. Therefore a higher-resolution camera could improve the positioning accuracy.
IV. CONCLUSION
In this study, a new method is presented for detecting a grain container and positioning the spout of a harvester at an appropriate point. Software is developed for this purpose which detects a marker in images obtained with a USB camera attached to the auger. The auger is modeled as a two-degree-of-freedom manipulator, and the software calculates the proper attitude angles for positioning from the images. An experiment is performed in which the attitude angles are determined with this method and the coordinates of the spout are measured with a total station after positioning.
The results show that the method can position the auger with sufficient precision: RMSE is about 0.1 m, and the maximum error falls inside 0.2 m in both the x and y directions. These values fulfill the accuracy specification, so there is sufficient room between the grain stream and the edges of the container.
The main source of error is thought to be the resolution of the camera. Therefore positioning of the auger could be made more accurate with a higher-resolution camera.

REFERENCES
[1] K. Tamaki, "Agriculture and Robot - From the Past to the Future -," (in Japanese), Jour. Agri. Sci., Tokyo Univ. of Agric., 2006, vol. 50, no. 4, pp. 83-94.
[2] Ministry of Agriculture, Forestry and Fisheries. (2010, June 11). "FY2009 Annual Report on Food, Agriculture and Rural Areas in Japan" [Online]. Available: http://www.maff.go.jp/e/annualreport/2009/index.html
[3] N. Noguchi and H. Terao, "Path planning of an agricultural mobile robot by neural network and genetic algorithm," Comput. Electron. Agric., 1997, vol. 18 (2-3), pp. 187-204.
[4] K. Imou et al., "Autonomous tractor for forage production," Proc. 13th International Congress on Agricultural Engineering, 1998, vol. 3, pp. 361-368.
[5] Y. Nagasaka, N. Umeda, Y. Kanetai, K. Taniwaki, and Y. Sasaki, "Automated rice transplanter using global positioning and gyroscopes," Comput. Electron. Agric., 2004, vol. 43 (3), pp. 223-234.
[6] J. Sato, K. Shigeta, and Y. Nagasaka, "Automatic operation of a combined harvester in a rice field," Proc. 1996 IEEE/SICE/RSJ International Conference on Multisensor Fusion and Integration for Intelligent Systems, 1996, pp. 86-92.
[7] T. Sato and N. Yokoya, "Image based sensing for VR/MR," (in Japanese), Journal of the Society of Instrument and Control Engineers, 2008, vol. 47, pp. 30-35.
[8] M. Billinghurst, H. Kato, and I. Poupyrev, "The MagicBook: a transitional AR interface," Computers & Graphics, 2001, vol. 25, pp. 745-753.
[9] H. Kato and M. Billinghurst, "Marker tracking and HMD calibration for a video-based augmented reality conferencing system," Proc. 2nd IEEE and ACM International Workshop on Augmented Reality, 1999, pp. 85-94.
[10] H. Kato, M. Billinghurst, I. Poupyrev, K. Imamoto, and K. Tachibana, "Virtual object manipulation on a table-top AR environment," Proc. International Symposium on Augmented Reality, 2000, pp. 111-119.
[11] F. Farbiz et al., "Live three-dimensional content for augmented reality," IEEE Trans. Multimedia, 2005, vol. 7, no. 3, pp. 514-523.
[12] T. Yoshikawa, "Foundations of Robot Control," (in Japanese), Tokyo: Corona Publishing, 1988, pp. 1-46.

