DECLARATION OF THESIS / UNDERGRADUATE PROJECT PAPER AND COPYRIGHT
Author's full name : CHEONG WAI LEONG
Date of birth : 19 JULY 1986
Title : RFID BASED INDOOR NAVIGATION ASSISTANCE ROBOT
FOR THE BLIND AND VISUALLY IMPAIRED
Academic Session : 2009/2010
I declare that this thesis is classified as:
CONFIDENTIAL
(Contains confidential information under the
Official Secret Act 1972)*
RESTRICTED
(Contains restricted information as specified by
the organization where research was done) *
OPEN ACCESS
I agree that my thesis be published as online
open access (full text)
I acknowledge that Universiti Teknologi Malaysia reserves the right as follows:
1. The thesis is the property of Universiti Teknologi Malaysia.
2. The Library of Universiti Teknologi Malaysia has the right to make copies
for the purpose of research only.
3. The Library has the right to make copies of the thesis for academic
exchange.
Certified by:

_____________________________          _____________________________
SIGNATURE                              SIGNATURE OF SUPERVISOR

860719-38-5889                         DR. SALINDA BUYAMIN
(NEW IC NO. / PASSPORT NO.)            NAME OF SUPERVISOR

Date: 04 MAY 2010                      Date: 04 MAY 2010
UNIVERSITI TEKNOLOGI MALAYSIA
I hereby declare that I have read this thesis and in my opinion this thesis is
sufficient in terms of scope and quality for the award of the degree of Bachelor of
Electrical Engineering (Mechatronics).
Signature : __________________________
Name of Supervisor : __________________________
Date : __________________________
I declare that this thesis entitled RFID Based Indoor Navigation Assistance Robot
for the Blind and Visually Impaired is the result of my own research except as cited
in the references. The thesis has not been accepted for any degree and is not
concurrently submitted in candidature of any other degree.
Signature : __________________________
Name of Author : __________________________
Date : __________________________
CHEONG WAI LEONG
04 MAY 2010
Specially dedicated to
my beloved family and friends
ACKNOWLEDGEMENT

I would like to take this opportunity to express my deepest gratitude to my supervisor, Dr. Salinda Buyamin, who guided me throughout the project. She motivated and inspired me to complete my project successfully. Her guidance, advice, encouragement, patience and support given throughout the project are greatly appreciated.

Sincere thanks also to my lecturers, who gave me valuable suggestions and helpful discussions to ensure the success of the project.

My appreciation also extends to my parents and friends for their care and support. Last but not least, I am thankful to those who directly or indirectly lent me a hand in this project.
ABSTRACT

The blind and visually impaired are people who have lost their eyesight, which has a direct and significant impact on their mobility. This thesis proposes an efficient algorithm for a mobile robot to navigate in an indoor environment using an RFID system. A service robot running this algorithm is able to aid this group of people in indoor navigation. One RFID reader is installed at the bottom chassis of the mobile robot to sense passive RFID tags embedded under a carpet on the floor. Each tag returns a unique identification number when interrogated at the appropriate frequency. Using these identification numbers, a database of the coordinate position of each tag on the map is established, so the position of the robot in the environment can be known with certainty. In addition, the orientation and the direction required to reach the goal location can be obtained using three coordinate positions and trigonometric functions. The algorithm presented in this thesis categorizes the different movements into four categories so that the direction and turning angle can be obtained accurately. As a result, the robot is able to navigate to the goal location along a smooth and shorter path. As a further advantage, obstacle avoidance is integrated into the navigation algorithm so that the robot can operate in a dynamic environment. In addition, a stick with a rotary base and an elastic rubber holder is built, so the user's movements have only a small impact on the robot while navigating. Furthermore, different destinations can be chosen by the user through the attached keypad, and sound indications are used to notify the user of any danger encountered by the robot along the path.
ABSTRAK

Blindness is the condition of a person who has permanently lost his or her eyesight, and because of this, moving around becomes difficult. Guided by an RFID system and trigonometric functions, a service robot can move effectively inside a building. A service robot using this method can help the blind to move about freely indoors. One RFID reader is mounted at the bottom of the robot to read RFID tags that have been arranged under a carpet. Each tag sends a unique number to the reader when exposed to a suitable frequency. Using these unique numbers, a database storing the coordinates of all the tags can be prepared. Thus, the position and also the orientation of the robot can be computed using three coordinates and trigonometric functions. The method presented separates the robot's various movements into four groups, each with its own turning-angle calculation and direction-decision method. Because of this, the robot can move smoothly to the destination. In addition, the obstacle-avoidance feature added to the robot allows this service robot to be used in dynamic situations. A stick that can rotate at one end is attached to the robot, and the user can hold a handle made of elastic rubber; with this, the user's movements will not have a large impact on the robot while it moves. The user can use the provided keypad to choose the location to be reached. While moving, various sounds are produced by the robot to inform the user of the surrounding conditions.
TABLE OF CONTENTS

CHAPTER   TITLE                                                   PAGE

          AUTHOR'S DECLARATION                                    ii
          DEDICATION                                              iii
          ACKNOWLEDGEMENT                                         iv
          ABSTRACT                                                v
          ABSTRAK                                                 vi
          TABLE OF CONTENTS                                       vii
          LIST OF TABLES                                          ix
          LIST OF FIGURES                                         x
          LIST OF SYMBOL AND ABBREVIATIONS                        xiii
          LIST OF APPENDICES                                      xiv

1         INTRODUCTION
          1.1  Background of Study                                1
          1.2  Statement of Problem                               3
          1.3  Research Objectives                                5
          1.4  Significance of Study                              5
          1.5  Scope of Study                                     5

2         LITERATURE REVIEW
          2.1  The Guide Cane                                     6
          2.2  RoboCart                                           7
          2.3  Walking Guide Robot                                9
          2.4  Guide Dog                                          10
          2.5  UBIRO                                              11

3         METHODOLOGY
          3.1  Mechanical Design                                  13
          3.2  Circuit Design and Equipments                      16
               3.2.1  System Overview                             16
               3.2.2  Indoor Navigation                           18
               3.2.3  Obstacles Avoidance                         22
               3.2.4  User Interfacing                            22
          3.3  Software Design                                    23
               3.3.1  Determination of Quadrant of Movement       25
               3.3.2  Determination of Angle for Previous
                      Position to Current Position and
                      Current Position to Goal Position           27
               3.3.3  Categorization                              28
               3.3.4  Special Case                                29
               3.3.5  Interrupt Routine for Tag Detected          31

4         RESULT AND DISCUSSION
          4.1  Keypad and Robot Control                           33
          4.2  Obstacle and Sensor Response                       34
          4.3  Experiments for Navigation                         38
          4.4  Results on Navigation                              40
          4.5  Summary of Experiment                              45

5         CONCLUSIONS AND RECOMMENDATIONS
          5.1  Conclusions                                        47
          5.2  Limitations and Future Recommendations             48

          REFERENCES                                              49
          APPENDICES                                              51
LIST OF TABLES

TABLE NO.  TITLE                                                  PAGE

3.1        Summary of categorization                              28
4.1        Response of middle sensor with respect to
           distance of obstacle                                   35
4.2        Response of right sensor with respect to
           distance of obstacle                                   36
4.3        Response of left sensor with respect to
           distance of obstacle                                   37
4.4        Response of buzzer                                     38
LIST OF FIGURES

FIGURE NO.  TITLE                                                 PAGE

2.1         The Guide Cane                                        7
2.2         RoboCart                                              8
2.3         Placement of RFID tags for RoboCart localization      9
2.4         Prototype of Walking Guide robot                      9
2.5         Tactile displays for the Walking Guide Robot          10
2.6         The Guide Dog                                         11
2.7         UBIRO                                                 12
3.1         Mechanical design (overview)                          13
3.2         Top view and dimensions                               14
3.3         Front view and dimensions                             14
3.4         Side view and dimensions                              15
3.5         Stick with rotary base, rubber holder and
            adjustable length                                     15
3.6         Overview of system                                    16
3.7         Power supply circuit                                  17
3.8         PIC18F452 circuit connections                         18
3.9         RFID technology                                       19
3.10        RFID reader and passive tags                          20
3.11        Signal conditioning circuit for RFID reader           20
3.12        Stepper motor                                         21
3.13        Motor part                                            21
3.14        Analog infrared sensors                               22
3.15        Keypad and buzzers for user interfacing               23
3.16        Navigation algorithm                                  24
3.17        Quadrant distribution                                 25
3.18        Determination of quadrant                             26
3.19        Angle calculation                                     27
3.20        Type 3 movement                                       30
3.21        Type 4 movement                                       30
3.22        Interrupt flow chart                                  31
4.1         Voltage response of middle sensor against
            distance of obstacle                                  35
4.2         Voltage response of right sensor against
            distance of obstacle                                  36
4.3         Voltage response of left sensor against
            distance of obstacle                                  37
4.4         Low resolution map with 40 RFID tags used             39
4.5         High resolution map with 60 RFID tags used            40
4.6         Result for case 1 (low resolution map)                41
4.7         Result for case 1 (high resolution map)               41
4.8         Result for case 2 (low resolution map)                42
4.9         Result for case 2 (high resolution map)               42
4.10        Result for case 3 (low resolution map)                43
4.11        Result for case 3 (high resolution map)               43
4.12        Result for case 4 (low resolution map)                44
4.13        Result for case 4 (high resolution map)               45
LIST OF SYMBOL AND ABBREVIATIONS
The angle for previous position to current position (θ_previous→current) and the angle for current position to goal position (θ_current→goal) can be calculated by using trigonometric functions.
Figure 3.19 Angle calculation
By referring to Figure 3.19, both angles can be obtained by using the equations shown:

θ_previous→current = tan⁻¹[ (x_current − x_previous) / (y_current − y_previous) ]   (3.9)

θ_current→goal = tan⁻¹[ (x_goal − x_current) / (y_goal − y_current) ]   (3.10)

When a denominator, e.g. (y_current − y_previous) or (y_goal − y_current), is zero, we may assign 90° to the result.
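Equations (3.9) and (3.10) can be rendered directly in code. The sketch below is an illustrative Python version (the thesis implements this on a PIC microcontroller; the function name and point representation are choices made here, not taken from the thesis):

```python
import math

def heading_angle(p, c):
    """Angle of the segment from point p to point c as in Eqs. (3.9)/(3.10):
    tan^-1(dx / dy), with 90 degrees assigned when the denominator dy is zero."""
    dx, dy = c[0] - p[0], c[1] - p[1]
    if dy == 0:
        return 90.0
    return math.degrees(math.atan(dx / dy))
```

Note that the result is a signed angle in (−90°, 90°]; the quadrant of the movement (Section 3.3.1) is what disambiguates the actual heading.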
3.3.3 Categorization

By examining four conditions, namely Q_P, Q_N, θ_previous→current and θ_current→goal, all possible turning angles can be obtained. Table 3.1 shows the resulting groups, each having its own direction and turning angle.
Table 3.1: Summary of categorization

        Type 1         Type 2         Type 3*        Type 4*
      Q_P   Q_N      Q_P   Q_N      Q_P   Q_N      Q_P   Q_N
      1st   1st      1st   4th      2nd   4th      2nd   4th
      2nd   2nd      2nd   3rd      4th   2nd      4th   2nd
      3rd   3rd      3rd   2nd      1st   3rd      1st   3rd
      4th   4th      4th   1st      3rd   1st      3rd   1st
      1st   2nd
      3rd   4th
      2nd   1st
      4th   3rd
Each type has its own direction-determination method as well as an equation to obtain the turning angle, θ_rotation. The * sign denotes a special case.

Type 1:
Q_P > Q_N → turn right
Q_P < Q_N → turn left
Q_P = Q_N:
  (i)  θ_current→goal > θ_previous→current → turn right
  (ii) θ_current→goal < θ_previous→current → turn left

θ_rotation = absolute(θ_current→goal − θ_previous→current)   (3.11)
Type 2:
Q_P = 1st, Q_N = 4th → turn right
Q_P = 3rd, Q_N = 2nd → turn right
Q_P = 4th, Q_N = 1st → turn left
Q_P = 2nd, Q_N = 3rd → turn left

θ_rotation = 180° − [absolute(θ_current→goal) + absolute(θ_previous→current)]   (3.12)
Type 3*:
Q_P and Q_N are even → turn right
Q_P and Q_N are odd → turn left

θ_rotation = 180° − [absolute(θ_current→goal) − absolute(θ_previous→current)]   (3.13)
Type 4*:
Q_P and Q_N are even → turn left
Q_P and Q_N are odd → turn right

θ_rotation = 180° − [absolute(θ_previous→current) − absolute(θ_current→goal)]   (3.14)
3.3.4 Special Case

It can be seen that Type 3 and Type 4 have the same combinations of Q_P and Q_N. This can be explained through the situations shown in Figure 3.20 and Figure 3.21, where the movements are in opposite directions although they have the same values for Q_P and Q_N.
Figure 3.20 Type 3 movement
Figure 3.21 Type 4 movement
In both cases, it can be noted that Q_P is the 1st quadrant while Q_N is the 3rd quadrant. However, the directions of turning for the two conditions are different.

In Figure 3.20, θ_current→goal > θ_previous→current:

θ_rotation = 180° − [absolute(θ_current→goal) − absolute(θ_previous→current)]   (3.15)

Direction taken → left

where θ_current→goal = … and θ_previous→current = …
In Figure 3.21, θ_previous→current > θ_current→goal:

θ_rotation = 180° − [absolute(θ_previous→current) − absolute(θ_current→goal)]   (3.16)

Direction taken → right

where θ_current→goal = … and θ_previous→current = …
As a result, we shall determine which of the conditions below is fulfilled before categorizing such a movement:

θ_current→goal > θ_previous→current → Type 3
θ_previous→current > θ_current→goal → Type 4
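The four movement types can be collected into a single decision routine. The sketch below is an illustrative Python reconstruction of the rules in Table 3.1 and Equations (3.11) to (3.16) (the thesis implements them in PIC firmware); the quadrant encoding 1 to 4 and degree-valued angles are assumptions made here:

```python
def turning(q_p, q_n, a_pc, a_cg):
    """Direction and turning angle per Table 3.1 and Eqs. (3.11)-(3.16).
    q_p, q_n: quadrants (1-4) of the previous and new movement;
    a_pc, a_cg: angles theta_previous->current and theta_current->goal
    in degrees, as given by Eqs. (3.9) and (3.10)."""
    diagonal = {(1, 3), (3, 1), (2, 4), (4, 2)}   # special case: Type 3 or 4
    opposite = {(1, 4), (4, 1), (2, 3), (3, 2)}   # Type 2
    if (q_p, q_n) in diagonal:
        if a_cg > a_pc:                            # Type 3
            direction = "right" if q_p % 2 == 0 else "left"
            rotation = 180 - (abs(a_cg) - abs(a_pc))     # Eq. (3.13)
        else:                                      # Type 4
            direction = "left" if q_p % 2 == 0 else "right"
            rotation = 180 - (abs(a_pc) - abs(a_cg))     # Eq. (3.14)
    elif (q_p, q_n) in opposite:
        direction = "right" if (q_p, q_n) in {(1, 4), (3, 2)} else "left"
        rotation = 180 - (abs(a_cg) + abs(a_pc))         # Eq. (3.12)
    else:                                          # Type 1 (same or adjacent)
        if q_p != q_n:
            direction = "right" if q_p > q_n else "left"
        else:
            direction = "right" if a_cg > a_pc else "left"
        rotation = abs(a_cg - a_pc)                      # Eq. (3.11)
    return direction, rotation
```

For example, a same-quadrant movement with θ_previous→current = 10° and θ_current→goal = 40° falls under Type 1 and yields a 30° right turn.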
3.3.5 Interrupt Routine for Tag Detected

Figure 3.22 Interrupt flow chart

To keep the robot updating its current position in the environment, an interrupt routine as shown in Figure 3.22 is utilized. In the routine, the identification number retrieved from a tag is used to look up the corresponding coordinate position in the database. The retrieved coordinate position is then compared with the current position. If it is a new position, the algorithm updates its previous and current positions. As a result, the robot keeps track of its position even while performing obstacle avoidance, and immediately traces a new path to the destination after avoiding the obstacle.
CHAPTER 4
RESULT AND DISCUSSION
4.1 Keypad and Robot Control
In order to interface with user, the 4x4 matrix keypad is used. After power up
the robot, user may enter key A for mode A execution. Besides, user may enter key
B for mode B instead of mode A. In mode A, the robot is able to navigate to three
predefined destinations. User is required to key-in the number of destination to
travel. In mode B, the user may enter the xy coordinate of the destination wishes to
navigate. Below are the steps to control the robot in mode A and mode B.
Robot control in mode A:
1. Three destinations can be chosen by the user. User may enter key 1,
key 2 or key 3.
2. Key # must be pressed to enter the desired location number so that
the robot will start navigate.
3. After the robot reaches the destination, a long beep sound from
buzzers will be noticed. User may repeat step 1 and step 2 for next
location. User may enter key D for mode reset so that user can
switch between mode A and mode B
Robot control in mode B:
1. The user may enter the desired location by keying in the x-y coordinate of the destination. First, the x-coordinate is entered.
2. Key # must be pressed to confirm the x-coordinate.
3. The user may then enter the y-coordinate of the desired destination.
4. Key # should be pressed so that the robot acquires the destination and starts to navigate.
5. Steps 1 to 4 can be repeated if the user wishes to continue in this mode. Otherwise, key D can be pressed for a mode reset.
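The two input modes above amount to a small parsing rule. The sketch below models it in Python with key presses collected into a string (an illustration only; the actual firmware scans the 4x4 matrix and dispatches keys one at a time):

```python
def parse_keys(keys):
    """Interpret a keypad sequence per the mode A / mode B steps.
    Mode A: 'A', a destination number 1-3, then '#'.
    Mode B: 'B', x digits, '#', y digits, '#'."""
    mode, rest = keys[0], keys[1:]
    if mode == "A":
        if not rest.endswith("#") or rest[:-1] not in {"1", "2", "3"}:
            raise ValueError("mode A expects a destination 1-3 ended by #")
        return ("predefined", int(rest[:-1]))
    if mode == "B":
        x_part, y_part = rest.rstrip("#").split("#")
        return ("coordinate", (int(x_part), int(y_part)))
    raise ValueError("expected mode key A or B")
```

For example, `parse_keys("A2#")` selects predefined destination 2, and `parse_keys("B12#3#")` selects the coordinate (12, 3).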
4.2 Obstacle and Sensor Response

The three analog IR sensors used in this project play an important role in obstacle avoidance. Each sensor produces an increasing voltage as an obstacle approaches it. Therefore, the output voltage corresponding to a particular obstacle distance was gathered for each sensor so that the distance of an obstacle can be determined. Tables 4.1, 4.2 and 4.3 show the responses of the middle, right and left sensors with respect to obstacle distance, and the average readings are plotted in Figures 4.1, 4.2 and 4.3 respectively.
Table 4.1: Response of middle sensor with respect to distance of obstacle
Distance (cm)   Reading 1 (V)   Reading 2 (V)   Reading 3 (V)   Average Reading (V)
>60 0.31 0.28 0.37 0.32
60 0.49 0.5 0.52 0.50
55 0.53 0.54 0.55 0.54
50 0.55 0.58 0.59 0.57
45 0.58 0.61 0.66 0.62
40 0.64 0.67 0.7 0.67
35 0.73 0.77 0.77 0.76
30 0.84 0.88 0.87 0.86
25 0.98 1.03 0.94 0.98
20 1.22 1.25 1.12 1.20
15 1.57 1.5 1.36 1.48
10 1.8 1.71 1.6 1.70
5 1.94 1.79 1.6 1.78
Figure 4.1 Voltage response of middle sensor against distance of obstacle
Table 4.2: Response of right sensor with respect to distance of obstacle
Distance (cm)   Reading 1 (V)   Reading 2 (V)   Reading 3 (V)   Average Reading (V)
>60 0.46 0.25 0.29 0.33
60 0.53 0.55 0.56 0.55
55 0.56 0.59 0.59 0.58
50 0.59 0.63 0.63 0.62
45 0.65 0.68 0.69 0.67
40 0.71 0.74 0.76 0.74
35 0.76 0.81 0.84 0.80
30 0.86 0.91 0.94 0.90
25 1.01 1.06 1.02 1.03
20 1.25 1.27 1.25 1.26
15 1.59 1.6 1.56 1.58
10 1.93 1.94 1.71 1.86
5 1.94 1.94 1.82 1.90
Figure 4.2 Voltage response of right sensor against distance of obstacle
Table 4.3: Response of left sensor with respect to distance of obstacle
Distance (cm)   Reading 1 (V)   Reading 2 (V)   Reading 3 (V)   Average Reading (V)
>60 0.18 0.19 0.24 0.20
60 0.53 0.48 0.51 0.51
55 0.56 0.53 0.54 0.54
50 0.58 0.58 0.58 0.58
45 0.65 0.63 0.62 0.63
40 0.71 0.67 0.69 0.69
35 0.8 0.73 0.77 0.77
30 0.9 0.87 0.86 0.88
25 1 1.02 0.98 1.00
20 1.19 1.14 1.14 1.16
15 1.41 1.37 1.38 1.39
10 1.57 1.57 1.58 1.57
5 1.57 1.57 1.58 1.57
Figure 4.3 Voltage response of left sensor against distance of obstacle
By referring to Figures 4.1, 4.2 and 4.3, it can be noted that the responses of the sensors at a particular obstacle distance are not identical. This is because the characteristics of the individual sensors differ, resulting in some tolerance in their output voltages. Besides, environmental factors such as light intensity affect the response of each sensor. Therefore, a software calibration of each sensor was made so that all sensors can work effectively in different environments.
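One common way to turn a measured voltage back into a distance is linear interpolation over the calibration table. The sketch below uses the averaged middle-sensor values from Table 4.1 (the ">60 cm" row is treated as out-of-range); the thesis does not detail its calibration method, so this is an assumption of how such a table could be used:

```python
# Average middle-sensor readings from Table 4.1: (distance cm, voltage V),
# ordered by increasing voltage (decreasing distance).
MIDDLE_CAL = [(60, 0.50), (55, 0.54), (50, 0.57), (45, 0.62), (40, 0.67),
              (35, 0.76), (30, 0.86), (25, 0.98), (20, 1.20), (15, 1.48),
              (10, 1.70), (5, 1.78)]

def distance_from_voltage(v, table=MIDDLE_CAL):
    """Estimate obstacle distance by linear interpolation between the two
    calibration points that bracket the measured voltage."""
    if v <= table[0][1]:
        return table[0][0]    # at or beyond the maximum calibrated range
    if v >= table[-1][1]:
        return table[-1][0]   # closer than the nearest calibration point
    for (d1, v1), (d2, v2) in zip(table, table[1:]):
        if v1 <= v <= v2:
            return d1 + (d2 - d1) * (v - v1) / (v2 - v1)
```

Per-sensor tables (from Tables 4.2 and 4.3) would be passed in for the right and left sensors, which is one way the software calibration mentioned above can compensate for the differing sensor characteristics.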
Whenever an obstacle is detected in front of the robot, different beep patterns are produced by the buzzers on both sides according to the distance of the obstacle. Table 4.4 shows the response of the buzzers to an obstacle.
Table 4.4: Response of buzzer

Distance from the obstacle (cm)   Response of buzzer
35 - 25                           Beep once
25 - 20                           Beep twice
20 - 10                           Beep three times and perform avoidance
<10                               Beep four times and move backward
If an obstacle is detected at the side of the robot, the corresponding buzzer generates a beep to notify the user; for example, an obstacle on the left side triggers the left buzzer. After notifying the user, the robot searches for a path to pass the obstacle. Once a new path is determined, a buzzer beeps again to notify the user: for instance, if the robot is about to turn to the right, the right buzzer produces a beep.
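The beep schedule of Table 4.4 maps directly onto a threshold function. The sketch below is one reading of the table; the behaviour exactly at the boundary values (10, 20, 25, 35 cm) is an assumption, since the table does not specify which interval the endpoints belong to:

```python
def buzzer_beeps(distance_cm):
    """Beep count and extra action per Table 4.4 (boundary handling assumed)."""
    if distance_cm < 10:
        return 4, "move backward"
    if distance_cm <= 20:
        return 3, "perform avoidance"
    if distance_cm <= 25:
        return 2, None
    if distance_cm <= 35:
        return 1, None
    return 0, None   # no obstacle within warning range
```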
4.3 Experiments for Navigation
In order to test the validity of the algorithm, two RFID-based maps were constructed as shown in Figure 4.4 and Figure 4.5. The map in Figure 4.4 (low resolution map) uses 40 RFID tags acting as 2D coordinate positions. All tags are embedded under the carpet and separated from each other by 22.5 cm.

On the other hand, another map was built as shown in Figure 4.5 (high resolution map). The purpose of this map is to observe the movement of the robot in a higher-resolution RFID environment; 60 RFID tags are used in its construction, and every tag is separated from its neighbours by 15 cm. The three tags indicated in yellow in both maps are the destinations which may be chosen by the user for navigation in this experiment.
Figure 4.4 Low resolution map with 40 RFID tags used
Figure 4.5 High resolution map with 60 RFID tags used
4.4 Results on Navigation
Case 1:
The robot is required to navigate from the start point to destination 1, destination 2 and destination 3. No obstacle is added to the map.
Figure 4.6 Result for case 1 (low resolution map)
Figure 4.7 Result for case 1 (high resolution map)
From the results shown in Figure 4.6 and Figure 4.7, it can be noted that the robot is able to reach all the destinations. Referring to Figure 4.6, the robot makes correct angle turns with the proposed algorithm and is therefore able to take the shortest path to the target. Referring to Figure 4.7, the path taken by the robot is not as smooth as in Figure 4.6. This is because the detection range of the RFID reader is approximately 5 cm, so the robot detects other nearby tags along its path and performs angle adjustments.
Case 2:
The robot starts in random orientations and must navigate to destination 2. The map is clear of any obstacle.
Figure 4.8 Result for case 2 (low resolution map)
Figure 4.9 Result for case 2 (high resolution map)
Figure 4.8 and Figure 4.9 show that the robot is able to navigate to the destination regardless of its starting pose. The path taken in Figure 4.8 is smoother than the path in Figure 4.9; this is due to the detection range of the RFID reader, as described in case 1.
Case 3:
In this case, one obstacle is added to the map. The robot is commanded to travel to destination 2.
Figure 4.10 Result for case 3 (low resolution map)
Figure 4.11 Result for case 3 (high resolution map)
From the results in Figure 4.10 and Figure 4.11, the robot is able to avoid any obstacle encountered in its path. After the avoidance, it traces a new path to the destination, provided that a tag is detected. From observation, the robot in the low resolution map travels a longer distance before detecting a tag for position feedback, because the distance between tags is greater. In some experiments, the robot ran out of the map because no tag was detected after obstacle avoidance. This behaviour improves in the high resolution map: because of the more compact placement of the tags, the robot detects a tag quickly after performing the avoidance and executes new path finding.
Case 4:
The robot is commanded to navigate to the destination with more obstacles added to the map.
Figure 4.12 Result for case 4 (low resolution map)
Figure 4.13 Result for case 4 (high resolution map)
It can be seen in Figure 4.12 and Figure 4.13 that the robot is able to navigate to the destination although there is more than one obstacle in the map. In the low resolution map, the robot finds it more difficult to trace a new path after performing the avoidance, due to the limited number of tags acting as feedback. This difficulty is not faced by the robot in the high resolution map, since it can easily detect a tag to act as position feedback.
4.5 Summary of Experiment
The experimental results show that, with the proposed algorithm, the robot is able to make different turning angles while navigating to the destination under different conditions, as long as a tag is detected to act as position and orientation feedback. Different conditions on the map cause the robot to take different paths to the destination. Moreover, the robot avoids any obstacle in its path and takes a new path to reach the destination.
Since the robot gives obstacle avoidance higher priority than reaching the goal location, the results shown above hold only when no obstacle is too close to the destination. Some experiments failed when using the low resolution map because the prototype map is small and the number of tags is limited; consequently, the robot may run out of the map while performing obstacle avoidance. This problem is easily solved when the robot is implemented in a high resolution map, where more tags are utilized as position and orientation feedback. The drawback of the high resolution map is that the RFID reader may detect unwanted tags that come close to the robot while it navigates, because the reading range of the reader is approximately 5 cm. Therefore, the path taken by the robot to the destination may not be as smooth as in the low resolution map.
CHAPTER 5
CONCLUSIONS AND RECOMMENDATIONS
5.1 Conclusions
In conclusion, the robot is able to lead the user to various destinations as commanded. The user can select the destination using the keypad equipped on the robot.

From the experimental results, the robot is able to navigate to the destination with the proposed algorithm using the RFID system and movement categorization. Furthermore, the robot can be implemented in a dynamic environment, since it gathers obstacle information from the surroundings through the analog IR sensors mounted at its front. Any obstacle detected by the robot is conveyed to the user by beeps generated from the buzzers.

As a result, an assistance robot for the blind and visually impaired using this algorithm may help the user to navigate indoor environments more efficiently.
5.2 Limitations and Future Recommendations

A few limitations were encountered in developing this project. Firstly, using buzzers to inform the user about dangers in the environment may not be a good solution, because the user needs to interpret the type of beep generated in order to understand the information. Besides, the current drawn by the stepper motors is quite high, which exhausts the battery quickly and requires frequent recharging. Other than that, a lot of heat is generated by the power resistors used in the motor part, resulting in high temperatures. In certain conditions, the robot may make a wide turn of approximately 180° and end up facing the user; if the user keeps moving forward, he or she may kick the robot.

A voice interface may be the best solution for interfacing with the user, replacing the buzzers. With a voice interface, the user can interpret messages from the robot easily, which would aid the user's movement especially during obstacle avoidance. Besides, a cooling system may be added to the motor part, since a lot of heat is generated by the power resistors. Other than that, the turning angle of the robot should be limited to less than 180° so that the robot takes a longer path instead of making a wide turn; this will reduce the problem of collisions between the user and the robot.
REFERENCES
1. Tan, C.L., C.S. Tan, and Moghavvemi, Electronic Travel Aid for Visually
Impaired, in The AEESEAP International Conference 2005. 2005: KL,
Malaysia.
2. Ulrich, I. and J. Borenstein, The GuideCane-applying mobile robot
technologies to assist the visually impaired. Systems, Man and Cybernetics,
Part A: Systems and Humans, IEEE Transactions on, 2001. 31(2): p. 131-136.
3. Wong, H.L., Design of Direction Indicator of A Guide Dog for the Blind.
2002, Universiti Teknologi Malaysia: Skudai, Johor.
4. Mau, S., Melchior, N.A., Makatchev, M. , Steinfeld, A., BlindAid: An
Electronic Aid for The Blind. 2008, Carnegie Mellon University, Robotics
Institute: Pittsburgh, PA.
5. Tesoriero, R., et al., Improving location awareness in indoor spaces using
RFID technology. Expert Systems with Applications. 37(1): p. 894-898.
6. HyungSoo, L., C. ByoungSuk, and L. JangMyung. An Efficient Localization
Algorithm for Mobile Robots based on RFID System. in SICE-ICASE, 2006.
International Joint Conference. 2006.
7. Sunhong, P. and S. Hashimoto, Autonomous Mobile Robot Navigation Using
Passive RFID in Indoor Environment. Industrial Electronics, IEEE
Transactions on, 2009. 56(7): p. 2366-2373.
8. Milella, A., et al. RFID-based environment mapping for autonomous mobile
robot applications. in Advanced intelligent mechatronics, 2007 ieee/asme
international conference on. 2007.
9. Gharpure, C.P. and V.A. Kulyukin, Robot-assisted shopping for the blind:
issues in spatial cognition and product selection. Intelligent Service Robotics,
2008. Volume 1(3): p. 15.
10. Bin, D., et al. The Research on Blind Navigation System Based on RFID. in
Wireless Communications, Networking and Mobile Computing, 2007. WiCom
2007. International Conference on. 2007.
11. Liu, J. and Y. Po. A Localization Algorithm for Mobile Robots in RFID
System. in Wireless Communications, Networking and Mobile Computing,
2007. WiCom 2007. International Conference on. 2007.
12. Electronic Travel Aids: New Directions for Research. 1986.
Appendix A
Schematic drawing of main board
Appendix B
Schematic drawing of motor
Appendix C
Source code
/******************Source code is under copyright*****************/
// For further information, please refer to Dr. Salinda Buyamin