opinion this thesis is sufficient in terms of scope and quality for the
award of the degree of Bachelor of Engineering (Electrical-Electronics).
Signature :
Name : Dr. Musa Bin Mohd Mokji
Date : 20 May 2011
May 2011
I declare that this thesis entitled A Touch Screen Monitor Application for Cashier
is the result of my own research except as cited in the references. The thesis has not
been accepted for any degree and is not concurrently submitted in candidature of any
other degree.
Signature : .
Specially dedicated to my family, supervisor and friends who have been there for me
and inspired me along the way.
ACKNOWLEDGEMENT
I would also like to thank my Multimedia lecturer, Dr. Usman Ullah Sheikh,
for his guidance and help in this project.
Last but not least, I would like to extend my appreciation to all those who
have directly or indirectly played a part in the completion and the success of this
thesis.
ABSTRACT
This project reports a touch screen monitor application for cashiers, based
on Windows programming of a touch screen via Microsoft Visual Basic and the
Windows SDK. The purpose of this project is to program a touch screen monitor
that can provide sales, management and stock control information for
businesses and that can be used by a cashier. At the same time, this project
improves the cashier system by using a touch screen monitor instead of a
keyboard and mouse to interface with the computer. As a result, sales can be
performed faster and more effectively. To program the touch screen monitor,
the Windows SDK components related to touch are first installed on the
computer. Some operations can be programmed to behave like mouse click
operations. These include the tap, double tap, and press and hold gestures,
which are equivalent to the left click, left double click and right click of
the mouse respectively. In this project, a webcam is interfaced to the
program to capture images for recognition of the products. This is very
useful for cashiers, since the information of a product is displayed on the
screen automatically once the product is recognized. Finally, it can be
concluded that sales can be performed faster and more smoothly by applying
multi-touch gestures and a touch screen monitor for the cashier.
ABSTRAK
Projek ini melaporkan aplikasi monitor skrin sentuh untuk juruwang yang
berkenaan dengan tetingkap pengaturcaraan skrin sentuh dengan menggunakan
Microsoft Visual Basic dan Windows SDK. Tujuan projek ini adalah untuk
memprogramkan sebuah monitor skrin sentuh yang boleh digunakan oleh juruwang
untuk memudahkankan jualan, serta menyediakan pengurusan dan maklumat stok
kawalan dalam perniagaan. Pada masa yang sama, projek ini adalah untuk
memperbaiki sistem juruwang dengan menggunakan monitor skrin sentuh, tanpa
menggunakan papan kekunci dan tetikus untuk berinteraksi dengan komputer.
Dengan itu, jualan boleh dilakukan dengan lebih cepat dan berkesan. Untuk
memprogramkan monitor skrin sentuh, Windows SDK yang berkaitan dengan skrin
sentuh akan dipasangkan pada komputer terlebih dahulu. Beberapa operasi yang
setara dengan operasi klik tetikus boleh diprogramkan. Ini termasuklah gerakan
seperti sentuh, sentuh ganda, sentuh dan tahan gerakan yang setara dengan klik kiri,
klik kiri ganda dan klik kanan tetikus masing-masing. Dalam projek ini, sebuah
kamera web akan dihubungkan kepada program untuk menangkapkan gambar dan
pengenalan produk. Ini adalah amat berguna untuk juruwang kerana maklumat
produk akan dipaparkan pada paparan secara automatik selepas produk tersebut
dikenali. Akhirnya, dapat disimpulkan bahawa jualan boleh dilakukan dengan lebih
cepat dan lancar setelah menerapkan gerakan pelbagai sentuh dan monitor skrin
sentuh untuk juruwang.
TABLE OF CONTENTS
DECLARATION
DEDICATION
ACKNOWLEDGEMENT
ABSTRACT
ABSTRAK
TABLE OF CONTENTS
LIST OF TABLES
LIST OF FIGURES
LIST OF ABBREVIATIONS
LIST OF APPENDICES
1 INTRODUCTION
1.1 Background of touch screen
1.2 Problem statement
1.3 Objectives
1.4 Scope of the project
1.5 Outline of Thesis
1.6 Summary of works
2 LITERATURE REVIEW
2.1 Introduction
2.2 History of touch screen technology
2.3 Construction of touch screen
2.3.1 Touch Sensor
2.3.2 Controller
2.3.3 Software Driver
2.4 Touch screen technology
2.4.1 Resistive touch screen
2.4.2 Capacitive touch screen
2.4.3 Surface Acoustic Wave
2.4.4 Infrared
2.5 Advantages and disadvantages of touch screen
2.6 Touch screen applications
2.6.1 Cashier system
2.6.2 Public information displays
2.6.3 Customer self-services
2.6.4 Other applications
2.7 Windows API
2.8 Graphical User Interface (GUI)
2.9 Event-driven Programming
2.10 Microsoft Windows SDK
2.11 Architectural overview of touch
3 METHODOLOGY
3.1 Hardware and software
3.1.1 Touch screen monitor
3.2 Flow process
3.3 Graphical User Interface
3.4 Windows Programming
3.4.1 Multi-touch Gestures
3.4.2 Plot histogram of the image
3.4.3 Create database using Microsoft Access
3.4.4 Webcam interface
3.4.5 Calculator
3.4.6 Cashier or manager login
3.4.7 Change username or password
REFERENCES
APPENDICES A B
LIST OF TABLES
LIST OF FIGURES
LIST OF ABBREVIATIONS
LIST OF APPENDICES
CHAPTER 1
INTRODUCTION
1.1 Background of touch screen
Touch screens are used in a wide variety of applications such as cashier
systems, information kiosks, Automated Teller Machines (ATM), and systems
designed to help individuals who have difficulty manipulating a mouse or
keyboard. Touch screen technology can also be used as an alternative user
interface for applications that usually require a mouse. Some applications
are designed specifically for touch screen technology, and frequently have
larger icons than typical computer applications.
1.2 Problem statement
Nowadays, cashiers in shopping malls, supermarkets, restaurants and many
other places face a number of problems. The main problem is that they need
an external keyboard and mouse to interface with the computer, which leaves
limited space on the table. The sale process is slow because many windows
must be navigated for certain functions. Besides that, the programs offer
only a few basic operations and limited graphical interfaces.
With the rise of touch screen technology, touch screen monitors are very
useful for cashiers to ensure that sales can be performed faster and more
effectively. In this project, a touch screen monitor that can be used by a
cashier is programmed. By using the touch screen monitor, no external
keyboard and mouse are required, because the touch screen monitor acts as
both an input and an output device. The user can communicate with the
computer effectively without any external keyboard and mouse, so table space
is saved. In addition, the touch screen monitor provides shortcuts and
multi-touch functions, so users do not need to go through many windows for
certain functions. Therefore, the sale process can be performed faster.
1.3 Objectives
In order to achieve the objectives of this project, several scopes have been
outlined. The scope of this project mainly focuses on the graphical user
interface (GUI) and Windows programming of the touch screen. In this
project, Microsoft Visual Basic 2010 and the Windows SDK are used to program
a touch screen monitor that supports multi-touch functions. Besides that,
graphical interfaces such as histogram plotting, multimedia, database
editing, username and password settings and other applications beyond price
calculation will be developed. Last but not least, a webcam is interfaced to
the program to capture an image of the product and plot the histogram of the
image.
The Gantt charts shown in Figure 1.1 and Figure 1.2 give the details of the
project work carried out in the first and second semesters.
Figure 1.1 Gantt chart of the project schedule for semester 1 (weeks 1-16;
activities: research on FYP topic, literature review, program for mouse
click, presentation, report writing)
Figure 1.2 Gantt chart of the project schedule for semester 2 (weeks 1-16;
activities: literature review, program for mouse click, program for
multi-touch, interface with webcam, presentation, report writing)
CHAPTER 2
LITERATURE REVIEW
2.1 Introduction
This chapter presents a study of the history of touch screens, the
construction of touch screens, touch screen technologies, the advantages and
disadvantages of touch screens, touch screen applications, the Windows API,
graphical user interfaces, event-driven programming and the Windows SDK.
2.2 History of touch screen technology
The history of touch screen technology began in 1971, when the first touch
sensor, called the Elograph, was invented by Dr. Sam Hurst while he was an
instructor at the University of Kentucky [2]. The touch sensor developed
then was not transparent like modern touch screens. In 1972, Hurst patented
his sensor and used it as the main point of a new business called
Elographics [2].
2.3 Construction of touch screen
A basic touch screen has three main components: a touch sensor, a
controller, and a software driver, as shown in Figure 2.1. The touch screen
is an input device, so it needs to be combined with a computer and a display
or another device to make a complete touch input system.
2.3.1 Touch Sensor
A touch sensor is a clear glass panel with a touch-responsive surface [3].
To ensure that the responsive area of the panel covers the viewable area of
the video screen, the touch sensor is placed over a display screen. The
sensor usually has an electrical current flowing through it. Touching the
screen causes a voltage change, which is used to determine the location of
the touch on the screen. The touch screen panel registers touch events and
passes the signals to the controller.
2.3.2 Controller
A controller is a small computer card that connects the touch sensor and the
computer. The controller takes information from the touch sensor and
translates it into information that the computer can understand. For
integrated monitors, the controller is usually installed inside the monitor,
and it determines the type of interface needed on the computer. Integrated
touch monitors have an extra cable connection on the back for the touch
screen [3].
2.3.3 Software Driver
The driver is a software update for the computer system that allows the
touch screen and the computer to work together. When a user touches the
screen, a touch event is generated. The software driver tells the computer's
operating system how to interpret the touch event information sent from the
controller, translating the touch event into a mouse event. Most touch
screen drivers nowadays are mouse-emulation drivers [3], which means that
touching the screen is equivalent to clicking the mouse at the same location
on the screen. The advantages of mouse emulation are that it allows the
touch screen to work with existing software, and that new applications can
be developed without touch-screen-specific programming.
2.4.1 Resistive touch screen
A resistive touch screen is a type of touch screen that can be operated with
both a finger and a stylus. It consists of a normal glass panel covered with
a conductive and a resistive metallic layer [4]. When the top layer is
pressed by a finger or stylus, the two metallic layers make contact and
current flows. The change in the electrical current is registered as a touch
event and sent to the controller for processing. The presence of current on
a horizontal and a vertical line gives the position (x and y coordinates) of
the touch. Once the coordinates are known, a driver translates the touch
event into something the operating system can understand, much as a computer
mouse driver translates a mouse's movements into a click. The construction
of the resistive touch screen is shown in Figure 2.2 and the name of each
label is given in Table 2.1.
2.4.2 Capacitive touch screen
A capacitive touch screen is a type of touch screen that is sensitive only
to the touch of a finger. A capacitive touch screen panel consists of an
insulator, such as glass, coated with a transparent conductor such as Indium
Tin Oxide (ITO) [1]. When a user touches the screen with a finger, some of
the charge is transferred to the user, allowing the computer to recognize
the touch.
One advantage of the capacitive touch screen over the resistive touch screen
is that it transmits almost 90 percent of the light from the monitor,
whereas the resistive touch screen transmits only about 75 percent [5].
Thus, the capacitive touch screen gives a much clearer picture than the
resistive touch screen.
2.4.3 Surface Acoustic Wave
Some advantages of surface acoustic wave touch screens are high touch
resolution and the highest image clarity. Since the surface acoustic wave
setup has no metallic layers on the screen, it can transmit 100 percent of
the light and the image is perfectly clear. Therefore, surface acoustic wave
is best suited for displaying detailed graphics [6].
2.4.4 Infrared
Infrared touch screen technology uses a small frame around the display, with
LEDs and photoreceptors hidden behind an infrared-transparent bezel [7]. The
controller pulses the LEDs to create beams of light, which cross each other
in horizontal and vertical patterns and help the sensors determine the exact
location of the touch. The main advantage of infrared touch screens is that
a touch can be made with anything, including a finger, a gloved hand or a
stylus. This type of touch screen is normally used in outdoor applications,
which cannot rely on a conductor such as a bare finger to activate the touch
screen [1].
2.6 Touch screen applications
Touch screen systems are used in a wide variety of applications such as
cashier systems, public information displays and customer self-services,
because the touch screen is one of the easiest computer interfaces to use.
2.6.2 Public information displays
Information kiosks, tourism displays, and other electronic displays are used
by many people who have little or no computing experience. The touch screen
interface is easier to use than other input devices, and it helps users
obtain information easily by simply touching the display screen [8].
2.7 Windows API
Almost all functions in the Windows API are located in one of the Dynamic
Link Library (DLL) files found in the Windows system directory. The Windows
API allows any Windows-based program to access API functions easily. The
bulk of the API functions are found in user32.dll (user interface
functions), kernel32.dll (operating system kernel functions), gdi32.dll
(graphics device interface functions) and shell32.dll (Windows shell
functions) [9].
2.8 Graphical User Interface (GUI)
A graphical user interface (GUI) is a type of user interface that allows
users to interact with electronic devices through images rather than text
commands. GUIs are used in computers, MP3 players, gaming devices, office
equipment and so on [10]. A GUI shows graphical objects such as images,
icons, buttons, and scroll bars on the screen.
2.10 Microsoft Windows SDK
There are three types of software development kits: the Microsoft Windows
SDK, the Platform SDK, and the .NET Framework SDK. The Microsoft Windows SDK
is a set of tools, code samples, compilers, documentation, headers, and
libraries that developers can use to create applications that run on
Microsoft Windows operating systems using the Win32 or .NET Framework
programming models [11].
2.11 Architectural overview of touch
Figure 2.6 Processing messages for Windows touch input and gestures
Referring to Figure 2.6, touch-sensitive hardware first receives input from
a user. A driver then handles the communication between the hardware and the
operating system. Next, the operating system generates a WM_TOUCH or
WM_GESTURE message, which is sent to the application's HWND, where HWND
stands for a handle to a window [12].
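The message flow described above can be sketched as a small simulation. The
WM_TOUCH and WM_GESTURE values below are the actual message codes from the
Windows SDK headers; everything else (the event dictionary and function
names) is a hypothetical stand-in for the driver and operating system, not
actual Win32 calls.

```python
WM_GESTURE, WM_TOUCH = 0x0119, 0x0240  # real Win32 message codes

def os_generate_message(raw_event):
    # The operating system decides whether to deliver raw touch input or
    # a recognized gesture, then posts the message to the app's HWND.
    msg = WM_GESTURE if raw_event.get("recognized") else WM_TOUCH
    return {"msg": msg, "payload": raw_event}

def window_proc(message):
    # The application's window procedure branches on the message code.
    if message["msg"] == WM_GESTURE:
        return "handle gesture"
    if message["msg"] == WM_TOUCH:
        return "handle raw touch"
    return "default processing"
```

This mirrors the pipeline hardware -> driver -> operating system -> HWND: the
application only ever sees the posted message, never the raw hardware event.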
CHAPTER 3
METHODOLOGY
3.1 Hardware and software
In this project, both hardware and software are applied. The software used
includes Microsoft Visual Basic 2010 and the Windows SDK, while the hardware
involved is a touch screen monitor, a computer and a webcam.
3.1.1 Touch screen monitor
The touch screen monitor used in this project is the Acer T231H 23-inch
multi-touch monitor shown in Figure 3.1. The specifications of the Acer
T231H [13] are given in Table 3.1.
Windows XP and Vista users will not be able to make full use of the
touch-sensitive features. Windows 7 Starter and Home Basic can only
recognize single-touch actions, while Windows 7 Ultimate, Enterprise,
Professional and Home Premium are able to support multi-touch [13]. Once the
monitor is connected to the computer, it is detected automatically.
After designing the GUI, Microsoft Visual Basic 2010, the latest version of
Visual Basic at the time of writing, is used to program the mouse click
operations. Microsoft Visual Basic 2010 is a full-fledged object-oriented
programming language, so it has caught up with other OOP languages such as
C++, Java and C# [19]. Next, the Windows SDK, which integrates automatically
with Visual Basic, is installed to program the touch screen so that it
supports multi-touch and other functions, such as swipe, drag, tap and
flick, by modifying the program previously written for mouse clicks.
3.3 Graphical User Interface
[Figure 3.3: structure of the main window, with branches for cashier
password verification, calculator, play video, slide show, price
calculation, print receipt, record notes, play sound, safety function, edit
image, web browser, and manager username and password settings]
Figure 3.3 shows the Graphical User Interface of this project. This program
can be divided into six sub-modules, which are Sale, Plot Histogram,
Multimedia, Edit Database and Item Details, Username and Password Setting,
and Applications.
The products to be purchased by the customers are captured by a webcam and
recognized. After a product has been recognized, its information is taken
from the database and displayed automatically on the touch screen monitor.
In order to make a sale, cashiers must log in with their correct username
and password. The program calculates the total price of the products and
automatically prints the receipt for the customers. There is also a safety
function in this program, which is used to block the windows quickly with a
specific multi-touch gesture during a robbery. This ensures that a robber
cannot open the drawer when robbing a supermarket.
3.4.1 Multi-touch Gestures
This section describes the steps for using multi-touch gestures. Before
starting to program multi-touch gestures, the Windows SDK, which integrates
automatically with Microsoft Visual Basic, must be installed. Figure 3.4
shows the typical steps performed when using Windows Touch gestures.
Set up a window for receiving gestures -> Handle the gesture messages ->
Interpret the gesture messages
Figure 3.4 Steps for using multi-touch gestures
The first step for using multi-touch gestures is to set up a window for
receiving gestures. By default, WM_GESTURE messages are received. The next
step is to handle the gesture messages, which is the same as handling
Windows Touch input messages. Since Win32 is used, the WM_GESTURE message
can be checked in the window procedure.
The final step is to interpret the gesture messages. Windows Touch gestures
can be interpreted by handling the WM_GESTURE message in the window
procedure of an application. After handling this message, the GESTUREINFO
structure, which describes the gesture, is retrieved. The GESTUREINFO
structure holds information about the gesture, such as the type of gesture
and the location where the gesture was performed [14]. The GetGestureInfo
function is used to interpret a gesture message into a structure describing
the gesture: the GESTUREINFO structure is retrieved by passing the handle of
the gesture information structure to the GetGestureInfo function [12]. The
complete source code for programming multi-touch gestures is shown in
Appendix A.
Table 3.2 shows the gesture identifiers used in this project, while Figure
3.5 shows the flowchart for programming the multi-touch gestures. Whenever
the user touches the screen with his or her fingers, touch events are
generated and passed to the operating system. An event interpreter inside
the operating system interprets the events and translates them into
messages, which are then passed to the window procedure. When the window
procedure receives a message, it checks the gesture ID of the message. For
example, if the gesture ID is GID_TWOFINGERTAP, the two-finger tap gesture
is performed. Otherwise, the window procedure determines whether the message
is another type of gesture, such as pan, zoom, press and tap, or rotate. If
the message is none of these gestures, it is returned to the window.
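The branching on the gesture ID described above can be sketched as a lookup
table. The GID_* values are the real gesture identifiers from the Windows
SDK headers (winuser.h); the dispatch table and function are only an
illustration of the window-procedure logic, not actual Win32 code.

```python
# Real GID_* gesture identifiers from the Windows SDK (winuser.h)
GID_ZOOM, GID_PAN, GID_ROTATE = 3, 4, 5
GID_TWOFINGERTAP, GID_PRESSANDTAP = 6, 7

HANDLERS = {
    GID_ZOOM: "zoom",
    GID_PAN: "pan",
    GID_ROTATE: "rotate",
    GID_TWOFINGERTAP: "two-finger tap",
    GID_PRESSANDTAP: "press and tap",
}

def dispatch_gesture(gesture_id):
    # Mirrors the flowchart: check the gesture ID of the message and
    # perform the matching action; otherwise return to the window.
    return HANDLERS.get(gesture_id, "return to window")
```

For example, `dispatch_gesture(GID_TWOFINGERTAP)` selects the two-finger
tap action, matching the first branch of the flowchart.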
3.4.2 Plot histogram of the image
Figure 3.6 Process of plotting the histogram of an image
3.4.3 Create database using Microsoft Access
This section discusses how to create a database using Microsoft Access. All
the sale information is recorded in Microsoft Access, as shown in Appendix
B. The source code for connecting to the Access database is shown in Figure
3.8. The technology used to interact with a database or data source is
called ADO.NET, where ADO stands for ActiveX Data Objects [15].
To create a database, a new data source is first added to the project. After
that, the database code is written in Visual Basic. The OLE DB connection
object is required, since an Access database is used in this project [15].
OLE stands for Object Linking and Embedding. The OLE DB data provider used
in this project is called "Jet".
The next step is setting a connection string. Two things are passed to the
new connection object: the technology used to connect to the database, and
where the database is. Referring to the source code in Figure 3.8, the
provider technology used for the connection in this project is Jet, and the
name of the Access file connected to is database.mdb. After setting the
connection string, the database can be opened. Once the work is done, the
connection has to be closed again.
A Data Set is used to store all the information from the database so that
the information can be manipulated. The Data Adapter contacts the Connection
Object and then executes a query that has been set up. The results of that
query are stored in the Data Set. Structured Query Language (SQL) is a way
to query and write to databases, not only Access [15]. The Data Set can be
filled by the Data Adapter with records from a table called tblcontact. The
data from the database is accessed for use in the Sale Menu, Edit Database
Menu and Item Details Menu.
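As an illustration of the connect, query, fill and close flow described
above, the following Python sketch uses the standard sqlite3 module as a
stand-in, since the Jet OLE DB provider and Access files are
Windows-specific. The table name tblcontact comes from the text, while the
columns and values are invented for the example.

```python
import sqlite3

# sqlite3 stands in here for the Jet OLE DB provider and database.mdb;
# only the connect -> query -> fill -> close flow is being illustrated.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE tblcontact (name TEXT, price REAL)")
conn.execute("INSERT INTO tblcontact VALUES ('Green Apple', 1.20)")

# The Data Adapter's role: execute the query and fill an in-memory
# data set with the resulting records.
dataset = conn.execute("SELECT name, price FROM tblcontact").fetchall()
conn.close()  # the connection is closed once the data set is filled
```

The key design point carried over from ADO.NET is that the connection is
held open only long enough to fill the data set; the application then works
with the in-memory records.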
3.4.4 Webcam interface
[Figure 3.10: hardware (webcam) -> program -> GDI]
Figure 3.10 shows the process of capturing the image which will be displayed
on the screen of the monitor. A Device Context (DC) is used to describe the
attributes of text and images that are output to the screen. The actual context is
maintained by GDI. A handle to the Device Context (HDC) is obtained before
output is written and then released after elements have been written [9].
3.4.5 Calculator
This section discusses how to program the calculator that has been designed
in this project. Figure 3.11 shows the flowchart of programming calculator. When
the window form of calculator is displayed, the program will initialize.
The program checks whether a button has been clicked. If a button has been
clicked, the program then checks whether the button is a number. If so, the
number is displayed in the text box and stored in an array; otherwise the
program checks whether an operator such as addition, subtraction,
multiplication, division, power or square has been activated. If so, the
program further checks which operator is activated and stores the operator
in an array; otherwise the backspace, clear or off function is performed.
Finally, the program computes the result when the equals button is clicked.
Once the result has been computed, it is written to the text box provided,
and the program either waits for the computation to continue or waits for
the user to start a new computation by entering an operand, which clears the
screen and discards the previous result. Otherwise, the user can continue
the computation by entering an operator (+, -, x, /, pwr, or sqr) followed
by operands and operators after the printed result. As with a normal
calculator, the user can clear the screen and cancel a computation at any
point with a click of the assigned clear button.
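The operator step described above can be sketched as follows. The operator
spellings (+, -, x, /, pwr, sqr) follow the text; the function itself is an
illustrative Python sketch, not the project's Visual Basic code.

```python
def calculate(a, op, b=None):
    """Apply one calculator operator from the flowchart.

    'pwr' raises a to the power b; 'sqr' squares a and ignores b.
    """
    ops = {
        "+": lambda: a + b,
        "-": lambda: a - b,
        "x": lambda: a * b,
        "/": lambda: a / b,
        "pwr": lambda: a ** b,
        "sqr": lambda: a * a,
    }
    return ops[op]()

print(calculate(2, "pwr", 10))  # -> 1024
print(calculate(7, "sqr"))      # -> 49
```

In the actual program the pending operator and operands are held in arrays
until the equals button triggers this computation.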
3.4.6 Cashier or manager login
Figure 3.12 shows the flowchart of the login process. First, the username
and password are obtained from a text file that contains the username and
password of the manager or cashier; a text file is used so that the username
and password can be stored after they are changed or updated. Next, the
program checks whether any text has been entered in the text box. If not, a
notification that no text was entered is shown on the screen; otherwise the
program checks whether the username and password entered match those
obtained from the text file. If they match, the login is successful and a
new window is displayed; otherwise an error notification is shown.
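The login check described above can be sketched in Python. The single-line
"username,password" layout of the credential text file is an assumption made
for this sketch; the thesis does not specify the file format.

```python
def login(entered_user, entered_pw, credential_file):
    # Empty text boxes trigger the "no text entered" notification
    # before the file is even consulted.
    if not entered_user or not entered_pw:
        return "no text entered"
    # Assumed file layout: one line, "username,password".
    with open(credential_file) as f:
        stored_user, stored_pw = f.readline().strip().split(",")
    if (entered_user, entered_pw) == (stored_user, stored_pw):
        return "login successful"
    return "error: wrong username or password"
```

The returned strings stand in for the notifications and window changes
described in the flowchart.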
3.4.7 Change username or password
Figure 3.13 shows the flowchart for changing the username or password. The
username or password is obtained from a text file. The program checks
whether any text has been entered in the text box. If there is text, the
program checks whether the current username or password entered matches the
one in the text file, and then checks whether the new username or password
entered matches the confirmation entry. If both conditions are true, the new
username or password is written to the text file; otherwise an error
notification is displayed.
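The update flow described above can be sketched similarly. As before, the
"username,password" file layout is an assumed format used for illustration
only.

```python
def change_password(current, new, confirm, credential_file):
    # Assumed stored layout: one line, "username,password".
    with open(credential_file) as f:
        user, stored_pw = f.read().strip().split(",")
    # First check: the current password must match the stored one.
    if current != stored_pw:
        return "error: current password incorrect"
    # Second check: the new entry must match its confirmation.
    if new != confirm:
        return "error: new passwords do not match"
    # Both conditions hold, so the file is updated.
    with open(credential_file, "w") as f:
        f.write(f"{user},{new}")
    return "password changed"
```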
CHAPTER 4
4.1 Introduction
This chapter presents the results of the project. Many windows have been
designed and programmed. The GUI of this project was designed based on the
Iteration Design Guide for Touch Screen Applications [16]. How to use the
program and the functions of each window are presented in this chapter.
Table 4.1 lists the touch gestures that have been programmed and their
actions. All the gestures are supported in Windows 7. There are five
multi-touch gestures: pan, press and tap, zoom, rotate and two-finger tap.
Table 4.2 presents the synthesized equivalent messages and where each
gesture has been applied in this project.
Figure 4.1 shows the Login Menu of this project. To use the program, the
manager must log in with the correct username and password. If the login is
successful, the Main Menu shown in Figure 4.2 appears; otherwise a message
box asking the user to re-enter the username and password is displayed. The
user can tap two fingers on the screen of this menu at the same time to view
the help, a window that shows the user how to perform the multi-touch
gestures. Lastly, the user can apply the press and tap gesture to close this
window.
The Main Menu of this project is shown in Figure 4.2. The program can be
divided into six sub-modules: Sale, Plot Histogram, Setting, Edit Database
and Item Details, Multimedia and Applications. The user has to select an
option to continue by clicking its icon in the Main Menu. To close the
window, perform the press and tap gesture.
Figure 4.3 shows the Cashier Registration Menu. In order to make a sale,
cashiers must register themselves by entering the cashier name, cashier
identification number and password in the text boxes provided. The cashier
name and cashier identification number entered are passed to the Sale Menu
once the registration is successful. Apart from this, the cashiers can pan
right to change their username or password if necessary. To close this
window, apply the press and tap multi-touch gesture.
4.6 Sale
Figure 4.4 shows the Sale Menu that has been designed for the cashier.
Suppose a customer buys a Green Apple: the cashier selects Green Apple from
the list box, and the name, price, quantity, discount and amount of the
Green Apple are displayed in the data grid view, as shown in Figure 4.4. The
total price of the products bought by the customer is calculated and
displayed in the text box named Total Sale by clicking the Total button.
Next, the cashier enters the amount paid by the customer and clicks the
Payment button. The amount that the cashier should return to the customer is
calculated and displayed in the text box named Return. The sale is then
complete, and the cashier prints the receipt by clicking the Print Receipt
button or applying the two-finger tap gesture. The printed receipt is shown
in Figure 4.5. Lastly, the cashier clicks the Next Customer button to clear
all values and start a new sale.
If a robbery occurs, the cashier can perform the rotate multi-touch gesture,
using one finger to pivot around another, to block the windows. All the
windows are hidden and an emergency sound is played until the manager logs
in again to return to the Sale Menu. The Safety Window is shown in Figure
4.6. This is for safety purposes, so the robbers are unable to interact with
the windows to take the money.
Figure 4.7 shows the Plot Histogram Menu. To plot the histogram, the cashier
first clicks the Auto button in the Sale Menu. A webcam is then used to
capture an image of the product. The captured image is sent to the Plot
Histogram Menu and the histogram of the product is plotted automatically.
The plotted histogram is used to recognize the captured product. For
instance, if the image captured is a Red Apple, the histogram is plotted and
the product is recognized as a Red Apple. Finally, the information of the
Red Apple is displayed automatically, as shown in Figure 4.7.
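The thesis does not give the recognition algorithm itself, so the following
Python sketch shows one plausible reading of histogram-based recognition:
compute a coarse intensity histogram of the captured image and pick the
stored product whose histogram is closest. The bin count, distance measure
and reference data are all assumptions made for this sketch.

```python
def histogram(pixels, bins=4, max_value=255):
    # Count pixel intensities into a few coarse bins.
    counts = [0] * bins
    step = (max_value + 1) / bins
    for p in pixels:
        counts[min(int(p / step), bins - 1)] += 1
    return counts

def recognize(pixels, references):
    # Pick the reference product whose stored histogram is closest
    # (smallest sum of absolute bin differences) to the captured one.
    h = histogram(pixels)
    return min(references, key=lambda name: sum(
        abs(a - b) for a, b in zip(h, references[name])))

# Hypothetical reference histograms for two products.
products = {"Red Apple": [0, 0, 2, 2], "Green Apple": [2, 2, 0, 0]}
print(recognize([200, 250, 180, 210], products))  # -> Red Apple
```

A real implementation would histogram colour channels of the full webcam
frame, but the matching principle is the same.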
Figure 4.8 shows the Database Menu, which is used to display, add, update
and delete records. The information of each product can be displayed by
clicking the navigation buttons at the bottom of the menu. When a new item
arrives in the supermarket, a new record should be added by clicking the Add
New button. All the text boxes are cleared, the information of the new
product is entered in the text boxes provided, and the Commit button is
clicked. A message box then informs the user that a new record has been
added to the database.
The information of the products in the database can be updated from time to
time by entering the latest information in the text boxes provided and then
clicking the Update button. A message box informs the user that the
information has been updated. The user can click the Delete button to delete
a record from the database. This menu is also linked directly to the Item
Details Menu, so the user only has to pan right with a finger to go to the
Item Details Menu.
Figure 4.9 shows the Item Details Menu. This menu is used to view the
details of a product, such as its image, price, barcode and description, by
selecting the product from the list box on the left of the menu. The manager
and cashier can click the image in the Item Details Menu to go to the Image
Preview Menu shown in Figure 4.10 to view the image. They can put two
fingers on the screen and move them toward or apart from each other to zoom
in and out of the image. Apart from this, the user can make a quick drag to
the right to go directly to the Database Menu.
51
There are three menus for changing the username and password: the
Change Manager Username Menu, Change Manager Password Menu and Change
Cashier Password Menu, shown in Figure 4.11, Figure 4.12 and Figure 4.13
respectively. When managers or cashiers want to change their username or
password, they enter the current username or password, enter the new one twice,
and then click the change username or password button. If the change is
successful, a message box will be shown to tell them that the username or
password has been changed. Besides that, they can pan right or pan left to go to
another desired window.
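The validation behind these change buttons can be summarised as: check the current credential, confirm that the two new entries match, then store the new value. A simplified sketch follows; the text box names and the SavePassword helper are hypothetical, not the actual controls of this project.

```vb
' Illustrative validation sketch; txtCurrent, txtNew, txtConfirm and
' SavePassword are assumed names, not the project's actual controls.
Private Function TryChangePassword(currentStored As String) As Boolean
    If txtCurrent.Text <> currentStored Then
        MessageBox.Show("Current password is incorrect.")
        Return False
    End If
    If txtNew.Text <> txtConfirm.Text Then
        MessageBox.Show("The two new passwords do not match.")
        Return False
    End If
    SavePassword(txtNew.Text)   ' hypothetical helper that updates storage
    MessageBox.Show("The password has been changed.")
    Return True
End Function
```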
4.11 Multimedia
Under multimedia, there are three menus: the Play Sound Menu, Play
Video Menu and Slide Show Menu, shown in Figure 4.14, Figure 4.15 and
Figure 4.16 respectively. The Play Sound Menu is normally used by cashiers to
announce the price to visually impaired customers. The amount to be paid is
played aloud by selecting the correct price from the list box. If the amount to be
paid by the customer is not in the list, the cashier can press a finger on the screen
and hold it until the blue ring appears, then release the finger to browse for and
insert the desired media file. Once a media file is playing, the cashier can adjust
the volume by moving the track bar on the right of the menu. Apart from this, the
menu contains three control buttons: Play, Stop and Pause.
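One common way to implement such playback in Visual Basic is the Windows Media Player ActiveX control (AxWMPLib). The sketch below shows how the Play/Stop/Pause buttons and the volume track bar might map onto that control; the control name Player and the track bar name VolumeBar are assumptions, not necessarily the names used in this project.

```vb
' Sketch assuming an AxWindowsMediaPlayer control named Player
' and a TrackBar named VolumeBar (range 0-100).
Private Sub PlayButton_Click(sender As Object, e As EventArgs) _
        Handles PlayButton.Click
    Player.Ctlcontrols.play()
End Sub

Private Sub PauseButton_Click(sender As Object, e As EventArgs) _
        Handles PauseButton.Click
    Player.Ctlcontrols.pause()
End Sub

Private Sub StopButton_Click(sender As Object, e As EventArgs) _
        Handles StopButton.Click
    Player.Ctlcontrols.stop()
End Sub

Private Sub VolumeBar_Scroll(sender As Object, e As EventArgs) _
        Handles VolumeBar.Scroll
    Player.settings.volume = VolumeBar.Value   ' 0 (mute) to 100 (full)
End Sub
```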
4.12 Applications
Figure 4.17 shows the calculator that has been designed for the cashier to
calculate the total price manually and to double-check the total when a customer is
unsatisfied with the amount calculated in the Sale Menu. This calculator operates
like a normal calculator and performs addition, subtraction, multiplication,
division, power and square functions. The panning-with-inertia gesture has been
applied in this menu: the user can pan right, left, up or down to go to the
Image Preview Menu, Record Notes Menu, Edit Image Menu or Web Browser
Menu respectively.
Figure 4.18 shows the Record Notes Menu, which has been designed for the
manager or cashier to record notes. They only have to use one finger to write
the notes on the screen, and then apply the press and hold gesture to save the notes
as an image on the computer.
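Saving the handwritten notes can be done by rendering the drawing surface into a bitmap and writing it to disk. A minimal sketch follows, assuming the notes are drawn on a Panel named NotesPanel; the control name and the output file path are illustrative, not taken from the project.

```vb
' Sketch: capture the current contents of the drawing panel and save
' them as a PNG file. NotesPanel and the file path are assumptions.
Private Sub SaveNotes()
    Using bmp As New Bitmap(NotesPanel.Width, NotesPanel.Height)
        ' Render the panel (including the handwritten strokes) into the bitmap.
        NotesPanel.DrawToBitmap(bmp, NotesPanel.ClientRectangle)
        bmp.Save("notes.png", System.Drawing.Imaging.ImageFormat.Png)
    End Using
    MessageBox.Show("The notes have been saved as an image.")
End Sub
```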
The Edit Image Menu is shown in Figure 4.19. This menu is used to edit an
image captured from the webcam, if necessary, before storing it in the database. It
can be used to zoom, rotate, flip and crop the image. Figure 4.20 shows the Web
Browser Menu, which is used to go online and search for information. Its functions
are the same as those of a normal Internet browser such as Internet Explorer or
Mozilla Firefox.
CHAPTER 5
5.1 Conclusion
Besides that, Microsoft Visual Basic 2010 and the Windows SDK are suitable
for Windows programming of a touch screen monitor with a rich graphical interface.
Lastly, a webcam can be interfaced using Visual Basic to capture images for
recognition of the products. This is very useful for the cashier, since the information
of a product is displayed on the screen automatically once the product is
recognized.
The complete GUI for a cashier system has been designed in this project.
However, it still cannot be used by a cashier because the complete cashier system
has not yet been formed. In future, this project must be combined with the image
processing part, such as barcode, colour, shape and texture recognition of the
products, to form a complete system.
The multi-touch gestures programmed in this project only support up to two
fingers. In future, it is recommended to program more multi-touch functions which
can support more than two fingers at the same time. Two cashiers could then share
one touch screen monitor during sales, saving space and cost.
REFERENCES
16. Gerd Waloszek (December 2000). Interaction Design Guide for Touch Screen
Applications. http://www.sapdesignguild.org/resources/tsdesigngl/index.htm
(retrieved 28 August 2010).
17. Alan Andrew Vazquez (1990). Touch Screen Use on Flight Simulator
Instructor/Operator Stations. Master's Thesis, Naval Postgraduate School,
Monterey, California.
19. Paul Deitel and Harvey Deitel (2010). Visual Basic 2010 How to Program,
5th Edition. Prentice Hall.
20. Rod Stephens (2008). Visual Basic 2008 Programmer's Reference. Wiley
Publishing.
APPENDIX A
Imports System.Security.Permissions
Imports System.Runtime.InteropServices
' Reconstructed opening of the structure: the cbSize, dwFlags, dwID and
' hwndTarget fields follow the Windows GESTUREINFO definition.
<StructLayout(LayoutKind.Sequential)>
Private Structure GESTUREINFO
Public cbSize As Integer
Public dwFlags As Integer
Public dwID As Integer
Public hwndTarget As IntPtr
<MarshalAs(UnmanagedType.Struct)>
Friend ptsLocation As POINTS
Public dwInstanceID As Integer
Public dwSequenceID As Integer
Public ullArguments As Int64
Public cbExtraArgs As Integer
End Structure
<DllImport("user32")>
Private Shared Function SetGestureConfig(ByVal hWnd As IntPtr, ByVal dwReserved As Integer, ByVal cIDs As Integer, ByRef pGestureConfig As GESTURECONFIG, ByVal cbSize As Integer) As <MarshalAs(UnmanagedType.Bool)> Boolean
End Function
<DllImport("user32")>
Private Shared Function GetGestureInfo(ByVal hGestureInfo As IntPtr, ByRef pGestureInfo As GESTUREINFO) As <MarshalAs(UnmanagedType.Bool)> Boolean
End Function
<SecurityPermission(SecurityAction.Demand)>
Private Sub SetupStructSizes()
_gestureConfigSize = Marshal.SizeOf(New GESTURECONFIG())
_gestureInfoSize = Marshal.SizeOf(New GESTUREINFO())
End Sub
<PermissionSet(SecurityAction.Demand, Name:="FullTrust")>
Protected Overrides Sub WndProc(ByRef m As Message)
Dim handled As Boolean
' Reconstructed: dispatch on the window message ID.
Select Case m.Msg
Case WM_GESTURE
handled = DecodeGesture(m)
Case Else
handled = False
End Select
MyBase.WndProc(m)
If handled Then
Try
m.Result = New IntPtr(1)
Catch excep As Exception
Debug.Print("Could not allocate result ptr")
Debug.Print(excep.ToString())
End Try
End If
End Sub
' Reconstructed declaration of the gesture decoding routine.
Private Function DecodeGesture(ByRef m As Message) As Boolean
Dim gi As GESTUREINFO
Try
gi = New GESTUREINFO()
Catch excep As Exception
Debug.Print("Could not allocate resources to decode gesture")
Debug.Print(excep.ToString())
Return False
End Try
gi.cbSize = _gestureInfoSize
' Reconstructed: retrieve the gesture details, then dispatch on the gesture ID.
GetGestureInfo(m.LParam, gi)
Select Case gi.dwID
Case GID_TWOFINGERTAP
Receipt.Show()
Invalidate()
Case GID_ZOOM
Select Case gi.dwFlags
Case GF_BEGIN
iArguments = CInt(Fix(gi.ullArguments And
ULL_ARGUMENTS_BIT_MASK))
first_point.X = gi.ptsLocation.x
first_point.Y = gi.ptsLocation.y
first_point = PointToClient(first_point)
Case Else
second_point.X = gi.ptsLocation.x
second_point.Y = gi.ptsLocation.y
second_point = PointToClient(second_point)
Invalidate()
End Select
Case GID_PAN
Select Case gi.dwFlags
Case GF_BEGIN
first_point.X = gi.ptsLocation.x
first_point.Y = gi.ptsLocation.y
first_point = PointToClient(first_point)
Case Else
second_point.X = gi.ptsLocation.x
second_point.Y = gi.ptsLocation.y
second_point = PointToClient(second_point)
Invalidate()
End Select
Case GID_PRESSANDTAP
If gi.dwFlags = GF_BEGIN Then
Invalidate()
End If
Case GID_ROTATE
Select Case gi.dwFlags
Case GF_BEGIN
iArguments = 0
Case Else
first_point.X = gi.ptsLocation.x
first_point.Y = gi.ptsLocation.y
first_point = PointToClient(first_point)
Invalidate()
End Select
End Select
Return True
End Function
APPENDIX B
ACCESS DATABASE