By
David Vassallo
A DISSERTATION
Submitted to
MASTER OF SCIENCE
in
COMPUTER INFORMATION SECURITY
December 2016
ABSTRACT
By
David Vassallo
Patient misidentification remains a serious problem that can lead to patient deaths even in modern, first-world countries. In this thesis we discuss a possible technical
solution to this problem via the use of vein patterns as a biometric for identification and RFID as
a technology to enable patient location tracking. We demonstrate that the proposed system is
capable of identifying patients with high accuracy (around 91%) and of subsequently reliably
locating patients within a medical facility. Further, we show that both patients and doctors were
very interested in using the system and found it easy to use without being intrusive.
The proposed system uses a Raspberry Pi based minicomputer to take near-infrared images of
a patient's wrist area. The infrared images expose vein structures, which are then fed into an
image-enhancing algorithm. Enhanced images are then processed by a sparse-coding algorithm
that decomposes an image into its sparse vectors. The Euclidean distance between these
sparse vectors and template vectors is then used to identify which patient is being
photographed. An RFID tag is subsequently attached to the patient, who can then be tracked via a network
of RFID antennas and readers. This information is presented to healthcare professionals via a
web-based interface.
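The comparison step described above can be illustrated with a minimal sketch: given the sparse code of a probe image, the system picks the enrolled patient whose template sparse vector lies closest in Euclidean distance. The function and variable names below are illustrative, not taken from the thesis code, and the sketch omits the rejection threshold a real system would apply to unknown subjects.

```python
import numpy as np

def identify_patient(sparse_code, templates):
    """Return the (patient_id, distance) pair for the enrolled template
    whose sparse vector is closest, in Euclidean distance, to the
    probe image's sparse code. `templates` maps patient_id to a
    template sparse vector of the same dimensionality."""
    best_id, best_dist = None, float("inf")
    for patient_id, template in templates.items():
        # Euclidean distance between probe code and stored template
        dist = np.linalg.norm(sparse_code - template)
        if dist < best_dist:
            best_id, best_dist = patient_id, dist
    return best_id, best_dist
```

In practice a maximum-distance threshold would also be needed, so that a probe far from every template is rejected rather than matched to the nearest enrolled patient.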
The thesis focuses on biometric and RFID use in a healthcare setting; however, the concepts
and techniques used can be applied in several other security applications, such as remote banking.
I hereby certify that this dissertation constitutes my own product, that where the language of others is set
forth, quotation marks so indicate, and that appropriate credit is given where I have used the language,
ideas, expressions or writings of another.
I declare that the dissertation describes original work that has not previously been presented for the award of
any other degree.
Signed,
David Vassallo
This dissertation contains material that is confidential and/or commercially sensitive. It is included here on
the understanding that this will not be revealed to any person not involved in the assessment process.
I would like to acknowledge several important people who made this dissertation possible:
Mr Ivan Bartolo, the CEO of 6PM PLC, who provided sponsorship and premises for this project and
unwavering confidence in me
Mr Brian Zarb Adami, the CTO of 6PM PLC, who provided sound advice and ideas on the implementation
Dr Lalit Garg, my supervisor at UoL, who provided invaluable advice and help with the dissertation
Mr Robert Grech, R&D engineer at 6PM PLC, who provided invaluable assistance and advice during the project
My family and friends, for putting up with my long hours and absences due to the demands of this
project
Participants of the study, for allowing me to involve them in the study often at short notice, as well
as the interest they expressed and the important feedback they gave.
TABLE OF CONTENTS
Page
ABSTRACT.............................................................................................................................................2
DECLARATION......................................................................................................................................3
ACKNOWLEDGMENTS........................................................................................................................4
LIST OF TABLES...................................................................................................................................7
LIST OF FIGURES.................................................................................................................................8
Chapter 1. Introduction...........................................................................................................................1
1.1 Problem Statement...........................................................................................................................1
1.2 Scope................................................................................................................................................3
1.3 Approach..........................................................................................................................................4
1.4 Outcome...........................................................................................................................................5
Chapter 2. Background and review of Literature....................................................................................6
2.1 Background......................................................................................................................................6
2.2 Literature Review.............................................................................................................................7
2.2.1 Patient Identification Errors.......................................................................................................7
2.2.2 Current approaches to the problem............................................................................................8
2.2.3 Biometrics................................................................................................................................12
2.2.4 Biometric Features...................................................................................................................16
2.2.5 Vein pattern use cases..............................................................................................................18
2.2.6 Biometric systems implementation in literature......................................................................20
2.2.7 RFID.........................................................................................................................................22
2.3 Theory............................................................................................................................................24
2.3.1 Theoretical Framework............................................................................................................25
2.3.1.1 Patient Wrist.......................................................................................................................25
2.3.1.2 NIR Camera - Vein Pattern Capture...................................................................................25
2.3.1.3 Image Enhancement...........................................................................................................26
2.3.1.4 Training Phase....................................................................................................................27
2.3.1.5 Template Database.............................................................................................................27
2.3.1.6 Sparse Coding Algorithm (training phase).........................................................................28
2.3.1.7 Template Sparse Representations.......................................................................................30
2.3.1.8 Sparse Coding algorithm (identification phase).................................................................31
2.3.1.9 Image Sparse representations.............................................................................................31
2.3.1.10 Comparison Algorithm.....................................................................................................31
2.3.1.11 Patient Identity.................................................................................................................33
2.3.1.12 Available RFID Tags........................................................................................................34
2.3.1.13 Track Patient Through RFID structure.............................................................................34
2.3.1.14 Monitor patient via web UI..............................................................................................34
2.4 Terms and Definitions....................................................................................................................35
Chapter 3. Analysis and Design............................................................................................................37
3.1 Introduction....................................................................................................................................37
3.2 Experimental Design......................................................................................................................38
3.3 Hardware Components...................................................................................................................40
3.3.1 Vein Pattern Capture................................................................................................................40
3.3.2 RFID Infrastructure..................................................................................................................42
3.4 Software Components....................................................................................................................44
3.4.1 High Level Architecture...........................................................................................................45
3.4.1.1 Training Phase....................................................................................................................45
3.4.1.2 Identification Phase............................................................................................................47
3.4.1.3 Image Enhancement...........................................................................................................49
3.4.1.4 Mapping biometrics to RFID.............................................................................................51
3.4.1.5 RFID Software...................................................................................................................52
3.4.1.6 RFID LLRP Listener..........................................................................................................53
3.4.1.7 Filtering and storing the RFID reads..................................................................................55
3.4.1.8 Mapping RFID information to physical location...............................................................56
3.4.1.9 The Web UI........................................................................................................................59
3.5 Qualitative Analysis: Surveys........................................................................................................61
3.5.1 End User (Patient) Survey Questions.......................................................................................62
3.5.2 Expert User (Healthcare workers) Survey Questions.............................................................65
Chapter 4. Implementation....................................................................................................................69
4.1 Introduction....................................................................................................................................69
4.2 Hardware Implementation..............................................................................................................69
4.2.1 Vein Pattern Capture................................................................................................................70
4.2.2 RFID Infrastructure..................................................................................................................76
4.3 Software Implementation...............................................................................................................79
4.3.1 Back-end Implementation........................................................................................................80
4.3.1.1 Biometric functions............................................................................................................81
4.3.1.2 Image Enhancement...........................................................................................................81
4.3.1.3 Training phase....................................................................................................................83
4.3.1.4 Identification phase............................................................................................................86
4.3.1.5 RFID functions...................................................................................................................88
4.3.1.6 Patient functions.................................................................................................................89
4.3.1.7 Render functions................................................................................................................89
4.3.2 Front-end Implementation........................................................................................................90
4.3.2.1 Administrator / Operator front-end....................................................................................90
4.3.2.2 Vein Scanner front-end.......................................................................................................93
4.4 Survey Implementation..................................................................................................................96
4.4.1 Recruitment Plan......................................................................................................................97
4.4.1.1 Recruitment plan for end users..........................................................................................97
4.4.1.2 Recruitment plan for healthcare professionals...................................................................98
4.4.2 Delivery of Questionnaires and collection of results...............................................................99
Chapter 5. Testing and Results............................................................................................................100
5.1 Introduction..................................................................................................................................100
5.2 Vein Pattern Capture System........................................................................................................100
5.2.1 Testing Method.......................................................................................................................101
5.3 Results..........................................................................................................................................105
5.3.1 Sample NIR photos................................................................................................................105
5.3.2 Accuracy Results....................................................................................................................107
5.4 RFID Infrastructure......................................................................................................................109
5.4.1 Testing Method.......................................................................................................................109
5.4.2 Results....................................................................................................................................109
5.5 User Experience and Feedback....................................................................................................110
5.5.1 Testing Method.......................................................................................................................110
5.5.2 End User Results....................................................................................................................111
5.5.3 Healthcare Professional User Results....................................................................................113
Chapter 6. Conclusions.......................................................................................................................119
6.1 Lessons Learned...........................................................................................................................119
6.2 Applications.................................................................................................................................121
6.3 Limitations...................................................................................................................................122
6.4 Recommendations & Prospects for Future Research / Work......................................................123
REFERENCES CITED........................................................................................................................125
APPENDICES......................................................................................................................................134
Appendix A. DS Proposal......................................................................................................................134
Appendix B. User Interface Screenshots...............................................................................................145
Appendix C. Code Listing.....................................................................................................................149
LIST OF TABLES
Page
Table 1: Summary of the strengths and weaknesses of each approach with respect to patient identifica-
tion...........................................................................................................................................................10
Table 2: Summary of strengths and weaknesses in different biometric approaches................................14
Table 3: Summary of Vein Pattern use cases...........................................................................................19
Table 4: Summary of comparison algorithms..........................................................................................32
Table 5: Terms and Definitions................................................................................................................35
Table 6: Summary of experimental designs for technology validation (Zelkowitz and Wallace, 1998,
p.5)...........................................................................................................................................................38
Table 7: Sample RFID to Location Mapping...........................................................................................58
Table 8: Implementation differences between Euclidean Distance and SGD classifiers.........................86
Table 9: Evolutionary Testing of Dictionary Learning Algorithm.........................................................103
Table 10: Required hardware, provider and associated costs................................................................141
Table 11: High Level Timetable, Milestones & Tasks...........................................................................141
Table 12: Project Risk Assessment........................................................................................................142
LIST OF FIGURES
Page
Figure 1: BioRFID Components...............................................................................................................4
Figure 2: Percentage of wristband errors by category (Howanitz, Renner, and Walsh, 2002).................9
Figure 3: Theoretical framework............................................................................................................25
Figure 4: Sparse Coding Illustration.......................................................................................................29
Figure 5: Vein Pattern hardware block diagram......................................................................................41
Figure 6: RFID Hardware block diagram...............................................................................................42
Figure 7: Training Phase Data Flow Diagram........................................................................................45
Figure 8: Identification phase DFD........................................................................................................47
Figure 9: Image Enhancement DFD.......................................................................................................49
Figure 10: Mappings table schema.........................................................................................................51
Figure 11: RFID Software DFD..............................................................................................................52
Figure 12: Decision flowchart for filtering tag reads..............................................................................56
Figure 13: Example RFID reader and antennas placement in a medical clinic......................................57
Figure 14: Web UI Storyboard................................................................................................................59
Figure 15: End User / Patient Questionnaire..........................................................................................64
Figure 16: Medical Professional Questionnaire - Part 1.........................................................................67
Figure 17: Medical Professional Questionnaire - Part 2.........................................................................68
Figure 18: Top-view of the vein capture prototype.................................................................................71
Figure 19: Raspberry Pi and supporting circuitry mounted on the underneath of the arc......................72
Figure 20: Small HDMI screen mounted on the Raspberry Pi, which can be used to provide visual feed-
back to the users.......................................................................................................................................73
Figure 21: Labelled setup of the NIR vein scanner................................................................................74
Figure 22: The passive RFID tags used in this project...........................................................................76
Figure 23: RFID Antennas......................................................................................................................77
Figure 24: RFID Readers........................................................................................................................78
Figure 25: Software implementation block diagram...............................................................................79
Figure 26: Vein pattern image enhancement algorithm implementation, showing original image (top
left) and the final enhanced image (bottom right)....................................................................................82
Figure 27: The administrator front-end...................................................................................................90
Figure 28: Operator front-end.................................................................................................................92
Figure 29: User Function Menu..............................................................................................................93
Figure 30: "Identify" functionality results..............................................................................................94
Figure 31: Set of figure showing the data set sample images................................................................106
Figure 32: Average Accuracy on data sets............................................................................................107
Figure 33: End-user reaction to "Did the system feel intrusive?".........................................................111
Figure 34: End-user reaction to "Was it easy to understand how to use the system?".........................111
Figure 35: End-user reaction to "How long did it take to use the system?".........................................112
Figure 36: Healthcare professionals survey results to describe their role.............................................113
Figure 37: Healthcare professionals survey results to rate the system ease of use, from 1 (very difficult)
to 5 (very easy).......................................................................................................................................114
Figure 38: Healthcare professionals survey results to rate the system disruption, from 1 (not disruptive)
to 5 (very disruptive)..............................................................................................................................114
Figure 39: Healthcare professionals survey results to rate difficulty of identifying a patient, before the
system was used, from 1 (difficult) to 5 (easy)......................................................................................115
Figure 40: Healthcare professionals survey results to rate difficulty of identifying a patient, after the
system was used, from 1 (difficult) to 5 (easy)......................................................................................115
Figure 41: Healthcare professionals survey results to rate difficulty of locating a patient, before the
system was used, from 1 (difficult) to 5 (easy)......................................................................................116
Figure 42: Healthcare professionals survey results to rate difficulty of locating a patient, after the sys-
tem was used, from 1 (difficult) to 5 (easy)...........................................................................................116
Figure 43: Healthcare professionals survey results to rate the beneficial impact of the system, from 1
(no impact) to 5 (large impact)...............................................................................................................117
Figure 44: BioRFID Sections................................................................................................................137
Figure 45: Login page with role selection............................................................................................145
Figure 46: Administrator > RFID Readers Settings Page.....................................................................145
Figure 47: Administrator > Map Locations Settings Page....................................................................146
Figure 48: Administrator > Sparse Dictionary Settings Page...............................................................146
Figure 49: Administrator > Enrollment > Patient Profiles....................................................................147
Figure 50: Administrator > Enrollment > Patient Biometrics...............................................................147
Figure 51: Operator > Audit Screen......................................................................................................148
Figure 52: Operator > Last Screen Screen............................................................................................148
CHAPTER 1. INTRODUCTION
Patient misidentification is a problem in the world's hospitals. Wrong-patient and related
issues account for about 4% of medical errors in the US alone (Rosenthal, 2003) and
cost the UK's NHS £466 million a year (NHS England, 2014). While there has been
much research into tracking patients using technologies such as Radio Frequency Identification
[RFID], the majority of patient misidentification occurs during the patient identification step itself;
the National Patient Safety Agency cited this problem as a significant risk in the NHS
(Thomas & Evans, 2004). The proposed solution aims to help alleviate the problem of
patient misidentification.
Hypothesis 1: The vein pattern biometric significantly increases the ease and accuracy
of patient identification.
This project attempts to verify the above two hypotheses and build a combined identification and
tracking system, including both hardware and software components. Current solutions
deal with each problem separately (Lahtela, Hassinen, and Jylha, 2008)
(Probst et al, 2016). RFID tracking systems are quite mature and well established, especially
in the retail sector. Biometrics is also maturing quickly, especially with the introduction
of fingerprint, voice and face recognition in smartphones. However,
the two fields have not yet been explored in conjunction. Solutions based solely on
RFID can still misidentify the patient and cannot guarantee the presence of a patient. On the
other hand, solutions based solely on biometrics provide identification but not location
tracking. In addition, the previously mentioned biometric modalities (fingerprint, voice and
face recognition) are not particularly suited to a hospital environment, since many patients
may have physical or mental conditions that render such biometrics ineffective.
The proposed solution uses vein biometrics to overcome these problems, in conjunction
with an RFID infrastructure for location tracking.
1.2 Scope
The scope of this project is to produce a working prototype, including both hardware and
software, as the proof of concept system. The prototype will need to demonstrate:
Identifying a patient using biometrics with a high degree of confidence. This includes:
Coding the software required for the image processing and data mining techniques
to match the captured vein patterns with known patterns. In addition,
the proof of concept will include a Web portal to show results to an operator.
Writing code to enable the Web portal operator to enroll a patient (i.e.
register the patient's vein pattern template in the system).
Subsequently tracking the assigned ID number using an RFID system. This includes:
Writing software that prompts the patient to scan their vein patterns, and confirms a
successful identification.
1.3 Approach
The actual proof of concept solution consists of two broad categories of tasks: those relating
to the hardware of the proposed solution, and those relating to the software. Each
of these categories can be further subdivided into RFID and biometric components, as
shown in Figure 1.
Hardware (Biometrics): A near-infrared camera rig that takes pictures of a patient's wrist.
Software (RFID): Code that receives and parses RFID data from the RFID readers.
Software (Biometrics): Image-enhancing code to extract vein patterns, and machine
learning algorithms to identify which individual a vein pattern belongs to.
When evaluating the proof of concept system, quantitative methods based on statistics
were used to test the accuracy of the vein pattern matching algorithms. Statistical
methods such as leave-one-out cross-validation were used to determine the accuracy
of the proposed vein pattern biometric system. A total of 33 participants volunteered their
vein patterns and had their wrists scanned by the proof of concept system. In addition,
we used qualitative methods, issuing questionnaires to both end users (patients)
and expert users (healthcare workers), to evaluate whether the system helps reduce
identification errors, is easy to use, and helps in day-to-day tasks. The questionnaires were
distributed online, and the anonymous answers were then statistically analyzed at a
95% confidence level. Throughout the project, a Lessons Learned experimental design
was followed, in which we iterated over results to continuously improve the proof of concept.
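Leave-one-out cross-validation, as used here, can be sketched as follows: each enrolled sample is held out in turn, the classifier is given the remaining samples as its reference set, and accuracy is the fraction of held-out samples classified correctly. The `classify` callback interface below is an assumption for illustration, not the thesis's actual code.

```python
def loocv_accuracy(samples, labels, classify):
    """Estimate classifier accuracy by leave-one-out cross-validation.

    `classify(probe, ref_samples, ref_labels)` is an assumed interface:
    it returns the predicted label for `probe` given the reference set.
    """
    correct = 0
    for i in range(len(samples)):
        # Hold out sample i; train/reference on everything else.
        ref_samples = samples[:i] + samples[i + 1:]
        ref_labels = labels[:i] + labels[i + 1:]
        if classify(samples[i], ref_samples, ref_labels) == labels[i]:
            correct += 1
    return correct / len(samples)
```

With only a handful of wrist scans per participant, leave-one-out makes the most of the data: every scan serves as a test case exactly once while the rest form the reference set.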
1.4 Outcome
This project demonstrates that vein pattern biometrics can be used to identify patients
with up to 91% accuracy using very affordable off-the-shelf components. The resulting
proof of concept system can successfully identify the patient and track their location us-
ing an RFID network. In addition, we demonstrate that the majority of participants who
used the system felt very comfortable doing so; finding it non-intrusive to use and very
easy to understand the system. Healthcare professionals surveyed found the system
easy to use and non-intrusive. They also believed that the system would make it easier to correctly identify and locate patients.
The following chapters will proceed as follows. We will present a comprehensive litera-
ture review of biometrics and tracking technologies as well as review related work in
Chapter 2. Chapters 3 and 4 deal with the analysis, design, and implementation of the
system. In Chapter 5 we present the results of the proof of concept system, and we conclude.
CHAPTER 2. BACKGROUND AND REVIEW OF LITERATURE
2.1 Background
The core issue being tackled by this dissertation is increasing patient safety in healthcare envi-
ronments. A recent paper by Makary and Daniel (2016) indicates that medical errors are now
the 3rd leading cause of death in the United States [US]. Other parts of the world report
facing the same problem. For example, the National Patient Safety Agency identified this issue
as a significant risk in the NHS (Thomas & Evans, 2004). "Medical errors" is quite a generic term; this dissertation focuses on patient misidentification, a significant contributor to medical errors. In a 2012 survey conducted by the College of Healthcare Information Management Executives [CHIME] in the US, 20% of respondents could attribute at least one adverse medical event to patient identification or matching mistakes (Probst and Branzell, 2016). An executive brief from the highly respected ECRI Institute - which deals with patient safety - lists patient misidentification in second place in its Top 10 Patient Safety Concerns. Despite this, such events remain under-reported, with medical literature not properly discussing protocols and procedures for patient identification.
2.2 Literature Review
As we just saw, patient misidentification has been classified as a serious problem in today's healthcare environments. What are the real-world implications of patient misidentification, and how can technology help address it?
Medical literature routinely exposes the need for substantial changes in the delivery of healthcare. Medical errors result in at least 44,000 unnecessary deaths each year in the United
States, with the most vulnerable patients such as the old or chronically ill bearing the brunt of
these errors (Weingart, et al, 2000). In the UK, around 5% of patients admitted annually experi-
enced some kind of medical error, which in turn has a measurable economic impact - costing
around 1 billion in extra bed days (Murphy and Kay, 2004). While medical errors take place in
many aspects of healthcare, such as diagnostic and surgical procedures, adverse drug reac-
tions and laboratory tests - accurate and efficient patient identification is a critical aspect in all
of these procedures. In blood transfusion in particular, patient misidentification can have catastrophic effects: it is the single largest contributing factor to mistransfusion, and it occurs frequently enough that the risk of mistransfusion is now much greater than that of HIV transmission through blood, with the identification process actually getting worse over time (Murphy and Kay, 2004). Murphy and Kay
[...]
The (conscious) patient is not asked to state their name (and date of birth) and these are not checked against the same details on the wristband and other written documentation
Staff rely on self-identification by the patient
Worldwide, problems like the above mean that millions of people are not afforded the most basic checks during even routine medical procedures like blood transfusion, with the problem becoming more pronounced as time goes by and the healthcare infrastructure becomes more overloaded.
The oldest approach to the problem of patient identification errors is the patient wristband. Very often this is a simple piece of paper fastened to the patient, with handwritten notes pertaining to the individual such as name, ID, and blood type. Handwritten wristbands soon gave way to printed information, often in the form of 1D or 2D barcodes. However, wristband errors remain common, as Figure 2 shows.
Figure 2: Percentage of wristband errors by category (Howanitz, Renner, and Walsh, 2002)
It is interesting to note that a missing wristband is the leading cause of error. This is a fundamental problem of any system that relies on a possession for identification, including systems based on barcode, RFID, NFC, or Bluetooth tokens. It is for this reason that biometrics are very appealing for solving the identification problem: they rely on a physical characteristic of the person rather than on a possession.
In spite of this, barcodes remain a very popular means of patient identification, and they introduce real
benefits. For example, staff typically find barcode identification systems easy to operate, and prefer them to standard procedures. Barcode systems also encourage following a standard procedure, reduce transcription errors, and can automatically record a user's actions
(Murphy and Kay, 2004). However, barcodes require line-of-sight to be scanned properly and hence may not be appropriate in some clinical environments, or for locating a patient in an unknown location.
RFID on the other hand does not require line of sight, and forms part of a class of solutions that
instead rely on radio frequency, such as NFC or Bluetooth. Lahtela, Hassinen, and Jylha (2008) investigate the use of both RFID and NFC (Near Field Communication) in healthcare. As the name implies, NFC is suited for very short range communication and tracking (typically in the range of a few centimeters). This limitation makes the technology unsuitable for the purposes of
this project which focuses on locating the patient inside a hospital or ward, and hence requires
longer ranges (ideally of several meters). Another interesting technology that researchers have
used is Bluetooth Low Energy [BLE]. BLE can be used to build indoor location services with
an improved accuracy over the traditional WiFi technologies (Faragher and Harle, 2014). We
have tested both RFID and BLE technologies (Vassallo, 2016a; Vassallo, 2016b); both seem to be very similar in terms of being capable of locating a patient in a hospital, with some changes to the hardware setup. However, RFID is significantly cheaper than BLE when tracking large numbers of patients: RFID tags cost a few cents, while BLE beacons currently cost a few tens of dollars on average. Therefore, for the purposes of this project we will concentrate on
RFID technology.
RFID technology still does not address the problem of missing wristbands. To address this
particular problem, most approaches today rely on biometrics, with healthcare being second
only to the financial industry in the adoption of biometric identification systems (Mordini and Ottolini, 2007). In addition, while patient misidentification can occur at any stage of the healthcare
process, proper patient identification begins with proper patient registration (Schulmeister,
2008), which essentially means correctly identifying a patient. Indeed there are several com-
mercial biometric offerings which focus specifically on patient identification.
Table 1: Summary of the strengths and weaknesses of each approach with respect to patient identification
Barcodes (Murphy and Kay, 2004)
Advantages: extremely cheap tags and infrastructure; easy to set up and use; no battery required
Disadvantages: runs the risk of being misplaced; requires line of sight

RFID (Lahtela, Hassinen, and Jylha, 2008)
Advantages: easy to use and set up; mature technology; cheap tags; no battery required; supports long ranges
Disadvantages: runs the risk of being misplaced; expensive infrastructure

NFC (Lahtela, Hassinen, and Jylha, 2008)
Advantages: cheap tags; smartphone compatible; easy to use
Disadvantages: very short range

BLE
Advantages: smartphone compatible
Disadvantages: relatively new technology; privacy and security concerns
Table 1 highlights the strengths and weaknesses of different approaches to patient identification
discussed throughout this section. In this project, we propose the use of vein pattern biometrics
and RFID technology to register patients and subsequently identify and track their physical location, combining the best aspects of the two technologies. The following sections explore why
these techniques and technologies were chosen, as well as alternatives that may be used in their place.
2.2.3 Biometrics
The term "biometrics" is derived from the Greek words bio (meaning life) and metric (meaning to measure). Biometric identification relies on measurable physical or behavioural characteristics, and it has become a very popular means of establishing personal identity, especially
with the proliferation of smartphones. The primary advantage of using biometrics as identifica-
tion over other methods such as smartcards (something you have) or passwords (something
you know) is that biometrics cannot be forgotten or misplaced; they are, in essence, something that you are (Jain and Jain, 2002). Because of this, biometrics are highly suited to identification
in sensitive or high security areas. For example, biometric identification is very well suited to al -
leviate insider fraud or friendly fraud problems - where impersonators may have legitimate
access passwords or ID cards (such as family members of a patient) or where a user denies
that a legitimate action has been taken (Kahn and Roberds, 2008).
Ideally, to qualify as a valid identification parameter, a biometric trait should have the following
Universality - everybody should have the characteristic being measured
Uniqueness - while everyone should have the characteristic, each instance of the characteristic should be sufficiently different to distinguish one individual from another
Acceptability - refers to the extent people are willing to use and accept measurement of the characteristic. This is often influenced by factors such as the invasiveness of the collection process
In light of the above, we can now start to judge the applicability of various possible biometric
traits to the healthcare problem domain. Not every accepted biometric is suitable for a given
problem. For example, let us consider three of the most popular biometric traits currently seeing
wide adoption: fingerprints, facial recognition, and voice recognition. These biometrics are cer-
tainly applicable to everyday use by consumers, such as in the banking sector (Fatima, 2011),
however they may not be suitable for a healthcare environment such as the Accident & Emergency [A&E] ward of a hospital. In such a case, it is quite common to have patients who may
have been involved in an incident where their face or fingers have been altered (e.g. a car acci-
dent or beating), have skin conditions making fingerprints unreadable, or even be unconscious
or incoherent hence making voice recognition very difficult. Especially when considering finger-
prints, spoofing attack techniques using electronic ink or gummy fingers are becoming preva-
lent (Galbally-Herrero et al, 2006) - considering healthcare fraud is a billion dollar worldwide
problem, the ease of fingerprint spoofing attacks calls into question the efficacy of the biometric
in the healthcare environment. Table 2 summarizes the advantages and disadvantages of different biometric approaches.
Table 2: Summary of strengths and weaknesses in different biometric approaches
(Jain and Jain, 2002; Fatima, 2011)
Accuracy: High
Advantages: can be extremely accurate
Disadvantages: expensive; very short range

Vein Patterns (Fatima, 2011)
Accuracy: High
Advantages: very difficult to spoof; internal body feature, hence resistant to external obscuring factors; liveness check is inherent
Disadvantages: better accuracy requires modified camera; possibility of false positives / negatives

Heartbeat Signals from facial video (Nasrollahi, K., Haque, M.A., Irani, R. and Moeslund, T.B., 2016)
Accuracy: Low
Advantages: contactless, hence very unintrusive
Disadvantages: heartbeat signals read in this manner lack the distinctiveness capability required to use them as biometrics

ECG (Lugovaya, 2005)
Advantages: internal body feature, hence resistant to external obscuring factors
Disadvantages: takes a relatively long time to obtain a biometric sample (about 20-30 seconds)
For these reasons, this dissertation will focus on the use of subcutaneous vein patterns as biometric features.
2.2.4 Biometric Features
Universality
This is straightforward since every living person has a vascular system, hence this biometric feature can be applied to all humans. In practice, however, certain individuals may not have hands or feet, or may have vein patterns which are difficult to read due to fatty tissue. Regardless, research indicates that vein recognition systems can be used on 99.9% of the population (Wilson, 2011), which is especially significant when compared to fingerprint systems, which fail for a larger proportion of users.
Uniqueness
While there is no statistical model to quantitatively prove that vein patterns are unique to individuals, a high degree of variation in branching patterns has been demonstrated, leading to the widely held assumption that vein patterns are in fact unique. This assumption has yet to be disproved for finger veins, as well as for left and right palm veins (Nadort, 2007).
Permanence
In industry, the only reported variance in vein patterns is that due to natural human growth. However, Nadort (2007) identifies several diseases and surgery-related procedures that may cause veins to change. That being said, most of the described causes of change do not affect the area of interest of this dissertation (i.e. the wrist region), and hence would only rarely affect the proposed system.
Collectability
Vein patterns are collected by visual means. While a variety of methods can be used, such as
X-Ray or ultrasound, the invasive nature of these methods means that the two most common
methods of collecting vein patterns are Near InfraRed [NIR] and Far InfraRed [FIR]. Of the two,
Wang, Leedham, and Cho (2007) note that NIR is more effective at capturing vein patterns and
is more tolerant to environmental changes. On the other hand, the authors also note that NIR is
more susceptible to pattern corruption due to its sensitivity to surface patterns such as skin hair. NIR is absorbed by haemoglobin in the blood, therefore veins appear darker in color than the surrounding tissue.
Performance
This dissertation will be based on capturing vein patterns via NIR. NIR cameras are cheap and easy to source. Several researchers have used cheap and highly customizable embedded systems such as the Raspberry Pi computer to capture NIR vein pattern images in real time (Joardar, Chatterjee, and Rakshit, 2015).
Acceptability
In contrast to other biometric systems like fingerprint scanning, NIR-based systems do not require contact with the sensor and are hence considered quite hygienic (Wilson, 2011). Due to this hygienic nature and non-invasiveness, vein pattern biometrics promote user acceptance.
Circumvention
Because veins are an internal feature of the body and NIR imaging relies on the presence of blood, vein patterns are considered remarkably difficult to circumvent (Watanabe et al, 2005). In addition, since vein patterns are hidden beneath the skin, they are difficult to forge (Hashimoto, 2006), especially when compared to fingerprint systems, which have recently come under attack.
2.2.5 Vein pattern use cases
Vein patterns have a variety of use cases, mainly within identification. Recently, vein patterns have been used for forensic purposes: in 2011, the FBI identified a terrorist who had committed several crimes against the United States as the beheader of a journalist, by matching the veins in his hand with those visible in a video of the beheading (Farmer, 2011).
The main use cases for vein patterns remain in the realm of biometrics. The finance industry,
which invests significantly in preventing identity fraud, has already begun using vein patterns to
identify individuals. As far back as 2004, ATMs in Japan began using palm vein patterns to au -
thenticate users (Kallender, 2004). More recently, Europe has followed suit, with ATMs in some European countries adopting similar vein-based authentication.
Vein pattern technology is not just present in specialist hardware installations such as ATMs.
Again, the finance industry has examples of consumer-grade vein pattern recognition technol-
ogy which banks use to authenticate Internet banking customers (Gompertz, 2014). Just as
fingerprints did a few years ago, vein patterns are seeing a number of initiatives placing them
into more consumer products. There are efforts to integrate vein pattern capturing technology
into smartphones (Yalavarthy, Nundy, and Sanyal, 2009), and major electronics vendors like
Samsung are applying for patents to integrate vein pattern recognition into wearables such as smartwatches.
Table 3: Summary of Vein Pattern use cases
Table 3 lists several examples in literature where vein patterns are used for different applications.
2.2.6 Biometric systems implementation in literature
Having determined the applicability of vein patterns as a biometric feature, we turn our attention
to the implementation of the system. Several researchers have built NIR image rigs to capture
hand vein patterns based on NIR capable cameras and used a variety of techniques to extract
the actual patterns. In general, biometric systems pass through two implementation phases: en-
rollment and identification. In the first phase, a subject is enrolled into the system by providing
biometric templates which are used in the second phase of identification to match subsequent
biometric samples to the nearest matching template. With vein patterns, the process involves
capturing the image, and applying a number of image enhancement techniques to make the
vein patterns more recognizable (Aboalsamh, Alhashimi, and Mathkour, 2012). After enhancing
the images, features which constitute identifiable material from the image are extracted and
stored. For example, some researchers have used vein bifurcation points as the image feature
(Soni et al, 2010). Others have used thinned vein pattern images, i.e. images where veins
have been reduced to white lines on a black background (Gayathri, Nigel, and Prabakar, 2013).
Once a template image for an individual has been obtained, processed and stored, that individ-
ual can be subsequently identified using a variety of identification techniques. Some studies ap-
proached the identification problem using relatively simple methods, such as minimizing the Euclidean distance between the feature sets of two biometric images (Soni et al, 2010; Sahu and Bharathi, 2015). Other studies compared the two biometric images directly rather than through features, using image similarity measures to identify a subject (Badawi, 2006), similar to image convolution. However, more sophisticated approaches exist that pursue the latter strategy of
direct image comparison. In essence, the problem now becomes one of image recognition,
which is widely studied especially in the field of computer vision, so literature suggests a large
number of approaches to the problem, such as K Nearest Neighbor and support vector ma-
chines (Kim, Kim, and Savarese, 2012). For example, one popular technique used in computer
vision is neural networks, and in fact, some researchers have applied neural networks to vein
pattern recognition (Kocer, Tutumlu, and Allahverdi, 2012). Similarly, support vector machines
[SVM] have also been applied to vein pattern recognition with good results (Lee et al, 2010).
However, the latter two techniques have a limitation when applied to the healthcare environment: the difficulty they have scaling efficiently. More advanced machine learning algorithms such as neural networks or SVMs need to be retrained whenever new subjects are enrolled in the system. In a healthcare environment which deals with thousands of patients and has a very high patient turnover, this would cripple the performance of the system. For this reason, the application of sparse representation-based classification [SRC], also known as sparse coding, to the vein pattern recognition problem is of particular interest (Joardar, Chatterjee, and Rakshit, 2015). The concept of sparse coding is further explored in section 2.3 Theory. In a nutshell, sparse coding does not require retraining the whole model when new subjects are enrolled.
2.2.7 RFID
Having reviewed the biometric aspect of the project, we now review what happens after a sub-
ject has been identified. In reality, any action can be taken after identification, such as authenti -
cating a patient into an online service, or allowing them access to a restricted area. In the case
of this project, our main aim is to reduce medical errors, therefore we focus on ensuring the
right patient is in the right place at the right time, avoiding misdiagnosis or wrong medication.
RFID (Radio Frequency Identification) is a mature and well-established technology that has al-
Access control.
Inventory Management.
Healthcare.
RFID systems consist of RF-emitting antennas and RFID tags. Tags can be either passive or
active in design, the difference being the former are powered from captured RF energy while
the latter are battery powered (Want, 2006). When powered, the tags emit a unique number
which serves as an identification to whatever item or person the tag is attached to. Hence for
the purposes of this project, it is possible to positively identify a patient using biometrics and
subsequently bind that identity to a unique RFID tag that is issued to the patient. Most industry
RFID systems are based in the Ultra High Frequency [UHF] range, at a frequency of 2.4GHz.
With respect to healthcare applications, Yao, Chu and Li (2010) outline the following 5 issues that RFID can help address:
Medical mistakes
Increased costs
Theft loss
Drug counterfeiting
Inefficient workflows
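The identification-then-tracking flow used in this project can be illustrated with a minimal sketch. The tag IDs, patient identifiers, and antenna labels below are hypothetical, and a production system would of course persist this state and secure access to it:

```python
# Minimal sketch of binding a biometrically-verified identity to an RFID tag
# and tracking the patient's last known location from antenna reads.

bindings = {}   # tag ID -> patient identity (set after biometric enrollment)
locations = {}  # patient identity -> last antenna that read the tag

def bind_tag(tag_id, patient):
    """Associate an RFID tag with a patient identified via vein patterns."""
    bindings[tag_id] = patient

def on_tag_read(tag_id, antenna):
    """Called whenever a reader reports a tag; updates the patient's location."""
    patient = bindings.get(tag_id)
    if patient is not None:
        locations[patient] = antenna

bind_tag("E200-0017", "patient-042")
on_tag_read("E200-0017", "Ward 3 / Antenna B")
print(locations["patient-042"])  # Ward 3 / Antenna B
```

The web-based interface described in the abstract would essentially render the `locations` mapping for health-care professionals.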
Yao, Chu and Li (2010) note that RFID can be used to tackle the above issues by applying the technology to tracking, identification, and verification. Of particular interest is the authors' mention that privacy is one of the main obstacles to more widespread adoption of the technology - an issue this project attempts to address.
2.3 Theory
In order to reduce medical error due to patient misidentification, this project poses two hypotheses:
Hypothesis 1: Vein pattern biometrics significantly increase the ease and accuracy of patient identification.
Hypothesis 2: Biometric systems can be successfully integrated with existing RFID solutions to track patients, providing an end-to-end identification and tracking platform for healthcare environments.
In the previous section we have seen why vein pattern biometrics are a better choice than other
biometric features in a healthcare environment, as well as explored RFID and alternative tech-
nologies to locate a patient once they have been registered in the system. In this section we explore the theory and high-level components behind a system that integrates vein pattern biometrics and RFID in a way that addresses some of the privacy concerns, relating to both RFID and biometrics, that usually curb the adoption of these technologies. Based on the discussion in the Literature Review, we identify a theoretical framework for the system, shown in Figure 3.
Figure 3: Theoretical framework
We decided to use the patient's wrist as the region of interest [ROI] from which NIR images will be acquired. In practice, various parts of the hand can be used to provide the biometric feature, such as the palm, wrist, and back of the hand, all with positive results (Wang, Leedham, and Cho, 2007).
As previously mentioned, NIR will be used to capture vein patterns. A NIR image of the vein
patterns is captured in a very similar way to a normal photograph. In order to make images as
similar as possible and avoid scale, skew or rotational problems as much as possible, a physi-
cal guide is built to guide patients on how to place their hands to get the best possible image
of their vein patterns. We drew ideas from previous research, such as the image capture rig used by Joardar, Chatterjee, and Rakshit (2015).
2.3.1.3 Image Enhancement
The image enhancement stage is extremely important since in this stage we ensure the veins are made more prominent by removing noise and increasing contrast. In order to do this, we apply a sequence of morphological and filtering operations.
Image erosion and dilation are a pair of mathematical morphology functions (Haralick, Sternberg, and Zhuang, 1987) that together increase the contrast of an image. Erosion shrinks bright areas of the image while enlarging dark areas, removing small bright spots (known as salt in digital image processing) and connecting small dark regions; it therefore tends to increase the relative area of dark gaps between brighter areas. Dilation performs the opposite operation, restoring the scale of the remaining structures. With respect to vein patterns, since NIR is absorbed by blood, the dark gaps equate to the veins themselves.
The resulting image is still of relatively low contrast, therefore the next image processing step is image equalization, which enhances the contrast in images by spreading out (i.e. making lighter or darker) the most commonly used intensities. This allows smaller features in the image to be more easily discernable. In particular, the adaptive histogram variation of this algorithm is used. In this variation, the algorithm calculates the equalization using histograms computed over tiles of the image, rather than over the entire image. Local details can therefore be better enhanced even in image regions which are darker or lighter than the image average. There are a number of variations of the adaptive histogram technique itself, but in theory the contrast limited adaptive histogram equalization [CLAHE] technique should produce images in which the noise content of an image [...] is not excessively enhanced, but in which sufficient contrast enhancement is provided for the visualization of structures within the image.
The last image pre-processing step is further noise reduction. To this end, we employ a me-
dian filter to the image which in effect smoothes out the image, reducing noise while retaining
edges (Weiss, 2006) which is extremely important for vein pattern recognition.
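The three neighbourhood operations above (erosion, dilation, and median filtering; CLAHE is omitted for brevity) can be sketched in pure Python on a toy grayscale grid. This is an illustration of the operations themselves, not the proof of concept implementation, which would typically rely on an image processing library:

```python
def _filter3x3(img, reducer):
    """Apply a 3x3 neighbourhood reducer (min, max, median, ...) to a
    grayscale image given as a list of lists; edges are clamped."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            win = [img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                   for dy in (-1, 0, 1) for dx in (-1, 0, 1)]
            out[y][x] = reducer(win)
    return out

def erode(img):  return _filter3x3(img, min)   # shrinks bright areas
def dilate(img): return _filter3x3(img, max)   # grows bright areas
def median(img): return _filter3x3(img, lambda w: sorted(w)[4])  # denoises

# A single bright "salt" pixel in a dark field is removed by erosion.
img = [[0, 0, 0],
       [0, 255, 0],
       [0, 0, 0]]
print(erode(img))  # all zeros: the isolated bright spot is gone
```

Chaining `erode`, `dilate`, and `median` over the NIR image mirrors the pipeline described above, leaving the darker vein structures more prominent against the surrounding tissue.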
Every biometric system typically passes through two phases: a training or enrollment phase,
followed by the identification phase. In the enrollment phase, subjects provide samples of their
biometric features to the system so they can be subsequently used in the identification phase.
In this particular case, the biometric features are the image enhanced vein patterns which are
then stored as templates. The templates should ideally be taken in a variety of angles and
light conditions to as closely as possible match conditions that may arise in the identification
phase. In the identification phase, the subject's vein pattern is matched against all stored templates.
The template database is a secure storage where vein patterns gathered during enrollment are
stored. It is important that the template database is only accessed by authorized personnel,
since an attacker could insert unauthorized templates or modify the templates of existing personnel.
However, it is worthwhile noting that due to privacy reasons, it is not advisable to store or even
use the actual vein patterns once the enrollment phase is over. Instead, we should use a trans-
formation of these templates (Prabhakar, Pankanti, and Jain, 2003). Since humans have a very
limited set of biometrics (for example, we only have two wrists), a biometric compromise may
be difficult to recover from. If an attacker got hold of vein patterns which they could use to impersonate a subject, then that subject would have to re-enroll again, assuming their other wrist's vein patterns have not already been compromised. Using a transformation of the vein patterns makes it easier to change compromised credentials, since the system would only need to
change the transformation being used, not the actual biometric data. This same concept of us-
ing transformations also enhances privacy. If biometric templates are used to identify a subject
directly, different service providers who utilize biometric identification can be leveraged to build
a picture of a subject's activities by correlating biometric identification activity across different service providers, much in the same way that third-party cookies in a web browser can build a complete picture of a subject's browsing habits (Roesner, Kohno, and Wetherall, 2012). By using transformations, privacy is enhanced since different service providers use different transforms to describe the same individual, hence protecting his or her identity. In the sections below, we outline how the stored biometric templates are transformed in order to enhance privacy and security.
In section 2.2.6 the concept of sparse coding was mentioned. Sparse coding attempts to describe a large vector of inputs by a weighted sum of a number of basis functions. When applied to images, sparse coding approximates the behaviour of neurons in the brain's visual cortex (Lee et al, 2006). During the training phase, given a set of images that (ideally) completely
describe or approximate any subsequent input images, sparse coding will build a dictionary of
these images by decomposing them into their basis functions - or features which make up an
image. Hence sparse coding represents images more efficiently than simple pixels. To better
understand this process, we present an oversimplified example which illustrates the algorithm
(Ng, 2010):
Figure 4: Sparse Coding Illustration
In figure 4, the natural images on the left are decomposed into the base functions that are dis-
played in the matrix on the right. Any subsequent test image can be expressed as a function of
the learned bases. The test example in the bottom left is a small region of the test image. In fig-
ure 4, this example can be expressed as a combination of three learned bases. Hence the entire region can be written as:
x ≈ (0.8 × b36) + (0.3 × b42) + (0.5 × b63)
where bi denotes the i-th learned basis. Therefore, the whole test region has been converted to a very succinct representation: only three basis indices and their weights need to be stored.
With the above example in mind, we now apply the foundations of sparse coding to the biomet-
ric system. The template database is used to build the learned bases in figure 4, also referred
to as the dictionary. There are a number of methods for learning the dictionary, however one
of the most scalable and performant options is to use online batch dictionary learning (Mairal et
al, 2009). Online learning methods can update their learned dictionaries incrementally when
new templates are provided, rather than having to learn the entire dictionary from scratch.
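The decomposition step can be illustrated with a greedy matching pursuit over a toy, hand-fixed dictionary. Note that the four-element "atoms" below are stand-ins for learned image bases; a real system would learn the dictionary from the templates (e.g. via online batch dictionary learning) rather than fix it by hand:

```python
# Illustrative sketch of expressing a signal as a sparse weighted sum of
# dictionary atoms via greedy matching pursuit.

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def matching_pursuit(signal, atoms, n_nonzero=2):
    """Greedily pick the atom most correlated with the residual and record
    its weight; returns a sparse {atom_index: weight} code."""
    residual = list(signal)
    code = {}
    for _ in range(n_nonzero):
        best = max(range(len(atoms)), key=lambda i: abs(dot(residual, atoms[i])))
        w = dot(residual, atoms[best])  # atoms assumed unit-norm
        code[best] = code.get(best, 0.0) + w
        residual = [r - w * a for r, a in zip(residual, atoms[best])]
    return code

atoms = [(1, 0, 0, 0), (0, 1, 0, 0), (0, 0, 1, 0), (0, 0, 0, 1)]  # unit-norm
code = matching_pursuit((0.8, 0.0, 0.3, 0.0), atoms)
print(code)  # {0: 0.8, 2: 0.3} - only two of the four atoms are needed
```

The resulting `{index: weight}` mapping plays the same role as the `x ≈ (0.8 × b36) + (0.3 × b42) + (0.5 × b63)` example from the previous section: a succinct, sparse stand-in for the full image.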
2.3.1.7 Template Sparse Representations
The template sparse representation database is the result of passing each biometric template
through the sparse coding algorithm using the learned dictionary. Each template image will be
reduced to its sparse representation, similar to the equation presented in figure 4. Of particular
note is that whenever the batch dictionary learning algorithm is run from scratch, a new dictio-
nary is built, even if the exact same templates are provided as learning material. If the dictionary changes, then so do the sparse representations of the images. Looking at figure 4, we intuitively understand that if the learned bases change, then so too must the resulting formula describing the test example image region. This is the basis of the system's privacy and security features:
Strictly speaking, the system no longer needs the actual vein pattern images, only their sparse representations. The template database collected during enrollment therefore does not need to be accessed during day-to-day operation. This feature can be used to address some of the security and privacy concerns of users.
Since the same template images generate a different dictionary each time the learning algorithm is run from scratch, different biometric providers will have different sparse representations for the same users, increasing their privacy.
If the template sparse representations are compromised, recovery does not require a new set of biometric vein patterns. All that is required is retraining the dictionary, which regenerates the sparse representations. While this is a lengthy process, it is still much preferable to requiring users to submit new sets of wrist vein patterns. Note that it is imperative that the actual template database (as opposed to the template sparse representation database) is not compromised, since we would then require a completely new set of biometric vein patterns.
2.3.1.8 Sparse Coding algorithm (identification phase)
During the identification phase, a subject that needs to be identified presents their wrist to capture their vein patterns. This biometric sample is then passed through the image enhancement techniques and the sparse coding algorithm to obtain its sparse representation. The dictionary learned during the enrollment phase is used for this decomposition.
The resulting image sparse representation is now used as the subject's vein pattern signature. The vein pattern signature is hence reduced to a two-dimensional sparse matrix.
Once the system has obtained the sparse representation of the subject's vein patterns, it proceeds to make a one-to-many comparison of the sparse representation against all stored template sparse representations. This is potentially the most time consuming aspect of the identification process, so a good choice of comparison algorithm is essential. The most intuitive comparison algorithm is to use a distance function which computes the distance between the subject's sparse representation and each template's representation. The identity chosen is the individual whose template produces the minimum distance. Two popular distance functions are Euclidean distance and Cosine Angle distance, both of which have similar performance characteristics (Qian et al, 2004). The advantage of these simple distance functions is that they are easily parallelizable and hence can be distributed over a large number of computers or servers.
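A minimal sketch of this matching step (illustrative names and vectors, not the project's code); in the real system each patient contributes several templates and the minimum distance is taken over all of them:

```python
# Minimum-Euclidean-distance identification over stored sparse codes.
import math

def euclidean(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def identify(subject_code, templates):
    """templates: dict mapping patient ID -> template sparse code (a vector).
    Returns the patient ID whose template is nearest to subject_code."""
    return min(templates, key=lambda pid: euclidean(subject_code, templates[pid]))

templates = {
    "patient-001": [0.0, 0.9, 0.0, 0.4],
    "patient-002": [0.7, 0.0, 0.2, 0.0],
}
probe = [0.1, 0.8, 0.0, 0.5]       # sparse code of the wrist image being identified
print(identify(probe, templates))  # patient-001
```

Each distance computation is independent of the others, which is why the search parallelizes trivially across machines.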
Another option for the comparison algorithm is to treat the problem as a classification problem, where each set of sparse codes representing an individual's vein pattern templates can be placed into a single class, where a class is an individual. Classification problems are well studied in artificial intelligence research, and the reduction of biometric templates into sparse codes makes the problem similar to text classification (another sparse feature problem), which is also widely studied. Hence we have a number of options to explore, notably algorithms such as Support Vector Machines, k-nearest neighbor, and ridge regression (Yang, Zhang, and Kisiel, 2003). However, as suggested in previous sections, plain vanilla implementations (i.e. without modifications) of these algorithms suffer from scalability issues, since they are usually not easy to train incrementally or online. Therefore, when a new individual is enrolled into the system, the comparison algorithm would need to be retrained, which is not feasible for this project. In fact, when scaling to larger datasets, practical machine learning libraries suggest using incremental learning classifiers such as Linear Stochastic Gradient Descent (Scikit Learn, 2014). Table 4 summarizes the comparison algorithms along with relevant citations in literature.
Neural Networks
  Advantages: Large number of libraries implementing well-researched neural network architectures (e.g. Google's TensorFlow)
  Disadvantages: Complex to implement; requires very precise tuning

Nearest Neighbor
  Advantages: Easy to use, conceptualize and implement
  Disadvantages: Difficult to implement online / incremental learning, therefore new enrollments require retraining; does not scale well to thousands of data points
  Citation: Yang, Zhang, and Kisiel (2003)
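As noted above, classifiers that support incremental training avoid a full retrain on every enrollment. The project itself investigates scikit-learn's SGDClassifier for this role; the following pure-Python stand-in (a nearest-centroid classifier with a running per-class mean, all names illustrative) only demonstrates the incremental property itself:

```python
# A nearest-centroid classifier whose per-class mean is updated one sample at
# a time, so enrolling a new patient never requires retraining on old data.

class IncrementalCentroid:
    def __init__(self):
        self.sums = {}    # class label -> per-dimension running sum
        self.counts = {}  # class label -> number of samples seen

    def partial_fit(self, code, label):
        s = self.sums.setdefault(label, [0.0] * len(code))
        for i, v in enumerate(code):
            s[i] += v
        self.counts[label] = self.counts.get(label, 0) + 1

    def predict(self, code):
        def sq_dist(label):
            n = self.counts[label]
            return sum((code[i] - self.sums[label][i] / n) ** 2
                       for i in range(len(code)))
        return min(self.sums, key=sq_dist)

clf = IncrementalCentroid()
clf.partial_fit([1.0, 0.0], "patient-001")
clf.partial_fit([0.0, 1.0], "patient-002")
# Enrolling a third patient is just another partial_fit call:
clf.partial_fit([1.0, 1.0], "patient-003")
print(clf.predict([0.9, 0.1]))  # patient-001
```

The same call pattern (repeated `partial_fit` followed by `predict`) is what scikit-learn's incremental classifiers expose.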
The comparison algorithm will output the nearest guess of who the presented vein patterns belong to. The system will now have identified the patient and, if required, this identification can be verified.
2.3.1.12 Available RFID Tags
Once the patient identity is verified, the carer assigns an RFID tag from available stock to the patient, and inputs the unique RFID serial number into the system. At this stage, the system equates the patient identity with a particular RFID serial number, and a mapping between RFID and patient identity is created. In other words, the sparse code representing the patient's vein pattern now needs to be mapped to an RFID serial number. It is worth noting that for the purposes of this project a one-to-one mapping between sparse code and RFID serial number is enforced, though the design could accommodate a one-to-many mapping between sparse code and RFID serial numbers, that is, a single patient being assigned multiple RFID serial numbers. Furthermore, for security and privacy reasons it may be desirable that the mapping server does not store the actual mappings themselves, but rather a transformation function that returns the correct RFID serial number only if the correct patient sparse code representation is presented.
The patient is given the RFID tag to keep on their person. This is trivial since RFID tags come in many shapes and sizes, ranging from patient wristbands to battery assisted tags that can be attached to wheeled beds. The RFID tag is tracked through RFID antennas placed at strategic positions around the facility (such as at ward entrances and exits), and the system registers which antennas have last picked up the RFID tag. The system will mark the patient's location as the antenna which most recently detected the RFID tag with the highest signal strength.
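The location rule just described can be sketched as follows (field and antenna names are illustrative): take the most recent read of the patient's tag, breaking ties between simultaneous reads by the highest RSSI.

```python
# Pick the patient's current location: latest read wins, ties broken by RSSI.

def current_location(reads):
    """reads: list of dicts with 'timestamp', 'antenna', 'rssi' for one tag."""
    best = max(reads, key=lambda r: (r["timestamp"], r["rssi"]))
    return best["antenna"]

reads = [
    {"timestamp": 100, "antenna": "ward-1-entrance", "rssi": -60},
    {"timestamp": 105, "antenna": "corridor-A",      "rssi": -70},
    {"timestamp": 105, "antenna": "ward-2-entrance", "rssi": -55},
]
print(current_location(reads))  # ward-2-entrance
```

Note that RSSI values are negative (dBm), so "highest signal strength" means the value closest to zero.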
The interaction between the operator and the system all happens through a web portal, not necessitating any special software other than a normal browser. The above information is presented to the operator through this portal.
2.4 Terms and Definitions
Table 5 provides a glossary of some of the terms used throughout this dissertation.
Region Of Interest: An area in the image that is kept for further processing. The rest of the image is cropped out, saving both processing and storage.
Radio Frequency Identification [RFID]: Refers to a small chip that responds to radio frequencies (typically at 13.56MHz or 960MHz) with a unique identifier.
In the next chapter we will introduce the experimental design of the system. We outline the
analysis and design decisions taken with respect to both hardware and software used in the
system.
CHAPTER 3. ANALYSIS AND DESIGN
3.1 Introduction
This chapter will tackle the experimental design of the project, where we will outline the design in terms of established experimental models. We then proceed to discuss the philosophy behind the design and implementation of the project, the objectives to be met by the design, and why particular design choices were made. The majority of this chapter will deal with two aspects of the project. The first is the technical aspect regarding the analysis and design involved in building the proof of concept IT artefact. The second aspect is designing the questionnaires that will support the qualitative part of the project, where we query real-life users on whether the system helps in their healthcare visits and whether it reduces identification errors.
In the first sections we will analyse the Lessons Learned experimental model used to validate the hypothesis posited in this project. As explained in section 3.2, the Lessons Learned model (Zelkowitz and Wallace, 1998) allows us to examine qualitative data from completed projects. In the case of this project, the completed project is the working proof of concept that is designed and developed as described throughout this dissertation, while the qualitative data is obtained from the questionnaires presented in section 3.5. Subsequent sections will analyse the high level architecture [HLA] of the hardware and software, by describing the individual modules and processes. The HLA is described via the use of data flow diagrams [DFD] using the notation described by Chen (2009). Once the technical aspect has been covered, we proceed to the design of the questionnaires and present a qualitative research design, including a metasummary of the findings.
3.2 Experimental Design
There are a number of experimental designs that can be used for validating technology.
Table 6: Summary of experimental designs for technology validation (Zelkowitz and Wal-
lace, 1998, p.5)
For the purposes of this project, we will be using the Lessons Learned experimental
model. Once the hardware and software components have been completed and com-
bined into a working proof of concept IT artefact, the Lessons Learned model requires us
to gather qualitative data in order to examine the efficacy of the project. This qualitative
data is gathered from the questionnaires that are presented in section 3.5. Essentially the project adheres to the single group, pre- and post-test experimental design, where a single group of users (especially healthcare workers) are asked about their experiences in patient identification before and after the system is used. The surveys are designed to compare how much of an impact the system has on reducing patient identification and location errors, while still being easy to use and applicable to the majority of use cases.
As indicated in table 6, there are two main disadvantages to this approach: the lack of
quantitative data (i.e. having to rely on subjective data) and not being able to control or
constrain all factors. As regards the former, this is an inherent problem when determining
the success of a user-facing computer system. We discuss in section 2.5 that the main
measure of success of a computer system is how well users react to the system, and
how useful they find the system. Different users in the same category of user groups have different notions of usefulness; for example, some users might give priority to design over function, while for others the opposite holds. This also contributes to the latter problem, that we are unable to constrain all factors. For example, even within the healthcare workers user group, there may be users who are more computer and technically savvy than others, which may result in more favourable results being obtained from the more technology literate individuals. In order to mitigate these risks, we first try to recruit a significant number of participants in this study, so that these variables will balance each other out, while randomly selecting individuals to make sure that answers are statistically relevant.
3.3 Hardware Components
The first stage of the technical analysis is determining which hardware components will
be used to actually build the system. As pointed out in previous chapters, the project requires hardware to support two features: the biometric vein capture and the subsequent
RFID tracking of the patient. The primary objectives of the hardware analysis phase are
to:
- Keep costs to a minimum, since this will mean lower per-unit costs to the final product
- Use off the shelf hardware where possible, since this also helps keep down costs, while keeping things simple to procure, build, and repair
- Wherever possible, use open source hardware unless this conflicts with the previous objectives
Vein pattern capture involves taking Near InfraRed [NIR] photographs of a particular region of interest [ROI]. As described in section 2.3.1.1, we intend to use the patient's wrist as the ROI, so a suitable guide is to be built to clearly show end users where to place their hand so that the NIR camera can capture the ROI properly, without scale or rotation variances between pictures. The guide can be a simple cardboard, wooden or plastic mould.
The choice of NIR camera is important as it largely affects the quality of the images taken. Options for the NIR camera range from modifying a DSLR camera (Cardinal, 2013) to using specially made infrared Raspberry Pi cameras, known as the Pi Noir camera (Raspberry Pi, n.d.), and in fact both options have been used in literature (Soni, Gupta, Rao and Gupta, 2010; Joardar, Chatterjee and Rakshit, 2015). In keeping with the objectives outlined in the previous section, we will use the Raspberry Pi computer coupled with the Pi Noir camera, since it is a much cheaper option (costing about $75) as well as being open source, off the shelf hardware. Since the Raspberry Pi is a fully-fledged mini computer that is capable of running Linux, it gives the added flexibility of performing some of the calculations, such as image enhancement, on-board rather than adding load to the central server. The Raspberry Pi can optionally also be attached to an HDMI monitor to give real-time feedback to the end user. In addition, the Raspberry Pi 3 has inbuilt WiFi and ethernet, offering flexible TCP/IP connectivity options.
The central processing server will perform most of the software tasks, such as those discussed in the following sections.
3.3.2 RFID Infrastructure
The second aspect of the hardware analysis concerns tracking an identified patient's location via RFID. There are a large number of RFID vendors on the market. For the purposes of this project, RFID equipment from Impinj (Impinj, n.d.) was chosen. While not open source - the hardware platform is proprietary - it is relatively cheap and widely used. In addition, we have had previous positive experience using Impinj equipment in other projects. Figure 6 depicts the block diagram for the RFID components:
Each RFID reader can have multiple antennas attached to it, each in a different physical location. The RFID readers chosen are Impinj Speedway Revolution general purpose readers (Impinj, n.d.), while the antennas used are those recommended by Impinj, manufactured by the vendor Laird (Laird, n.d.). Each antenna reports the RFID tag ID, along with its Relative Signal Strength Indicator [RSSI]. The RSSI can be used to determine which antenna the RFID tag is closer to, which in turn determines an approximate physical location for the tag. This information is fed into the RFID reader, which relays it via TCP/IP to the central processing server. Since the central processing server will store both the RFID tag ID and the patient identity via biometric vein patterns, it can link a patient to a particular RFID tag, and it can then parse the information from the RFID readers to present real-time patient location information to healthcare providers.
In general, RFID antennas are placed at natural chokepoints throughout the facility. Good examples of chokepoints are the entrances and exits of a ward or room, stairwells and corridors. Using at least two RFID antennas, one can also determine the direction of travel of a patient. For example, two antennas can be mounted at each end of a corridor, and depending on which antenna reads the tag first and which reads it second, the patient's direction of travel (up or down the corridor) can be easily inferred. Hence it is important to include timestamp information with every RFID tag read.
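The direction inference described above reduces to ordering the reads by timestamp; a minimal sketch (antenna names are illustrative):

```python
# Infer direction of travel along a corridor from timestamped reads of one tag.

def direction_of_travel(reads):
    """reads: list of (timestamp, antenna_id) tuples for one tag.
    Returns (from_antenna, to_antenna) based on the earliest and latest reads."""
    ordered = sorted(reads)
    return (ordered[0][1], ordered[-1][1])

reads = [(102, "corridor-north"), (98, "corridor-south")]
print(direction_of_travel(reads))  # ('corridor-south', 'corridor-north')
```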
3.4 Software Components
The software section of the project closely mirrors the structure of the hardware just pre -
sented. Just like the hardware, the software components can also be split by those com-
ponents dealing with the biometric aspect, and those dealing with the RFID aspect. The
software principles (Hunt and Thomas, 2004). The objectives of the software analysis is
The above objectives lead to the choice of Python (Van Rossum, 2007) as the programming language, since it is open source, very easy to learn, and has a vast choice of libraries and modules. At the time of writing, Python is a very popular choice among AI programmers and data scientists, meaning there is an active community, making help and quality code easier to find. For example, the Python machine learning library Scikit-learn (Pedregosa et al, 2011) and its sister library Scikit-image for image processing (Van Der Walt et al, 2014) provide many of the operations that will be required in this project.
3.4.1 High Level Architecture
In this section we review the high level design of the proposed system and its constituent modules.
The software data flows change depending on which phase the system is currently operating in. Before entering into production and identifying patients, the software must first be trained. It is in this phase that the dictionary used by the sparse coding algorithm is built.
A sample of appropriate users is selected to train the software. Ideally, the sample includes as diverse a range of users as possible. Additionally, the photos of the users' wrists should ideally be taken in the same conditions as when the system is run in production, i.e. with similar lighting conditions. These two prerequisites ensure that the training is run on a representative data set, giving more accurate results in the subsequent identification phase.
Photographs are taken of each subject's region of interest (P1), and passed through an image enhancement algorithm that helps subsequent algorithms better identify the vein structure (P2). The image enhancement algorithm will be presented in section 3.4.1.3. The enhanced images are stored in a datastore with appropriate annotation describing which image belongs to which patient (D1). These are the template images which will be used to build a sparse coding dictionary (P3) as described in section 2.3.1.6. The sparse coding dictionary is stored (D2) for subsequent use in the identification phase.
N.B.: The process described so far need only be run once (or at least very infrequently) on system setup, to obtain the sparse code dictionary. The rest of the process needs to be run for every enrolled user.
Once the sparse coding dictionary has been obtained, it is used by the sparse code algorithm to represent any template vein pattern image in terms of the dictionary, i.e. the system calculates the sparse representation of the template. Every user who is enrolled provides a set of template images. These template images (usually three per patient) are taken when the patient first comes into contact with the system, and are taken in different positions and lighting conditions to reflect as accurately as possible the conditions of photographs taken in subsequent identification stages. The template images are then converted into their sparse code representation (P4). Strictly speaking, the template images are no longer required unless the sparse code dictionary changes, which is an infrequent event as discussed in section 2.3.1.7. The system only needs regular access to the sparse code dictionary (D2) and the template sparse code representation (D3), not the annotated image template database (D1). Optionally, the sparse codes are used to train a classifier that will be used as a classification algorithm in the identification phase (P5). This step is optional since it depends on which classification algorithm yields the most accurate results, as discussed in previous sections.
Once the sparse dictionary is built, we proceed to the identification phase, as depicted in figure 8. Identification can only occur once a subject has been successfully enrolled into the system, though enrollment and identification can occur in parallel.
As can be seen in figure 8, similarly to the training phase, during identification a subject's wrist is photographed using NIR (P1) and passed through the same image enhancement algorithm (P2) as explained in section 3.4.1.3. The enhanced image is then operated on by the sparse code algorithm and decomposed into its sparse representation in terms of the sparse code dictionary (P3) obtained in the training phase (section 3.4.1.1). Again it is worth noting, from the privacy and security perspective, that from this point on the system does not require the actual biometric image, and hence it does not need to be stored.
Once the sparse code of the subject to be identified has been obtained, the system is required to match this to the known individual sparse codes obtained during enrollment. For this purpose we use a comparison algorithm (P4) that compares the subject's sparse code to all known sparse codes in order to identify the individual. The simplest way of doing this is using a distance measure and identifying the individual by selecting the template which has the least distance between the sparse codes, with the advantages and disadvantages described in section 2.3.1.10. In this case, we will use Euclidean distance as a distance measure. We also investigated supporting the linear stochastic gradient descent [SGD] classifier, which is a more advanced classification algorithm based on regression. The advantage of the SGD classifier over Euclidean distance is that it should be computationally faster, since it does not require the system to iterate over all template sparse codes. The SGD classifier also supports incremental / on-line training. In the implementation of this project we will compare the two comparison methods.
3.4.1.3 Image Enhancement
The image enhancement module is used to process the NIR images, making the vein patterns more discernible by removing noise, increasing contrast and enhancing the vein structure. The module is built using Scikit-image (Van Der Walt et al, 2014), as shown in Figure 9, along with the results of passing a sample image through each stage.
The image to be enhanced is output from the NIR camera in JPEG or PNG format. The first operation is to crop the image and convert it to grayscale (P1). This reduces the amount of data that needs to be processed in subsequent stages and ensures that redundant features like the background are not processed. Next, the image is downscaled using the local mean algorithm (P2). The local mean algorithm splits the image
into configurable blocks, each of which is replaced by a single pixel whose value is equal
to the mean of the original block. This has the effect of reducing salt and pepper image
noise as well as removing smaller features such as skin pores and small hairs. The
downscaled image is cloned and one copy of the image is dilated (P3) while the other is
eroded (P4). Image erosion and dilation are algorithms associated with image reconstruction and morphology (Scikit Image, n.d.). Image erosion shrinks bright regions and enlarges dark regions, in practice connecting dark regions of the image together. On the other hand, image dilation shrinks dark regions and enlarges bright ones. Veins absorb more NIR energy and hence appear as darker regions in the image. By subtracting the dilated image from the eroded image, we remove all the relatively large bright regions of the image, leaving veins in high contrast. The high contrast image is added back into the original image to make the veins more visible. This is what the feature reconstruction step (P5) achieves.
This enhanced image is then equalized using the adaptive histogram technique (P6). Image equalization spreads out the most frequent intensity values (Scikit Image, n.d.), resulting in images whose histograms follow a linear cumulative distribution function. There are a number of variations to histogram equalization, the one used here being Contrast Limited Adaptive Histogram Equalization [CLAHE] (Zimmerman et al, 1988), which once again splits the images into blocks. For each block, equalization operates on the local block histogram rather than on a global image histogram, resulting in local details being enhanced even in regions that are darker or lighter than the rest of the image.
Last, the equalized image is blurred using a median filter (P7). Blurring is a common image manipulation technique usually used to remove noise, in this case using a median filter (Weiss, 2006). A median filter simply replaces each pixel with the median value of its neighbors. In the case of vein patterns, median filter blurring has the effect of slightly thickening the veins themselves, making them more prominent. The resulting image is passed into the training and/or identification algorithms as described in the previous two sections.
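The core morphological step of this pipeline can be illustrated on a toy grid in pure Python. The actual implementation uses Scikit-image; this sketch covers only the erosion/dilation difference that highlights the narrow dark veins, with illustrative pixel values:

```python
# Erosion = 3x3 neighbourhood minimum (shrinks bright regions);
# dilation = 3x3 maximum (shrinks dark regions). Their difference is large
# only near narrow dark features such as veins.

def _filter(img, op):
    h, w = len(img), len(img[0])
    out = []
    for y in range(h):
        row = []
        for x in range(w):
            neigh = [img[j][i]
                     for j in range(max(0, y - 1), min(h, y + 2))
                     for i in range(max(0, x - 1), min(w, x + 2))]
            row.append(op(neigh))
        out.append(row)
    return out

def vein_highlight(img):
    eroded, dilated = _filter(img, min), _filter(img, max)
    # The dilation/erosion difference removes large flat bright regions.
    return [[d - e for d, e in zip(dr, er)]
            for dr, er in zip(dilated, eroded)]

# 5x5 bright wrist patch (value 9) crossed by one dark vein row (value 1):
img = [[9] * 5, [9] * 5, [1] * 5, [9] * 5, [9] * 5]
hl = vein_highlight(img)
print(hl[0][0], hl[1][0])  # 0 in the flat corner, 8 next to the vein
```

Flat regions produce zero response, while pixels adjacent to the dark vein row respond strongly, which is the high-contrast map the pipeline adds back onto the original image.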
3.4.1.4 Mapping biometrics to RFID
Once a patient's sparse code has been obtained and verified, the system is required to map this to a single, unique RFID tag for tracking purposes. In this project, a one-to-one relationship will be enforced between sparse code and RFID serial number, therefore a simple database table is sufficient to store the mapping information. Figure 10 represents the proposed table schema (a full database schema is shown in the code listing in Appendix C).
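A minimal sketch of such a mapping table using Python's built-in sqlite3 module (column names here are illustrative; the actual schema is given in figure 10 and Appendix C). The UNIQUE constraints are what enforce the one-to-one relationship:

```python
# One-to-one mapping between a patient's sparse code and an RFID serial number.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE patient_rfid_map (
        id          INTEGER PRIMARY KEY,
        sparse_code TEXT NOT NULL UNIQUE,  -- serialized patient sparse code
        rfid_epc    TEXT NOT NULL UNIQUE   -- RFID tag serial number (EPC)
    )
""")
conn.execute("INSERT INTO patient_rfid_map (sparse_code, rfid_epc) VALUES (?, ?)",
             ("patient-001-code", "E200-1234"))
row = conn.execute("SELECT rfid_epc FROM patient_rfid_map WHERE sparse_code = ?",
                   ("patient-001-code",)).fetchone()
print(row[0])  # E200-1234
```

Attempting to assign the same tag to a second sparse code would violate the UNIQUE constraint and be rejected by the database.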
3.4.1.5 RFID Software
At this stage, a patient has been identified, assigned an RFID tag, and mapped to that particular RFID tag serial number. The patient can now be physically tracked as they move around the premises. Figure 11 depicts the DFD of the RFID software which enables this tracking.
3.4.1.6 RFID LLRP Listener
The standard API for most RFID readers on the market today is the Low Level Reader Protocol [LLRP] (Krishna and Husak, 2007). LLRP operates over a standard TCP socket, requiring an RFID server to open communication with the reader, instructing it to send tag reports back. As stated in section 3.3.2, Impinj has been chosen as the RFID equipment supplier. Impinj also supplies a Software Development Kit [SDK] referred to as Octane SDK (Impinj, n.d.). Octane SDK provides Java APIs and leverages LLRP, but significantly simplifies programming RFID readers and receiving RFID tag reports when compared to using LLRP directly.
Therefore, the RFID server will use Octane SDK to both setup the reader (P1), and listen
for RFID tag reports over LLRP (P2). During reader setup, the RFID server instructs the
RFID reader to send back the following information for each RFID tag read: the RFID tag ID, the antenna ID on which the tag was read, the RSSI of the read, and a timestamp of when the read occurred.
The above information will be used by the RFID server to uniquely identify a physical location. Since each RFID reader can be connected to more than one antenna, it is important to not only identify the reader by its IP address, but also to include the antenna ID. Ideally, there should be no overlap between the antenna fields, so that each location can be determined unambiguously.
The LLRP listener implemented with the Octane SDK (P2) proceeds to buffer the tag reads into memory (P3). This step alleviates scaling issues: in a busy environment tag reads can amount to thousands per second, and buffering tag reads ensures that potentially slower subsequent stages do not result in dropped reads. Apart from scalability and performance, the buffer also allows us to decouple the LLRP listener from subsequent stages, which results in additional flexibility. For example, the LLRP listener that writes into the buffer can be implemented in Java as required by the Octane SDK, but subsequent stages that read from the buffer can be implemented in Python to conform with the rest of the project. As a buffer, the popular in-memory data structure store Redis is used (Redis, n.d.). Redis is an easy to use, open source program with multi-language client support.
3.4.1.7 Filtering and storing the RFID reads
A Python program reads the RFID reads from the buffer and performs basic filtering (P4). The primary functionality at this stage is to avoid huge storage costs by removing duplicate reads. A duplicate read is defined as an RFID tag read that contains the same reader IP and antenna ID as the preceding read of the same RFID tag ID. In such a situation, the same antenna is continuously picking up the same RFID tag, meaning the patient is not moving. If all these tag reads were to be stored, it would result in unnecessary storage being wasted on duplicate data. Hence we filter duplicate tag reads and only store those tag reads which indicate that the tag has changed position. This decision process is shown in figure 12. Valid tag reads are stored in a database.
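The filtering rule can be sketched as follows (field names are illustrative): a read is dropped when the same tag was last seen on the same reader IP and antenna ID, i.e. the patient has not moved.

```python
# Drop duplicate reads: keep a read only when the tag's (reader IP, antenna)
# position differs from its previous read.

def filter_duplicates(reads):
    last_seen = {}   # tag ID -> (reader_ip, antenna_id) of its previous read
    kept = []
    for r in reads:
        pos = (r["reader_ip"], r["antenna_id"])
        if last_seen.get(r["tag_id"]) != pos:   # position changed (or new tag)
            kept.append(r)
        last_seen[r["tag_id"]] = pos
    return kept

reads = [
    {"tag_id": "T1", "reader_ip": "192.168.1.1", "antenna_id": 2},
    {"tag_id": "T1", "reader_ip": "192.168.1.1", "antenna_id": 2},  # duplicate
    {"tag_id": "T1", "reader_ip": "192.168.1.1", "antenna_id": 3},  # moved
]
print(len(filter_duplicates(reads)))  # 2
```

Only the first read at a position and any read at a new position survive, which is exactly the information needed to reconstruct the patient's movement history.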
Figure 12: Decision flowchart for filtering tag reads
Once the RFID tag reads are filtered and stored, we next need to display the information in a user friendly format to the end user. The first step in achieving this is to translate or map a reader IP and antenna ID into a user-friendly location name. Figure 13 further illustrates this concept.
Figure 13: Example RFID reader and antennas placement in a medical clinic
Internally, the system only stores tuples in the form (Reader IP : Antenna ID), which is not meaningful to end users.
Taking the scenario depicted in Figure 13, the location mapping result would be as
shown in table 7.
Reader IP      Antenna ID   Location
192.168.1.1    2            Registration
192.168.1.1    3            Reception
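The mapping itself is a simple lookup, sketched below with values mirroring the example scenario (in the real system this mapping is entered via the WebUI at setup time):

```python
# (reader IP, antenna ID) -> user-friendly location name.

LOCATIONS = {
    ("192.168.1.1", 2): "Registration",
    ("192.168.1.1", 3): "Reception",
}

def location_name(reader_ip, antenna_id):
    return LOCATIONS.get((reader_ip, antenna_id), "Unknown")

print(location_name("192.168.1.1", 2))  # Registration
```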
During setup, the above information is entered into the system via a WebUI. The above design assumes that a fixed IP is given to the RFID reader, which is a reasonable assumption in most of today's networks. The WebUI will show the above information to a user, essentially displaying the current location of a tag, as well as an audit trail showing past locations.
3.4.1.9 The Web UI
The final stage of the technical design involves specifying a storyboard for how the user will interact with the system. Figure 14 specifies the storyboard for this system, where we can see the screens that will be available to the user depending on their role. The system will define two roles: administrative and regular users. Administrative users will be able to perform functions such as enrolling new patients into the system, rebuilding the sparse dictionary, and mapping RFID reader antennas to locations. Regular users will be able to identify patients, assign them an RFID tag for subsequent tracking, and view patient locations.
In keeping with the objectives and choices laid out in section 3.4, the Web UI will also be built using open source software: Python Flask will provide the server and runtime environment for the WebUI. The actual
UI will be written in standard HTML, CSS and Javascript. Writing the UI using web technologies has the advantage that the UI becomes device agnostic so long as the device
operating system has an Internet browser. This means the web UI of the project will be
accessible via desktop PCs, tablets, and smartphones without having to rewrite any
code.
3.5 Qualitative Analysis: Surveys
We now turn our attention to the design of the questionnaires that will be used in surveys of the two user groups that will be in contact with the system: patients and healthcare workers. The surveys will be the basis of the qualitative analysis that answers the fundamental question of whether the system adds any value or benefit to the current healthcare landscape. As Sandelowski (2004) points out, qualitative research is a very useful tool when directed by evidence-based practice. The evidence points towards problems in identifying patients reliably, as we saw in Chapter 2, so investigating solutions to the problem is warranted. However, the main measure of success of any IT project is how well users react to the system, how useful they find the system and how helpful the system is to them. In this respect, qualitative analysis is the best way to capture this information.
The qualitative analysis will consist of two independent surveys conducted via questionnaires. Each questionnaire should not take more than 5 minutes to answer, in order to keep inconvenience to a minimum and increase response rates. For the same reason, questionnaires will be kept simple and easy to understand, consisting almost entirely of multiple choice questions. Following the guidelines of Sandelowski, Barroso and Voils (2007) for describing our findings from the surveys, we will use metasummaries to extract results from the questionnaires, including statistics such as frequency effect sizes, which measure the effect the system has on both patients and healthcare providers. This will provide our results for the Lessons Learned experimental model introduced in section 3.2. In the following two sections we present the two questionnaires that will be used to extract information from users who will come into direct contact with the system.
3.5.1 End User (Patient) Survey Questions
The first user group surveyed comprises the end users, or patients. This user group will:
1. Have their wrist area scanned in order to extract a valid vein pattern
2. Answer an online survey designed to determine if the system is easy to use and
unintrusive.
The first item is a prerequisite for the entire system, considering that the biometric feature chosen is vein patterns. However, as discussed in Chapter 2, extracting valid vein patterns is possible for 95% of the human population, leaving very few individuals who cannot participate in the study for medical reasons. Therefore the potential pool of participants is not limited to any particular group, other than being physically able to use the system. Participants are recruited via a simple mailshot and/or via social network postings. As regards the second item, as previously discussed, the success of an IT project often depends on how easy the system is to use, while a major obstacle to using biometrics from an end user's perspective is how intrusive taking the biometric reading is. Hence the survey will ask how easy the system is to use (essentially, getting their wrist scanned) and how intrusive it felt.
The survey will take the form of multiple choice questions, asking the user specific questions, as follows:
Question 1 directly relates to whether the system felt intrusive or not, while questions 2 and 3 relate to ease of use of the system. Two important factors for ease of use are how long it takes to use the system - with the assumption that longer use times mean the system is more difficult to use - and ease of understanding. The questions are answered by selecting a number between 1 and 5, each number representing varying degrees of difficulty and/or intrusiveness, with 1 being not easy / intrusive at all, and 5 being very easy / intrusive. Figure 15 shows a screenshot of the online survey as presented to the user.
Question 4 is an optional, text-based answer that prompts the user for any general feedback they might have. This question was kept as optional so as to ensure that the user
can complete the questionnaire in as little time as possible, hopefully encouraging partic-
ipation. Users who answer this question may raise concerns that were not thought of
during the design of the system, giving a better chance at more concrete improvement of
the project.
Figure 15: End User / Patient Questionnaire.
3.5.2 Expert User (Healthcare Workers) Survey Questions
The second user group questioned are the expert users that will operate and use the system. This user group:
1. Are involved in the healthcare industry and give care to patients. Examples of such roles include:
1. Nurses
2. Admissions Staff
4. Medical Researchers
5. Pharmacists
As can be seen above, the system can be used by anyone who comes into contact with a patient, hence widening the pool of potential candidates for the study. The participants will be recruited by advertising at their place of work (e.g. hospitals and clinics). There are no special restrictions on who can participate other than having an appropriate healthcare role.
Similar to the previous section, the questions asked will be answered in a multiple choice
fashion, as can be seen in Figures 16 and 17. The questions asked to the participants
are as follows:
The first question categorizes the candidate by profession, which may be useful when extracting results in the metasummary of the findings, for example finding a covariance between industry/profession and a user's acceptance of the system. Questions 2-7 are designed to check how easy it is to use the system in its two main functions: identifying and locating a patient. Question 8 is designed to get an overall indication of how useful the participants believe the system to be, while question 9 is designed to be a catch-all for issues or feedback which were not foreseen. A preview of the questionnaire is shown in Figures 16 and 17.
Figure 16: Medical Professional Questionnaire - Part 1
Figure 17: Medical Professional Questionnaire - Part 2
In the next chapter, we will present the implementation of the system according to the design outlined in this chapter.
CHAPTER 4. IMPLEMENTATION
4.1 Introduction
In this chapter we will outline the implementation of the design described in Chapter 3.
We discuss how we executed the designs and present real-life photos of the prototype
being built and being used, both from a hardware and software perspective. We also
highlight any deviations from the original design ideas and discuss why the decisions
were taken. This chapter also explores the technical details of the prototype build. Code listings and screenshots of the prototype are included in Appendices B and C.
Similarly to Chapter 3, this chapter will be split into three main categories: the implemen-
tation of hardware, software, and the end survey questionnaire. We first discuss the
hardware choices and implementation, for biometrics and RFID. Next we discuss soft-
ware implementation for both the front-end (user facing) and back-end components.
Last, we present the implementation of the questionnaire that records users' reactions to the system.
4.2 Hardware Implementation
Hardware was required for two main areas in the prototype: building the biometric vein capture rig, and sourcing RFID equipment such as RFID antennas and readers to track patients.
4.2.1 Vein Pattern Capture
A cheap and easily iterable prototype was needed in order to capture Near Infrared [NIR] images of the wrist. A Raspberry Pi was used to power the prototype, while a Pi Noir camera was used as a NIR sensitive camera (Raspberry Pi, n.d.). Initially we thought of using a 3D printed chassis for the prototype; however, this turned out to be a very expensive option, and instead we opted for cardboard as the cheapest option for the chassis material. However, if the prototype is commercialized at sufficient scale, then production should shift to plastic injection moulding, as this is more economical at volume.
In figure 18 we can see a top-view photo of the cardboard chassis. The chassis has a
box-like structure, with simple screws serving as guides to help the patient place their
hand in the right position. In the photo one can see the faint outline of a hand drawn for
scale.
Figure 18: Top-view of the vein capture prototype
The arc-like structure going over the tray supports the Raspberry Pi and camera, pictured in subsequent photos. The raspberry pi can be mounted underneath the arc as shown in figure 19.
Figure 19: Raspberry Pi and supporting circuitry mounted on the underneath of the arc.
The Raspberry Pi can instead be mounted on top of the arc, which makes it easier to access. This is especially convenient if the Raspberry PI is fitted with a small HDMI screen as shown in figure 20, rather than a larger external HDMI monitor. Which option is used depends on the preference of the operator.
Figure 20: Small HDMI screen mounted on the raspberry PI, which can be used to
provide visual feedback to the users.
An HDMI monitor is a necessity to give feedback to the users of the platform. For exam-
ple, the screen can output a preview of the image that the camera sees, so more accu-
rate photos can be taken of the wrist veins. The monitor also outputs the results of patient identification, and allows for easily enrolling new patients to the platform in the field (i.e. without requiring patients to enroll using different hardware or in a different environment).
In figure 21 we can see a clearer picture of the raspberry pi and associated hardware attached to the underside of the cardboard arc. The components are labeled as follows:
Raspberry PI and case: The minicomputer powering the setup, along with a protective case. Figure 21 shows the raspberry pi model 2, however this can be upgraded to the more powerful raspberry pi model 3 if more performance is required.
NIR LEDs: The surface mounted Near Infrared LEDs which provide NIR illumination.
Switch to power on NIR LEDs: This is a simple push button switch that switches on the NIR LEDs, preventing excessive power consumption which can lead to excessive heat being generated. The switch was included to increase the lifespan of the LEDs.
Pi Noir camera: The NIR-sensitive camera. The prototype uses the first generation model, however this can be upgraded to the raspberry Pi Noir 2 camera should more performance be required.
Switch to trigger photo: The Pi Noir camera is controlled by software which detects when a particular General Purpose Input Output [GPIO] hardware pin on the raspberry pi is set to ground. This switch toggles the GPIO and hence allows the user to trigger a photograph (please see subsequent sections for a description).
4.2.2 RFID Infrastructure
The RFID infrastructure consists of three main components:
RFID Tags: In this project passive RFID tags were used to track patients. RFID tags come in a very large variety of physical forms, ranging from simple paper adhesives as pictured in figure 22, to rubber bracelets and more durable epoxy-based tags.
RFID antennas: these antennas are powered and are sensitive to the RFID
tags. They pick up the unique ID of any RFID tag in their field. Different antennas
have different field patterns. The particular ones used for the purposes of this
project are general purpose, wide-field antennas from Laird, as shown in Figure 23 (Laird, n.d.).
RFID readers: these readers from Impinj (Impinj, n.d.), shown in figure 24, are designed to power and aggregate data from RFID antennas. The reader has simple firmware on-board which translates the RFID signals from the antennas into TCP packets that can be sent over a standard TCP/IP network to a server. The manufacturer also provides an SDK that can be used to facilitate development (see subsequent sections for a description and the appendix for a full code listing).
4.3 Software Implementation
The software implementation of this project was developed using an iterative, incremen-
tal software development life cycle (Jacobson et al, 1999). The requirements were split
into several categories and each category was developed incrementally until all func-
tional requirements were validated by testing. Broadly speaking, the requirements were
split into back-end and front-end components as shown in the block diagram in figure 25.
The following sections explore each of the above categories and their sub categories.
4.3.1 Back-end Implementation
The backend is in essence a web server which is responsible for serving REST API calls.
These API calls invoke functions that control functionality such as patient and RFID ad -
ministration, building a biometric sparse dictionary and identifying uploaded vein pat-
terns. The Flask python microframework was used to handle the web server functional-
ity, including parsing HTTP requests, and providing a URL router (Grinberg, 2014). The
Flask URL router is responsible for invoking the appropriate function depending on which
HTTP URL was visited by the client. For example, the following code:
from flask import Flask, request

app = Flask(__name__)

@app.route('/echoer', methods=['POST'])
def echoer():
    print(request.form)
    return str(request.form)
will invoke the function echoer whenever an HTTP POST request is sent to /echoer. In this case the function simply echoes back any data sent with the POST request. All the functions presented below follow the same general structure.
4.3.1.1 Biometric functions
The following sections describe the functions that provide the biometric capability of the system. Biometric functions essentially handle the training and identification phases previously described in sections 3.4.1.1 and 3.4.1.2. As a result, the biometric functions also handle image enhancement; figure 26 shows a sample vein pattern being passed through the various stages of the image enhancement algorithm.
Figure 26: Vein pattern image enhancement algorithm implementation, showing original image
(top left) and the final enhanced image (bottom right)
As can be seen in figure 26, the image enhancement algorithm presented in section 3.4.1.3 (Image Enhancement) significantly enhances the vein patterns of the patient's wrist. This is crucial for the training and identification phases, since it makes it easier for the algorithms to extract distinguishing features.
With respect to the training phase, the backend provides two API calls:
/buildSparseDict
/uploadPatientTemplate
The first, buildSparseDict, is meant to be used only rarely. The function is envisaged to be run during initial deployment, when the sparse dictionary is being trained with sample NIR vein pattern images for the first time (denoted by P3 in Figure 7). Once the sparse dictionary is built, the function re-iterates over the image templates to produce a sparse code for each one (the process denoted by P4 in Figure 7). Both the sparse dictionary and template sparse codes (D2 and D3 respectively in Figure 7) are stored as python dictionary objects on the filesystem. In its current form, the prototype will accept a local file system directory as a container for the annotated image templates. Each image file should be a JPG file and follow the naming convention of <patient identifier>_<photo number>.jpg. The identifier can be generic; valid examples include using a patient's name directly, an ID number, or an anonymized random number. For example, the following are all valid filenames:
DavidVassallo_1.jpg
In this way, the training algorithm will know which image template belongs to which patient.
A secondary use of buildSparseDict is recovering from a data breach. Depending on the scope of the breach, it is possible for an attacker to have stolen the templates used to identify patients, or even the sparse dictionary itself. In such a case, buildSparseDict can be re-run to generate a new sparse dictionary, preventing a malicious actor from spoofing patients with existing templates.
buildSparseDict is quite a heavy function, and its runtime depends on the number of template images included in the directory. The more template images, the more accurate the identifier, but the longer the function takes to run. Since the function is heavy on resources, it is an important design and implementation feature that it does not need to be run often.
The uploadPatientTemplate function can only be called once the sparse dictionary has
been built as outlined above. This function simply accepts a NIR vein pattern image,
along with a patient identifier. The function then calculates the image's sparse code representation in terms of the previously derived dictionary. The resulting template sparse codes are associated with the provided patient identifier and stored on disk. This function is quite lightweight and does not consume as many resources as buildSparseDict; hence it can be called routinely, for example when enrolling new patients.
4.3.1.4 Identification phase
With respect to the identification phase, the backend exposes a single API call for simplicity and ease-of-use: the /identifyPatient API call. The API call accepts a single file containing the photo of the NIR vein pattern, within an HTTP POST request. The function then derives the image's sparse code representation (process P3 in Figure 8), and passes the result through the classification algorithm (process P4 in Figure 8). Once this is done, the identification function returns the patient most closely matching the uploaded vein pattern.
tance Classifier (Qian et al, 2004) and Stochastic Gradient Descent Classifier (Scikit
Learn, 2014). During implementation testing, both classifiers performed roughly equally in terms of accurately classifying patients; however, several differences emerged during implementation, as can be seen in table 8 (results averaged over numerous test runs). Due to the differences listed in table 8, the decision was taken to deviate from the original design plans in chapter 3 in the final choice of classifier.
4.3.1.5 RFID functions
The back end implementation is responsible for gathering data from the RFID readers as described in section 3.4.1.5.1 and section 3.4.1.5.2. The Low Level Reader Protocol [LLRP] component was written in Java, since the provided manufacturer SDK was written in Java. In essence, the Java program:
For each reader IP, creates a separate thread which monitors the reported tags
For each tag read, inserts an entry into a Redis First In, First Out [FIFO] memory queue, in the form <reader IP, tag ID, timestamp>
A separate python script takes over processing at this point, and implements the design described in section 3.4.1.5.2. The script pops individual entries from the queue and processes them as follows:
Converts the RFID reader IP address into the internal RFID reader ID stored in the database
Checks if the tag's current location has changed. If it has changed, it creates a new row in the relevant SQLite table, otherwise the entry is discarded. These row entries are subsequently displayed to the user via a web GUI as described in subsequent sections.
4.3.1.6 Patient functions
Patient functions in the backend are mainly limited to simple Create, Read, Update and Delete [CRUD] functions. This allows the front-end to instruct the server to:
Create, read, update and delete patient records (via the corresponding CRUD API calls)
Bind a patient identity to an RFID code (via the /updatePatientRfid API call)
Check whether a patient has an associated sparse code or not (via the /getPatientBiometrics API call).
Render functions are those functions which return HTML, CSS and Javascript to the client browser that requests them. These functions return the front-end, i.e. the web UI which the end users interact with. The render functions consist of the following three API calls:
/ (root URL): returns the login.html template, where a user selects which role they would like to assume. Depending on their choice, they will be redirected either to the administrator or the operator console. The administrator console exposes functions such as creating patient profiles, building a sparse dictionary and RFID reader management.
4.3.2 Front-end Implementation
The front end is responsible for the interaction between the end user and the system. The backend server returns HTML, CSS and Javascript code to the user's browser, which then renders that code into the front-end. The front-end is mainly written using the ReactJS framework from Facebook (Vipul & Sonpatki, 2016). ReactJS was chosen since it simplified the code and made it very easy to modularize the code into re-usable components, while enforcing a well defined structure which adheres to current industry best practices, rather than an ad-hoc standard. The ReactJS code is stored in
Depending on the role chosen by the end user, the front-end will render either the admin-
istrator or operator console, as can be seen in the annotated screenshots show in figures
27 and 28.
Figure 27 is a screenshot of the administrator front-end with the following annotations:
1: The navigation bar allows the administrator to switch between Settings and day-to-day operations. The former covers one-time actions, mainly managing RFID readers, locations and building the sparse dictionary. The latter is used for more day-to-day tasks, mainly enrolling patients.
2: The sub-options for the navbar described above. Depending on the navbar option chosen in (1), the available sub-options change.
3: The main actions area. Depending on the choices selected in (1) and (2) above, the main actions area changes to show the appropriate functionality.
Figure 28: Operator front-end
The operator front-end has a similar layout to the administrator front-end previously described, albeit with fewer options. In figure 28 we can see a screenshot of the operator front-end with the following annotations:
1: The search area, the main function used by operators on behalf of end users. It allows the operator to query the database either by RFID tag or by patient, showing exactly where a particular patient was last detected by the RFID readers.
2: The main action area. Depending on the user choices in (1) above, the main action area changes to show the appropriate functionality.
4.3.2.2 Vein Scanner front-end
The vein scanner front-end is responsible for interacting with the end user, specifically with patients and with healthcare professionals identifying those patients. Initially, during the design phase, it was envisaged that the frontend would be a web-based user interface that resided directly on the Raspberry PI that powered the vein scanners. This design has changed slightly: the frontend still resides directly on the Raspberry PI, but is implemented as a native application rather than a web page, due to specifics of the library that controls the Raspberry PI Noir camera. This library allows the developer to specify that a live preview should be displayed on screen, showing a video feed of whatever the camera is viewing. This preview function is used to allow patients or healthcare workers to more accurately position their wrists so a better picture can be taken. This is the same principle as looking through a camera's viewfinder to better align a photograph. Unfortunately, this preview function can only display the video feed within a native operating system window; therefore the vein scanner frontend was written to use native operating system windows as well, for conformity. Figure 29 shows the resulting interface.
In figure 29 we can see the two main functions that the vein scanner performs: identify -
ing a patient, or enrolling a new patient into the system. Selecting the identify option will
cause a preview window to be displayed, whereupon a patient can place their wrist to be
scanned. Once the picture trigger is pressed, the photo is sent to the server in the background, which in turn attempts to identify the patient and returns the result to the user.
The enroll function performs very similarly to the aforementioned identify procedure; however, before taking a picture, the user is prompted to enter the patient identifier via a text dialog box. This identifier is subsequently sent to the server along with the picture taken of the patient's wrist vein patterns. This picture is treated as the template image for the patient specified, at which point the server calculates the sparse codes for the image and stores them under the patient's identifier. The patient can then be identified using the identify function described above.
This enroll function is designed to be used on a day-to-day basis to enroll patients who were not part of the initial sparse dictionary training phase. The only prerequisite to using the enroll function is to have already built a sparse code dictionary. While building the sparse code dictionary via the use of template images is a time-consuming process, enrolling patients using the front-end just described is very quick (the process takes under a minute to complete) and hence should have minimal impact on day-to-day operations.
4.4 Survey Implementation
As described in section 3.5, two separate surveys were implemented: one questionnaire aimed at end users acting as patients, and another aimed at healthcare professionals who would be operating the system. The participants for each survey were recruited according to the recruitment plans outlined in section 4.4.1. Both questionnaires were made to be easy to access, easy to understand, and quick to answer, so as to have as little impact as possible on participants. As such, the questionnaires are hosted online and
can be accessed from any device so long as it has internet connectivity, so participants
could answer at their own leisure and without any external pressures. The questions are
almost entirely composed of multiple choice questions to make them easy to answer.
The end user survey consists of only 3 compulsory questions while the healthcare
worker survey consists of 8 compulsory questions. The participants are also encouraged
to leave any additional feedback they may have in free-text sections of the questionnaire.
4.4.1 Recruitment Plan
4.4.1.1 Recruitment plan for end users
- Does the candidate have any damage to both wrists? (if yes, exclude from study)
4. Accepted candidates will be asked to read, understand, and sign the consent form.
5. If participants agree to the consent form and sign the document, their name, surname and contact details will be recorded in the participant information sheet. Participants will then be contacted in order to set up a convenient time for them to physically meet with me to get their wrist scanned.
4.4.1.2 Recruitment plan for healthcare professionals
- Does the candidate have a healthcare role such as admissions clerk, nurse, or doctor? (if no, exclude from study)
4. Accepted candidates will be asked to read, understand, and sign the consent form.
5. If participants agree to the consent form and sign the document, their name, surname and contact details will be recorded in the participant information sheet. Participants will then be contacted in order to set up a convenient time for them to physically meet with me to show them how the system works and allow them to use it for as long as they wish, and subsequently contacted in order to ask them to fill in the online questionnaire about their experience.
4.4.2 Delivery of Questionnaires and collection of results
The questionnaires were implemented in and delivered via Google Forms. This allows participants to access the surveys online and answer the questions in their own time, without any pressure. Google Forms also conveniently collects and processes the answers to survey questions, summarizing answers by providing pie charts of the responses to each question.
In the next chapter we present the results of the accuracy of the system, as well as user feedback after having used the system, from both an end user and a healthcare professional perspective.
CHAPTER 5. TESTING AND RESULTS
5.1 Introduction
In this chapter we will present how each major component of the proof of concept was tested, along with the results from these tests. The major components under test are the vein pattern capture system, the RFID infrastructure, and user reactions to the system. For each of these components, the following sections define our testing methodology and present the results.
5.2 Vein Pattern Capture System
The vein pattern capture system was the main technical focus of the testing process, since it is entirely custom built without any reliance on 3rd party vendors and is the major contribution of this project to the project sponsor. Both the hardware and the software of the vein pattern capture system had to be tested, with the objective of maximising accuracy.
5.2.1 Testing Method
During testing of the vein pattern system, both hardware and software variables needed to be changed so as to test the limits of the system. In order to replicate different real world scenarios, the testing for the vein pattern system was split into four data sets, reflecting progressively less favourable capture conditions:
Data set 1: In this situation participants had NIR photos taken of their wrists in
optimal conditions. The proof of concept was placed in a darkened room, and the
NIR LEDs were kept on throughout the time that the photos were taken, at maxi-
mum brightness, allowing for the maximum amount of exposure possible. In ad-
dition, the participants were instructed to keep their hands very still when taking
the photos.
Data set 2: In this situation participants had NIR photos taken of their wrists in the same darkened room, and the NIR LEDs were kept on throughout the time that the photos were taken, but at reduced brightness, allowing for a medium amount of exposure. In addition, the participants were instructed to keep their hands still when taking the photos.
Data set 3: In this situation the participants had the NIR photos taken in less favourable conditions. The room was well-lit with fluorescent lighting, however the NIR LEDs were kept on throughout the time that the photos were taken, to allow for maximum exposure. The participants were instructed that they were free to move their wrists.
Data set 4: In this situation the participants had the NIR photos taken in even less favourable conditions. The system was placed in a well lit room and participants were instructed that they were able to move their wrists as before. However, this time the NIR LEDs were programmatically switched on just before the photo was taken and switched off immediately after, therefore allowing for only minimum NIR exposure. The idea behind this scenario was to decrease the quality of the captured vein patterns as far as possible, simulating a worst case.
In each data set, four photos were taken from each participant, two from each wrist. We then employed leave-one-out cross validation (Refaeilzadeh, Tang, & Liu, 2009), by designating one photo as a testing photo and the other three as training photos, and measuring the accuracy of the recognition system. The process was repeated four times, with each photo taking a turn as the testing photo.
In order to increase accuracy, all the pictures from the training dataset were used to optimize the parameters used by the sparse dictionary learning algorithm. After training and testing the system, its accuracy was measured. The accuracy of the system was measured as the percentage of correctly identified subjects with respect to the total number of subjects tested. Parameters were optimized iteratively: a single parameter in the algorithm is changed, and the resulting accuracy of the system is measured. If the accuracy improves, the change is kept. Otherwise the change is reset to its previous value and a new parameter is changed.
Table 9 shows a sample of this process used to optimize three parameters used internally by the sparse dictionary learning algorithm, including the number of sparse components used to represent an image.
Note that table 9 is a sample of the actual accuracy runs and not the final result. Final accuracy measures are presented in Section 5.2.2.2. Note that several items are highlighted in bold in table 9 to denote which of the parameters were changed in order to optimize the algorithm. For example, in the second data row, we see that uncropped image is highlighted in bold, indicating that this parameter was changed from the previous value (which was cropped image). In addition, the second row also has yes under keep mutation? highlighted in bold, meaning that the changed parameter will be retained since it improved accuracy. In the third row, batch_size was changed from 3 to 10, so it is highlighted in bold, and it also improved accuracy, so yes is once again highlighted in bold. The process continues down the table until accuracy peaks. In each row, only one parameter is changed in order to make sure that we can effectively keep track of which change affected the accuracy.
Once the parameters were finalised, the sparse dictionary learning algorithm was run on the training set and tested on the test set images. The following section analyses the results.
5.3 Results
Figure 31: Sample NIR images from each data set: Data Set 1 (Dark Room, Maximum NIR exposure), Data Set 2 (Dark Room, Medium NIR exposure), Data Set 3 (Light Room, Maximum NIR Exposure), and Data Set 4 (Light Room, Minimum NIR Exposure).
As can be seen from figure 31, visually the vein pattern can be best observed in the first
dataset, with the smaller vein structure being better visible as well as the major veins.
5.3.2 Accuracy Results
Analysis of the accuracy of the system revealed that there were no significant changes in accuracy between the last three datasets; however, there are marked differences when these are compared against the first dataset. Figure 32 illustrates that the first data set gave the best results, with an average accuracy of 91%. Assuming a patient population of around 1200 (University Hospitals Birmingham NHS Foundation Trust, 2015), we can conclude that the accuracy of the system including the confidence interval is 91% ± 9.79% at a confidence level of 95%. This indicates that under proper conditions it is possible to accurately identify individuals via their wrist vein patterns. It is possible to further improve accuracy by:
Adding more NIR LEDs, such as the Bright Pi system (Pi Supply, n.d)
Opting for a more expensive but better performing NIR camera
If the system is going to be used in a bright area, applying NIR photography filters to the camera
Adding more guides for users to consistently present the same wrist patterns to the system, by using hardware such as a non-reflective plastic mould where they can rest their wrist
5.4 RFID Infrastructure
The RFID infrastructure was limited due to budget, and consisted of the tags, antennas, and readers described in section 4.2.2.
5.4.1 Testing Method
Testing of the RFID infrastructure consisted of ensuring that the antennas could pick up the RFID tags at the expected range, given that a patient was wearing the tags on their person, such as on a wristband. In addition, the integration between readers and the system frontend was tested to ensure proper functionality.
5.4.2 Results
The functionality of the frontend worked as expected; however, the range of the generic RFID tags varied somewhat. The range of the RFID tags was highly dependent on the orientation of the user with respect to the antenna. If a clear line of sight was available, the RFID tags could be picked up at a range of approximately 4 meters. However, this range was reduced by 50% if the patient oriented their body in such a way as to have their body between the tag and the reader. This is to be expected, considering the attenuation a human body would introduce. Due to this, it is recommended to use specially designed RFID tags that have more power, often referred to as battery-assisted passive tags within the industry (CoreRFID, n.d.). In addition, care should be taken to place tags strategically (such as embedded in the front of a patient gown or embedded in a bracelet).
5.5 User Reactions
In this section we evaluate users' reactions to using the system. The section will first explain how reactions were tested, followed by two sections presenting the results. The first results section will focus on end users, who represent patients. Their interaction with the system is mainly limited to the vein pattern capture system; therefore their responses are a measure of how well the hardware of the system works. The second results section will focus on the healthcare professionals who use the system frontend (the Web UI). Their interaction with the system is mostly centered on using the software of the system to track patient locations, and therefore their responses are a measure of how well the frontend software works.
In order to measure user reactions to the system, questionnaires were used. In the case of end-users, questionnaires were completed after the users had been asked to use the vein pattern capture system. In the case of healthcare professionals, questionnaires were completed after the professionals were given a quick tutorial of the system, covering how to identify and locate patients.
In the following sections we present summary statistics of the responses for each of the questions. We assume an end user population size of 7 billion, which is almost equal to the total world population, and a healthcare worker population size of 59,220,000, which is the latest count of worldwide healthcare workers provided by the World Health Organization.
5.5.2 End User Results
Figure 34: End-user reaction to "Was it easy to understand how to use the system?"
Figure 35: End-user reaction to "How long did it take to use the system?"
Figure 33 shows that the majority of the users felt that the vein scanning system was not
intrusive at all, with 93.9% ± 8.29 (with a 95% confidence level) of the respondents mark-
ing the minimum of 1 on an intrusiveness scale ranging from 1 to 5. The majority of the
users also thought the system was very easy to understand (72.7% ± 15.44 with a 95%
confidence level), as shown in Figure 34. It is worth noting that most users were unsure
of how and where to place their hands when having their vein patterns captured, further
reinforcing the suggestion made in Section 5.2.2.2 of incorporating some sort of handle
or mould for users to grasp or place their hand on, reducing their queries and improving
accuracy.
Finally, Figure 35 shows that a very large percentage of users (97% ± 5.91 with a 95%
confidence level) reported minimal time spent using the system: under 2 minutes. It is
worth noting that in this case the users' responses include their experience of enrolling
into the system. After enrollment and during day-to-day operations, it is expected that the
overwhelming majority of users will report having to use the system for under a minute.
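Margins of the kind reported above can be computed with the standard confidence interval formula for a sample proportion. The sketch below is illustrative only: the sample size of 33 and the exact rounding are assumptions, and at the population sizes assumed in this chapter the finite population correction is negligible.

```python
import math

def margin_of_error(p, n, N, z=1.96):
    """95% margin of error for a sample proportion, including the
    finite population correction (negligible when N >> n)."""
    se = math.sqrt(p * (1 - p) / n)
    fpc = math.sqrt((N - n) / (N - 1))
    return z * se * fpc

# Hypothetical example: 31 of 33 respondents chose the minimum rating,
# drawn from an assumed end-user population of 7 billion.
margin = margin_of_error(31 / 33, 33, 7_000_000_000)
print(f"{100 * margin:.2f} percentage points")
```

Because the populations involved dwarf the sample, the margin is driven almost entirely by the sample size, which is why the reported intervals are wide.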
5.5.3 Healthcare Professional User Results
The healthcare professionals were similarly receptive to the system, as Figure 36 shows.
Figures 37 to 43 present summary statistics of each question they were asked:
Figure 37: Healthcare professionals' survey results rating the system's ease of use, from 1 (very difficult) to
5 (very easy)
Figure 38: Healthcare professionals' survey results rating the system's disruption, from 1 (not disruptive)
to 5 (very disruptive)
Figure 39: Healthcare professionals' survey results rating the difficulty of identifying a patient before the
system was used, from 1 (difficult) to 5 (easy)
Figure 40: Healthcare professionals' survey results rating the difficulty of identifying a patient after the
system was used, from 1 (difficult) to 5 (easy)
Figure 41: Healthcare professionals' survey results rating the difficulty of locating a patient before the
system was used, from 1 (difficult) to 5 (easy)
Figure 42: Healthcare professionals' survey results rating the difficulty of locating a patient after the
system was used, from 1 (difficult) to 5 (easy)
Figure 43: Healthcare professionals' survey results rating the beneficial impact of the system, from 1 (no
impact) to 5 (large impact)
Figure 37 shows that the majority of the healthcare professionals (99.9% ± 11.26 with a 95%
confidence level) find the proof of concept system very easy to use in general. Figures 39
and 40 show a marked improvement in the doctors' assessment of how easy it is to identify a
patient once the system is introduced. Figures 41 and 42 show the same improvement for
subsequently locating a patient. Most doctors also felt that the system would not disrupt their
workflow, as can be seen in Figure 38. The feedback from the doctors was lukewarm about how
beneficial the system would be to their daily work, as illustrated in Figure 43; however, they
offered two important observations:
Certain organizations would regard the system as more useful than others, for
example a hospital where patients should be allowed to move around, but under supervision.
Certain healthcare roles would find the system much more useful than others.
While doctors and surgeons might not find the system brings that much benefit,
this is mostly because they are generally insulated from tasks like identifying and
locating patients by nurses and admissions staff, who would in turn find this proof
of concept very useful.
The results are very promising and encourage further development of the system when
one considers that the very basic proof of concept system had an excellent reception by
both user groups.
CHAPTER 6. CONCLUSIONS
In this project we built a working proof of concept system which successfully identified a
small sample of patients with an average of 91% accuracy. Subsequently these patients
were tracked across a medical facility using RFID technology. Surveys of both patients
and healthcare professionals using the system showed that both groups were very re-
ceptive to using the system, and we believe that a product based on the proof of concept
presented here has the potential to alleviate the problem of medical errors due to patient
misidentification.
This project touched on several challenging areas such as image processing, biometrics,
RFID, healthcare and patient safety, and the results are very encouraging, justifying ad-
ditional development of the proof of concept into a fully-fledged product. We can summa-
rize the main findings as follows:
A vein pattern identification system capable of 91% accuracy and above can be
achieved using very cheap (sub-$200 total cost) off-the-shelf components. How-
ever, the quality of the hardware used to take patient vein pattern photographs
had a very large impact on the system's resulting accuracy. The larger the patient
population that uses the system, the more important this becomes. Future ver-
sions of the capture rig should include hand guides or handles to make sure the
patient's wrist is relatively still and consistently positioned.
With respect to RFID, results were excellent; however, once again we see better
results in terms of RFID read range and RFID read rates as we invest in better
tags. Simple paper RFID tags may be sufficient in some environments; however,
environments containing metal surfaces (such as hospital beds) call for more
specialized tags.
With respect to user acceptance, both in terms of end users and healthcare pro-
fessionals, the feedback regarding the proof of concept was very good. The proof
of concept needs some more work in terms of user experience, with more intu-
itive design and a more professional look, as well as further software develop-
ment to integrate with already existing hospital systems. The more integration is
achieved, the more useful the system becomes to healthcare professionals.
We believe the dissertation has proved that the original two hypotheses (reiterated below) hold:
Hypothesis 1: The vein pattern biometrics significantly increases the ease and ac-
curacy of patient identification.
With high accuracy rates achieved even using basic hardware, it is easy to see that with
additional investment it is very possible to have vein pattern biometrics identify individu-
als within a hospital's patient population, eliminating the problems with human error and
without the need for patients to carry any additional data such as ID cards.
Hypothesis 2: Biometric systems can be successfully integrated with existing RFID
solutions to track patients, providing an end-to-end identification and tracking platform
for patient and carer safety.
From a technical perspective, integrating biometrics with RFID solutions was not prob-
lematic.
6.2 Applications
The proof of concept illustrates that the concepts used to build the system provide a solid
foundation for any scenarios where identification and tracking are required. In this disser-
tation, identification and tracking have been applied together to build a system that would
help positively identify and subsequently track patients as they move across a medical
facility. Other applications for identification and tracking are rather varied, especially if
one considers the two separately. Apart from the patient identification and tracking sce-
nario described in this dissertation, some healthcare applications where the system can
be used include:
Prescription dispensing, where vein pattern identification can be
used to not only verify that a patient is who they say they are, but also to make
sure they are present, helping to deter prescription fraud committed through medical identity
theft. Prescription fraud has been known to affect over 60% of the US population
(Imandoust, n.d.). Tracking can be used to track medicine location and alert on
medicines being moved out of designated areas.
Physical access control, where vein pattern identification and RFID tracking work
together to increase physical security. Not only does a user require a specific
RFID tag to access a restricted area, but they also need to present the appropriate
vein pattern.
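As a toy illustration of the dual-factor check just described (tag IDs, user IDs and the registry layout are all invented), access amounts to requiring that the scanned RFID tag and the matched vein pattern resolve to the same enrolled identity:

```python
# Toy sketch of the dual-factor access check: the scanned RFID tag and
# the matched vein pattern must resolve to the same enrolled identity.
def authorize(tag_id, vein_match_id, tag_registry):
    """tag_registry maps RFID tag IDs to enrolled user IDs;
    vein_match_id is the user ID returned by the vein pattern matcher."""
    owner = tag_registry.get(tag_id)
    return owner is not None and owner == vein_match_id

registry = {"04A1B2C3": "user-17"}
assert authorize("04A1B2C3", "user-17", registry)       # both factors agree
assert not authorize("04A1B2C3", "user-99", registry)   # vein mismatch: denied
```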
Remote authentication, via small, self-contained units that perform vein pattern
recognition for a particular user. This user would need to present the appropriate
vein pattern to this device, which in turn releases an authentication token, giving
remote network access (such as VPN or web portal sign-in) to the user. This can
be used in conjunction with passwords to provide a form of two-factor authentication.
6.3 Limitations
The proof of concept also highlighted several issues with the proposed system:
With respect to the vein identification system, the positioning of the wrist was
highly influential on the resulting accuracy. If users are not given proper guidance
on exactly where to place their hands for their wrists to be scanned, the accuracy
reduces dramatically. This need for control over positioning may be an issue in
busy or unsupervised clinical settings.
There is no one-size-fits-all solution when dealing with RFID tags. For example, an
RFID tag which works well in one environment, such as tracking a patient in a nor-
mal room, will not work in another environment, such as tracking a patient lying on
a metal bed (since the metal interferes with RF signals). Careful consideration
must be made as to where and how the system is going to be used in order to
select appropriate tags.
There is currently still some manual intervention required in the system when
pairing patients to RFID tags (i.e. even though a patient is identified automati-
cally, an operator would still need to subsequently manually enter an RFID code
to complete the pairing).
From a user interface perspective, the system still lacks many features that a pro-
fessional product would include, for example:
Addition of visual aids such as maps to help in tracking patient move-
ments
The project opens up several avenues of improvement and future research, as outlined
below:
test the system using the new camera and compare this to the old one. Similarly,
other improvements to the camera could be made, such as investing in more spe-
cialized near-infrared optics.
As noted in the limitations, an operator currently must manually
input an RFID code to represent an identified patient. The system could reduce
human error by instead prompting the operator to place the RFID tag over a
desktop RFID tag reader and automatically entering the RFID code for the identi-
fied patient.
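A minimal sketch of this proposed automation follows, assuming a desktop reader that emits the tag ID as a single ASCII line (as many low-cost serial or USB readers do). The tag format and helper names are assumptions; the reader is simulated here with an in-memory buffer, where the real system would pass a pyserial handle.

```python
import io

def read_tag_id(reader):
    """Read one tag ID from a file-like reader. In the real system this
    would be a pyserial handle on the desktop RFID reader; most low-cost
    readers emit the tag ID as a single ASCII line."""
    raw = reader.readline()              # blocks until a tag is scanned
    return raw.strip().decode("ascii") or None

def pair_patient(patient_id, tag_id, registry):
    """Store the pairing produced by the biometric identification step."""
    registry[tag_id] = patient_id
    return registry

# Simulated scan: a tag emitting ID "04A1B2C3" followed by CRLF.
fake_reader = io.BytesIO(b"04A1B2C3\r\n")
registry = pair_patient("patient-042", read_tag_id(fake_reader), {})
```

Capturing the code directly from the reader removes the transcription step, which is where the manual-entry errors described above would occur.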
With respect to the vein pattern recognition system, we identify the following ar-
eas of improvement:
Testing of further image enhancement techniques (beyond those pre-
sented in Section 4.3.1.1.1) that would make vein patterns more visible.
Testing of multiple image feature sets. The project in its current form
uses image features extracted from sparse coding. Other image feature
extraction techniques (for example, ORB) could be com-
bined together with the sparse coding features to create a richer and
more stable feature set with which to recognize the vein patterns.
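As an illustrative sketch of how such a feature pipeline fits together, the snippet below learns a dictionary with scikit-learn's MiniBatchDictionaryLearning (the component cited in the references) and matches a probe by Euclidean distance between sparse codes. The data is random and the parameter values are toy choices, not the ones used in the actual system.

```python
import numpy as np
from sklearn.decomposition import MiniBatchDictionaryLearning

rng = np.random.RandomState(0)
patches = rng.rand(200, 64)            # stand-ins for flattened 8x8 vein image patches

# Learn a dictionary and express each patch as a sparse code over it.
dico = MiniBatchDictionaryLearning(n_components=16, alpha=1.0, random_state=0)
codes = dico.fit_transform(patches)    # shape (200, 16), mostly zeros

# Identification step: nearest enrolled template by Euclidean distance.
templates = codes[:5]                  # pretend the first 5 codes are enrolled users
probe = codes[7]
distances = np.linalg.norm(templates - probe, axis=1)
best_match = int(np.argmin(distances))
```

Richer feature sets would simply concatenate additional descriptors onto each row of `codes` before the distance computation.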
REFERENCES CITED
Aboalsamh, H.A., Alhashimi, H.T. and Mathkour, H.I., 2012, January. Applying Recent Vein Im-
age Enhancement Techniques In Vain Biometrics. In Proceedings of the International Con-
ference on Image Processing, Computer Vision, and Pattern Recognition (IPCV) (p. 1). The
Steering Committee of The World Congress in Computer Science, Computer Engineering
and Applied Computing (WorldComp).
Abdi, H. and Williams, L.J., 2010. Principal component analysis. Wiley Interdisciplinary Re-
Amazon, n.d. Hoya 52mm RM72 Infrared Filter [online] <Available from:
https://www.amazon.com/Hoya-52mm-RM72-Infrared-Filter/dp/B0000AI1FZ> (Accessed
October 2016)
Badawi, A.M., 2006. Hand Vein Biometric Verification Prototype: A Testing Performance and
Betances, R.I.G. and Huerta, M.K., 2012. A review of automatic patient identification options for
public health care centers with restricted budgets. Online Journal of Public Health Informat-
ics, 4(1).
2016)
Cardinal, D. 2013. How to turn your DSLR into a full spectrum super camera, ExtremeTech,
Chassin, M.R. and Becher, E.C., 2002. The wrong patient. Annals of Internal Medicine, 136(11),
pp.826-833. doi: 10.7326/0003-4819-136-11-200206040-00012.
Chen, Y.L., 2009. Data Flow Diagram. In Modeling and Analysis of Enterprise and Information
http://www.corerfid.com/rfid-technology/rfid-tracking/battery-assisted-passive-tags/> (Ac-
Collinson, P. 2014. Forget fingerprints: banks are starting to use vein patterns for ATMs, The
http://www.theguardian.com/money/2014/may/14/fingerprints-vein-pattern-scan-atm> (Ac-
E-Con Systems, n.d. See3CAM_12CUNIR - 1.3 MP Monochrome USB NIR Camera [online]
ECRI Institute, 2016. Top 10 Patient Safety Concerns for Healthcare Organizations, Executive
https://www.ecri.org/EmailResources/PSRQ/Top10/2016_Top10_ExecutiveBrief_final.pdf>
Gompertz, S. 2014. Bank customers to sign in with 'finger vein' technology, BBC [online]
Faragher, R. and Harle, R., 2014. An analysis of the accuracy of bluetooth low energy for in-
Farmer, B. 2011. 'Daniel Pearl was beheaded by 9/11 mastermind', The Telegraph, UK, [on-
http://www.telegraph.co.uk/news/worldnews/asia/afghanistan/8271845/Daniel-Pearl-was-
Fatima, A 2011, 'E-Banking Security Issues -- Is There A Solution in Biometrics?', Journal Of In-
ternet Banking & Commerce, 16, 2, pp. 1-9, Business Source Complete, EBSCOhost,
Garcia, J, & Tapiador, M 2006, 'On the vulnerability of fingerprint verification systems to
Gayathri, S., Nigel, K.G.J. and Prabakar, S., 2013. Low cost hand vein authentication system
on embedded linux platform. Int J Innovative Technol Exploring Eng, 2(4), pp.138-141.
Grinberg, M., 2014. Flask Web Development: Developing Web Applications with Python. "
Haralick, R.M., Sternberg, S.R. and Zhuang, X., 1987. Image analysis using mathematical mor-
phology. Pattern Analysis and Machine Intelligence, IEEE Transactions on, (4), pp.532-550.
Hashimoto, J., 2006, June. Finger vein authentication technology and its future. In VLSI Cir-
cuits, 2006. Digest of Technical Papers. 2006 Symposium on (pp. 5-8). IEEE.
Howanitz, P.J., Renner, S.W. and Walsh, M.K., 2002. Continuous wristband monitoring over 2
years decreases identification errors. Archives of Pathology & Laboratory Medicine, 126(7), pp.809-815.
Hunt, A. and Thomas, D., 2004. OO in one sentence: keep it DRY, shy, and tell the other guy.
Imandoust, S. n.d. Prescription Fraud Resulting From Identity Theft, Identity Theft Resource
Lahtela, A., Hassinen, M. and Jylha, V., 2008, January. RFID and NFC in healthcare: Safety of
hospitals medication care. In Pervasive Computing Technologies for Healthcare, 2008. Per-
Lee, H., Battle, A., Raina, R. and Ng, A.Y., 2006. Efficient sparse coding algorithms. In Ad-
Lee, H.C., Kang, B.J., Lee, E.C. and Park, K.R., 2010. Finger vein recognition using weighted
local binary pattern code based on a support vector machine. Journal of Zhejiang Univer-
Lugovaya T.S. 2005. Biometric human identification based on electrocardiogram. [Master's the-
Jain, A. and Jain, A.K. (2002) Biometrics: Personal identification in Networked society. Edited
Jacobson, I., Booch, G., Rumbaugh, J., Rumbaugh, J. and Booch, G., 1999. The unified soft-
Joardar, S., Chatterjee, A. and Rakshit, A. (2015) A Real-Time Palm Dorsa Subcutaneous Vein
Pattern Recognition System Using Collaborative Representation-Based Classification. IEEE
Transactions on Instrumentation and Measurement, 64(4), pp.959-966. doi:
10.1109/tim.2014.2374713.
Kahn, C.M. and Roberds, W. (2008) Credit and identity theft, Journal of Monetary Economics,
http://www.computerworld.com/article/2566919/security0/japanese-banks-choose-vein-
Kim, J., Kim, B.S. and Savarese, S., 2012. Comparing image classification methods: K-
nearest-neighbor and support-vector-machines.
Krishna, P. and Husak, D., 2007. RFID infrastructure. IEEE Communications Magazine, 45(9),
p.4.
Kocer, H.E., Tutumlu, H. and Allahverdi, N., 2012. An Efficient Hand Dorsal Vein Recognition
Based on Neural Networks. Journal of Selcuk University Natural and Applied Science, 1(3),
pp.pp-28.
Mairal, J., Bach, F., Ponce, J. and Sapiro, G., 2009, June. Online dictionary learning for sparse
Makary, M.A. and Daniel, M. (2016) Medical error: the third leading cause of death in the US,
BMJ, 353, i2139.
Mordini, E. and Ottolini, C., 2007. Body identification, biometrics and medicine: ethical and so-
Murphy, M.F. and Kay, J.D.S., 2004. Patient identification: problems and potential solutions.
Nadort, A., 2007. The hand vein pattern used as a biometric feature. Master Literature Thesis
Nasrollahi, K., Haque, M.A., Irani, R. and Moeslund, T.B., 2016. Contact-Free Heartbeat Signal
for Human Identification and Forensics. In Handbook of Biometrics for Forensic Science.
Springer.
Newman, M.W. and Landay, J.A., 2000, August. Sitemaps, storyboards, and specifications: a
sketch of Web site design practice. In Proceedings of the 3rd conference on Designing in-
teractive systems: processes, practices, methods, and techniques (pp. 263-274). ACM.
NHS England, 2014. Improving medication error incident reporting and learning, [online]
Ng, A. 2010. ECCV10 Tutorial - Image Classification By Sparse Coding. Presentation, Univer-
Odinaka, I., Lai, P.H., Kaplan, A.D., O'Sullivan, J.A., Sirevaag, E.J. and Rohrbaugh, J.W., 2012.
Paranjape, R.B., Mahovsky, J., Benedicenti, L. and Koles, Z., 2001. The electroencephalogram
Pasfield, G., 1991. Color care coded patient identification system. U.S. Patent 5,026,084.
Patently Mobile, 2016. Samsung invents a new User ID System for Smartwatches using Hand
invents-a-new-user-id-system-for-smartwatches-using-hand-vein-patterns.html> (Accessed
May 2016)
Pedregosa, F., Varoquaux, G., Gramfort, A., Michel, V., Thirion, B., Grisel, O., Blondel, M., Pret-
tenhofer, P., Weiss, R., Dubourg, V. and Vanderplas, J., 2011. Scikit-learn: Machine learn-
Pi Supply, n.d. Bright Pi Bright White and IR Camera Light for Raspberry Pi [online] <Avail-
Prabhakar, S., Pankanti, S. and Jain, A.K., 2003. Biometric recognition: Security and privacy
Probs, M. and Branzell, R. (2016) CHIME initiatives advance patient safety - healthcare IT -
Probst, C.A., Wolf, L., Bollini, M. and Xiao, Y., 2016. Human factors engineering approaches to
Prokoski, F., 2000. History, current status, and future of infrared identification. In Computer Vi-
sion Beyond the Visible Spectrum: Methods and Applications, 2000. Proceedings. IEEE
Qian, G., Sural, S., Gu, Y. and Pramanik, S., 2004, March. Similarity between Euclidean and
cosine angle distance for nearest neighbor queries. In Proceedings of the 2004 ACM sym-
2016)
Refaeilzadeh, P., Tang, L. and Liu, H., 2009. Cross-validation. In Encyclopedia of database sys-
Right Patient, n.d. [online] <Available from: http://www.rightpatient.com/> (Accessed May 2016)
Roesner, F., Kohno, T. and Wetherall, D., 2012. Detecting and defending against third-party
tracking on the web. In Proceedings of the 9th USENIX conference on Networked Systems
Rosenthal, M. M., 2003. Check the Wristband, Patient Safety Network, [online] <Available
Rublee, E., Rabaud, V., Konolige, K. and Bradski, G., 2011, November. ORB: An efficient alter-
native to SIFT or SURF. In 2011 International conference on computer vision (pp. 2564-
2571). IEEE.
Sahu, A.P. and Bharathi, H.N., 2015. Veins based Authentication System. International Journal
Sandelowski, M., 2004. Using qualitative research. Qualitative Health Research, 14(10),
pp.1366-1386.
Sandelowski, M., Barroso, J. and Voils, C.I., 2007. Using qualitative metasummary to synthe-
size qualitative and quantitative descriptive findings. Research in nursing & health, 30(1),
pp.99-111.
2016)
learn.org/stable/modules/generated/sklearn.decomposition.MiniBatchDictionaryLearning.ht
Scikit Learn, (2014) Strategies to scale computationally: bigger data, Scikit Learn Documenta-
Soni, M., Gupta, S., Rao, M.S. and Gupta, P., 2010. A new vein pattern-based verification sys-
tem. International Journal of computer science and information security, 8(1), pp.58-63.
Suarez Pascual, J.E., Uriarte-Antonio, J., Sanchez-Reillo, R. and Lorenz, M.G., 2010, October.
Capturing hand or wrist vein images for biometric authentication using low-cost devices. In
Intelligent Information Hiding and Multimedia Signal Processing (IIH-MSP), 2010 Sixth In-
ternational Conference on. IEEE.
Sufi, F., Khalil, I. and Mahmood, A., 2011. Compressed ECG biometric: a fast, secured and effi-
cient method for identification of CVD patient. Journal of medical systems, 35(6), pp.1349-
1358.
Sutcu, Y., Sencar, H.T. and Memon, N., 2005, August. A secure biometric authentication
scheme based on robust hashing. In Proceedings of the 7th workshop on Multimedia and
Thomas, P. & Evans, C., 2004. An identity crisis? Aspects of patient misidentification. AVMA
Medical & Legal Journal, 10(1), pp.18-22.
University Hospitals Birmingham NHS Foundation Trust, 2015. About Us [online] <Available
from: http://www.uhb.nhs.uk/about-us.htm> (Accessed November 2016)
Van Der Walt, S., Schönberger, J.L., Nunez-Iglesias, J., Boulogne, F., Warner, J.D., Yager, N.,
Gouillart, E. and Yu, T., 2014. scikit-image: image processing in Python. PeerJ, 2, p.e453.
Van Rossum, G., 2007, June. Python Programming Language. In USENIX Annual Technical
Conference (Vol. 41).
Vassallo, D (2016a) Indoor GPS demo - powered by angular, pouchdb and ble beacons.. Avail-
able at: https://www.youtube.com/watch?v=7rt9hTj26ak (Accessed: 12 May 2016).
Vassallo, D(2016b) RFID powered indoor GPS. Available at: https://www.youtube.com/watch?
v=EzPnv8N_cYA (Accessed: 12 May 2016).
Vipul, A.M. and Sonpatki, P., 2016. ReactJS by Example-Building Modern Web Applications
with React. Packt Publishing Ltd.
Yalavarthy, P.K., Nundy, K.K. and Sanyal, S., 2009. Integrable Vein Viewing System in Hand
Held Devices, Indian Institute Of Science, Bangalore
Yang, Y., Zhang, J. and Kisiel, B., 2003, July. A scalability analysis of classifiers in text catego-
rization. In Proceedings of the 26th annual international ACM SIGIR conference on Re-
search and development in information retrieval (pp. 96-103). ACM.
Yao, W., Chu, C.H. and Li, Z., 2010, June. The use of RFID in healthcare: Benefits and barri-
ers. In RFID-Technology and Applications (RFID-TA), 2010 IEEE International Conference
on (pp. 128-134). IEEE.
Wang, L., Leedham, G. and Cho, S. (2007) Infrared imaging of hand vein patterns for biomet-
ric purposes, IET Computer Vision, 1(3), pp. 113-122. doi: 10.1049/iet-cvi:20070009.
Want, R., 2006. An introduction to RFID technology. Pervasive Computing, IEEE, 5(1), pp.25-
33.
Watanabe, M., Endoh, T., Shiohara, M. and Sasaki, S., 2005, September. Palm vein authentica-
tion technology and its applications. In Proceedings of the biometric consortium conference
(pp. 19-21).
Weiss, B., 2006, July. Fast median and bilateral filtering. In Acm Transactions on Graphics
(TOG) (Vol. 25, No. 3, pp. 519-526). ACM.
Weingart, S.N., Wilson, R.M., Gibberd, R.W. and Harrison, B., 2000. Epidemiology of medical
error. Western Journal of Medicine, 172(6), p.390.
Wilson, C. (2011) Vein pattern recognition: A privacy-enhancing Biometric. United States: CRC
Press.
World Health Organization, 2006 The World Health Report 2006, Chapter 1, Table 1.1, [online]
<Available from: http://www.who.int/whr/2006/06_chap1_en.pdf> (November 2016)
Zelkowitz, M.V. and Wallace, D.R., 1998. Experimental models for validating technology. Com-
puter, 31(5), pp.23-31.
Zimmerman, J.B., Pizer, S.M., Staab, E.V., Perry, J.R., McCartney, W. and Brenton, B.C., 1988.
An evaluation of the effectiveness of adaptive histogram equalization for contrast enhance-
ment. Medical Imaging, IEEE Transactions on, 7(4), pp.304-312.
APPENDICES
APPENDIX A. DS PROPOSAL
Project Title:
BioRFID: A Patient Identification System using Biometrics and RFID
Name of SSM:
Approval confirmed in MiTSA by the Lead Faculty (Dissertation): (To be completed by the Lead
Faculty)
Sponsor's Details:
6PM LTD, 6PM Business Center, Triq it-Torri, Swatar, B'Kara BKR 4012, Malta, Europe
Sponsor's Background:
Healthcare IT provider with an interest in providing affordable IT healthcare solutions. Primary market is
the UK's NHS.
Sponsor's Agreement:
Yes, agreement to be posted pending some legal clarifications that have been submitted to Laureate Lens
already.
Hypothesis 1: The vein pattern biometrics significantly increases the ease and accuracy of patient
identification.
Hypothesis 2: Biometric systems can be successfully integrated with existing RFID solutions to
track patients, providing an end-to-end identification and tracking platform for patient and carer
safety
The project attempts to verify the above two hypotheses and build a system that will serve as a proof-of-
concept that showcases a fully functional patient identification and tracking system, including both hard -
ware and software system components. Current solutions currently deal with each problem separately.
RFID tracking systems are quite mature and well-established, especially in the retail sector. Biometrics is
also quickly maturing, especially with the introduction of fingerprint, voice and face recognition being
incorporated into smartphones. However, the two fields have not yet been explored in conjunction. Solu-
tions based solely on RFID still misidentify the patient and cannot guarantee the presence of a patient.
On the other hand solutions based solely on biometrics provide identification but not tracking. In addi -
tion, the previously mentioned biometric systems (fingerprint, voice recognition, face recognition) are
not particularly suited for a hospital environment since most patients might have physical or mental con-
ditions that render such biometrics ineffective. The proposed solution investigates the use of vein biomet-
rics to overcome these problems, in conjunction with RFID to provide both identification and tracking.
In the table below, please state your hypothesis or hypotheses; the research methods you will use to
guide the development of your IT artefact; the kind of IT artefact you will produce; and the means by
which you will evaluate the IT artefact in the light of the hypothesis.
Hypothesis
Hypothesis 1: The vein pattern biometrics significantly increases the
ease and accuracy of patient identification, assigning them the correct
ID/RFID number.
Software: Server to process RFID reader output
Evaluation
Two groups will be asked for feedback regarding how effective the system is, effectively
providing a complete user satisfaction survey:
Carers and hospital staff (experts)
Patients (end-users)
Project Outline
The project consists of two broad categories of tasks, those relating to the hardware of the proposed solu-
tion, and those relating to the software. Each of these categories can be further subdivided into RFID and
biometric components, as shown in Figure 1 below.
Each section consists of several steps, which at a high level can be summarized as follows:
Hardware (RFID): Obtain, configure and test RFID tags, antennas and readers.
Hardware (Biometrics): Obtain components to build a Near Infrared Camera rig that will
take pictures of subjects' wrist areas
Software (RFID): Write code to receive and parse RFID data from RFID readers and
translate that to a physical location
Software (Biometrics): Write image enhancing code to extract vein patterns, and machine
learning algorithms to identify which individual the vein pattern belongs to.
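The Software (RFID) task above reduces to folding a stream of read events through a static antenna-to-room map to produce a "last known location" table. A minimal sketch follows; the reader IDs, room names and event format are invented for illustration.

```python
# Static map of reader antennas to physical locations (illustrative).
READER_LOCATIONS = {
    "reader-01": "Ward A",
    "reader-02": "Ward B",
    "reader-03": "Radiology",
}

def update_locations(events, locations=None):
    """Fold a time-ordered stream of (tag_id, reader_id, timestamp)
    read events into a last-seen-location table per tag."""
    locations = {} if locations is None else locations
    for tag_id, reader_id, timestamp in events:
        room = READER_LOCATIONS.get(reader_id, "unknown")
        locations[tag_id] = (room, timestamp)
    return locations

events = [
    ("04A1B2C3", "reader-01", 100),
    ("04A1B2C3", "reader-03", 160),   # the patient moved to Radiology
]
where = update_locations(events)
```

The web frontend then only needs to render this table; all the RF complexity stays on the reader side.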
Quantitative methods based on statistics will be used in order to test the accuracy of the vein pattern
matching algorithms, and in order to evaluate the resulting proof of concept, we will then proceed to
qualitative methods such as issuing questionnaires to both end users (patients) and expert users (health-
care workers) to evaluate if the system helps reduce identification errors, is easy to use and helps in day-
to-day tasks.
Literature Survey / Resources List:
ECRI Institute, 2016. Top 10 Patient Safety Concerns for Healthcare Organizations, Executive
Brief. ECRI Institute [online] <Available from:
https://www.ecri.org/EmailResources/PSRQ/Top10/2016_Top10_ExecutiveBrief_final.pdf>
This executive brief comes from the highly respected ECRI Institute, which deals with patient safety. Com-
ing in second place, patient misidentification is acknowledged to be a very real risk in today's healthcare
environments.
Chassin, M.R. and Becher, E.C., 2002. The wrong patient. Annals of Internal Medicine, 136(11),
pp.826-833.
This paper argues that patient misidentification is under-reported and that medical literature does not ade-
quately discuss the problem. The author also points out that the most remediable problem in patient
misidentification is the absence of protocols and procedures for patient identification.
Aboalsamh, H.A., Alhashimi, H.T. and Mathkour, H.I., 2012, January. Applying Recent Vein Im-
age Enhancement Techniques In Vain Biometrics. In Proceedings of the International Conference
on Image Processing, Computer Vision, and Pattern Recognition (IPCV) (p. 1). The Steering Com-
mittee of The World Congress in Computer Science, Computer Engineering and Applied Comput-
ing (WorldComp).
This paper is useful in highlighting methods that can be used in the preprocessing stage, that is when pre-
paring a captured vein pattern image for information extraction.
Joardar, S., Chatterjee, A. and Rakshit, A., 2015. A Real-Time Palm Dorsa Subcutaneous Vein Pat-
tern Recognition System Using Collaborative Representation-Based Classification. Instrumenta-
tion and Measurement, IEEE Transactions on, 64(4), pp.959-966.
This paper is of significant interest since in it the author describes the workings of a low-cost vein pattern
recognition system based on the micro-computer known as the Raspberry Pi, the same microcomputer
which will be used in the proposed system. Also of interest is the author's discussion of using sparse
representation based classification as a means of identifying which subject a vein pattern belongs to.
Lahtela, A., Hassinen, M. and Jylha, V., 2008, January. RFID and NFC in healthcare: Safety of
hospitals medication care. In Pervasive Computing Technologies for Healthcare, 2008. Pervasive-
Health 2008. Second International Conference on (pp. 241-244). IEEE.
This paper's main premise is similar to the previous one: that RFID can help reduce medical errors.
However, this paper is of note because it also mentions NFC, a competing or complementary technology
to RFID that can also help in reducing medical errors. It would be interesting to note in the proposed sys-
tem whether technologies like NFC can be used as an alternative to the RFID technology being proposed,
and why.
Interestingly, registered nurses do not seem to think that patient misidentification is a big problem, with
only about 9% admitting to problems in a recent survey (Bártlová et al, 2015). While this is primarily an
education issue, as the authors of that same survey point out:
"education, changes in protocols, and new technologies are needed to improve the precision of patient
identification." (Bártlová et al, 2015)
The proposed system will be a direct contribution to this issue. In addition, the impact of the proposed
system can be more far-reaching than the traditional hospital / patient setting. For example, biometric
identification of patients has been shown to improve data used in healthcare research in Africa (Odei-
Lartey et al, 2016), as well as being used to further promote the use of eHealthcare systems (Kachurina et
al, 2015). The proposed system hopes to make a contribution in this respect by further exploring which
biometric techniques can be applied to help in these scenarios, as well as investigating how biometrics
can be supplemented with more traditional technology like RFID.
Last but not least, when reviewing most of the sources cited above, one notes the main focus of biomet-
rics seems to be on fingerprint and iris identification. There seems to be a lack of discussion around vein
patterns even though they have proved to be a very accurate and viable alternative. This dissertation aims
to fill this gap and identify why (or why not) vein patterns should be considered and how to integrate
them into a healthcare environment.
Evaluation Criteria:
In order to evaluate the above, the project will use a mixture of quantitative and qualitative methods. Sta-
tistical analysis and blind testing will be used to measure the accuracy of patient identification. The
project is aiming for a minimum accuracy in identification of around 90%.
In addition, a web-based UI will be used to demonstrate the ability of the system to physically track users
in a location. Last, statistical analysis of surveys will be used to evaluate the last two listed objectives
above, namely:
Survey: questionnaire results to two user groups listed below to assess if the system is ef-
fective, easy to use, and helps in healthcare day-to-day tasks.
expert users (health care workers)
end users (patients)
The survey will use a rating-based system to gauge users' experience of the system.
Resource Plan:
Hardware
Table 10: Required hardware, provider and associated costs
Part Description | Provider | Cost
Raspberry Pi w/ appropriate PSU & SD Card | https://www.adafruit.com/ | $50
Software
The software that is going to be used is all written from scratch, based on the Python programming lan-
guage and using open source libraries, hence there are no associated costs.
Personnel
No additional personnel will be required during the design and implementation of the system. However,
personnel in the form of end users and expert users to test and assess the system will be required at the
end of the project.
Image Acquisition
Image Enhancement
Image classification (who does
this vein pattern belong to?)
Map vein pattern to RFID tag
Process RFID tags and display
on website
Test
Image classification accuracy
RFID tag range
End-to-End usability
Implement
Acceptance
End User Questionnaire
Expert User Questionnaire
Draft Dissertation Report
Risk Assessment:
before committing to a specific vein pattern recognition algorithm
Quality Assurance
The project implementation will be split into stages that will happen sequentially. The first stage is build-
ing and testing the hardware components of the system. The second stage is building and testing the soft-
ware components of the system.
During the hardware stage, quality assurance will measure success as follows:
Once the above two stages pass QA, we will proceed to the software stage, where QA would consist of
building a testing framework. The testing framework will split the images captured in the previous stage
into training and test sets. The training image set will be used to train the AI image classification algorithms, while the test set will be left to gauge the accuracy of the algorithms. Each image in the test set
will be labelled, and the output of the algorithm under test will be compared to the labels. In this way, the
accuracy of the algorithm can be determined by checking how many images were assigned the correct la-
bel. Once the framework has been finalized, QA success is determined by maximizing the accuracy of
the algorithm under test. The following approach will be taken:
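A minimal sketch of the label-comparison accuracy check described above (the file names and the `classify` callable are illustrative placeholders, following the filename-prefix labelling convention used in the appendix code):

```python
import os

def accuracy(test_images, classify):
    # test_images: file paths whose true label is the filename prefix before '_'
    # classify: callable mapping an image path to a predicted label
    correct = 0
    for path in test_images:
        true_label = os.path.basename(path).split('_')[0]
        if classify(path) == true_label:
            correct += 1
    return float(correct) / len(test_images)

# Stub classifier that always predicts subject 's1', for demonstration only:
images = ['s1_001.jpg', 's1_002.jpg', 's2_001.jpg']
print(accuracy(images, lambda path: 's1'))
```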
Once the image enhancement and classification algorithms have passed QA (we expect to have at least a
90% accuracy rate), we then proceed to perform a technical end-to-end test of the system. At this stage, a
successful QA would entail the following sequence of events:
If steps 1-5 are completed successfully, then the project would have passed technical QA. The project
will now proceed to a subjective QA in which two sets of users fill in a questionnaire:
End users (subjects that are identified and tracked by the system) will describe how easy or difficult it was to use the system, and whether the system was a source of discomfort or impeded their experience in some form. They will also be asked whether they felt confident the system would help reduce identification errors.
Expert users (users who actually operate the system, such as hospital personnel) will be
asked how easy or difficult it was to use the system, and if the system helps them in their
day-to-day tasks by easily identifying and locating patients. They will also be asked if such
a system will help reduce medical errors if integrated with other medical systems present
in the facility.
References
Bártlová, S., Hajduchová, H., Brabcová, I. and Tóthová, V., 2015. Patient misidentification in nursing care. Neuro Endocrinology Letters, 36(Suppl. 2), pp.17-22.
Kachurina, P., Buccafurri, F., Bershadskaya, L., Bershadskaya, E. and Trutnev, D., 2015. Biometric Identification in eHealthcare: Learning from the Cases of Russia and Italy. In Electronic Government and the Information Systems Perspective (pp. 103-116). Springer International Publishing.
Odei-Lartey, E.O., Boateng, D., Danso, S., Kwarteng, A., Abokyi, L., Amenga-Etego, S., Gyaase, S., Asante, K.P. and Owusu-Agyei, S., 2016. The application of a biometric identification technique for linking community and hospital data in rural Ghana. Global Health Action, 9.
Thomas, P. and Evans, C., 2004. An identity crisis? Aspects of patient misidentification. AVMA Medical & Legal Journal, 10(1), pp.18-22.
Uy, R.C.Y., Kury, F.P. and Fontelo, P.A., 2015. The State and Trends of Barcode, RFID, Biometric and
Pharmacy Automation Technologies in US Hospitals. In AMIA Annual Symposium Proceedings (Vol.
2015, p. 1242). American Medical Informatics Association.
APPENDIX B. USER INTERFACE SCREENSHOTS
Figure 47: Administrator > Map Locations Settings Page
Figure 49: Administrator > Enrollment > Patient Profiles
Figure 51: Operator > Audit Screen
APPENDIX C. CODE LISTING
pre_process.py
from skimage.morphology import opening
from skimage.color import rgb2gray
from skimage import data, exposure
from skimage.morphology import disk
from skimage.transform import downscale_local_mean, resize
from skimage.exposure import rescale_intensity
import matplotlib.pyplot as plt
# BEGIN FUNCTION DEFINITIONS
def load_crop_gray(image, debug=False):
image1 = data.load(image)
if debug:
plt.title('Original Image')
plt.imshow(image1, cmap=plt.cm.gray)
plt.show()
if debug:
plt.title('Cropped Image')
plt.imshow(image1, cmap=plt.cm.gray)
plt.show()
image1 = rgb2gray(image1)
if debug:
plt.title('Grayscale Image')
plt.imshow(image1, cmap=plt.cm.gray)
plt.show()
image1 = downscale_local_mean(image1, (25, 25))
image1 = rescale_intensity(image1)
if debug:
plt.title('Downscaled Image')
plt.imshow(image1, cmap=plt.cm.gray)
plt.show()
selem = disk(4)
opened = opening(image1, selem)
image1 = rescale_intensity(opened)
if debug:
plt.title('Feature Reconstruction / Enhancement')
plt.imshow(image1, cmap=plt.cm.gray)
plt.show()
return image1
def pre_process_dave(image, debug=False):
image1 = load_crop_gray(image, debug)
img_eq1 = exposure.equalize_adapthist(image1)#, clip_limit=0.3)
img_eq1 = rescale_intensity(img_eq1)
if debug:
plt.title('Adaptive Histogram Equalization')
plt.imshow(img_eq1, cmap=plt.cm.gray)
plt.show()
return img_eq1
webserver.py
from flask import request, session
from flask import render_template
from flask import Flask
from werkzeug.utils import secure_filename
import numpy as np
from numpy.random import RandomState
from sklearn.decomposition import MiniBatchDictionaryLearning
from sklearn.neighbors import DistanceMetric
import sqlite3, json
import glob, os
import cPickle as pickle
import sys
from pre_process import pre_process_dave
app = Flask(__name__)
# set the secret key. keep this really secret:
app.secret_key = 'lbfsO20498U9WE08HJFD89EWQTFCDHUKJASHDFAO87Glkgads'
UPLOAD_FOLDER = '/tmp'
ALLOWED_EXTENSIONS = set(['jpg', 'jpeg', 'png'])
app.config['UPLOAD_FOLDER'] = UPLOAD_FOLDER
############### BEGIN UTILS SECTION ##############
def allowed_file(filename):
print filename
return '.' in filename and \
filename.rsplit('.', 1)[1] in ALLOWED_EXTENSIONS
def getPatientIdentifier(filename):
patientIdentifier = os.path.basename(filename).split('_')[0]
patientIdentifier = patientIdentifier.split('.')[0]
return patientIdentifier
def getPatientNumericalLabel(patientIdentifier):
try:
patientIdentifier = int(patientIdentifier)
    except ValueError:
patientIdentifier=''.join([str(ord(c)) for c in patientIdentifier])
return patientIdentifier
############### BEGIN DB SECTION ###############
def connectToDB(dictionary=False):
conn = sqlite3.connect('bioRFID.db')
if dictionary:
conn.row_factory = sqlite3.Row
return conn
def createTablesDB():
conn = connectToDB()
# create rfid readers table
conn.execute('''CREATE TABLE IF NOT EXISTS rfidReaders
(ID INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
ipAddress TEXT NOT NULL);''')
print "rfidReaders table created successfully"
# create rfid antennas to location mapping table
conn.execute('''CREATE TABLE IF NOT EXISTS rfidAntennas
(ID INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
readerID INTEGER NOT NULL,
antennaID INTEGER NOT NULL,
locationName TEXT NOT NULL);''')
print "rfidAntennas table created successfully"
# create patient table
conn.execute('''CREATE TABLE IF NOT EXISTS patients
(ID INTEGER PRIMARY KEY AUTOINCREMENT,
name TEXT NOT NULL,
rfidCode TEXT NOT NULL,
sparseCode TEXT NOT NULL);''')
print "patients table created successfully"
# create rfid tag reads table
# data should be in the form:
# reader_ip,event.tag.arrive tag_id={},antenna={},rssi={},timestamp
conn.execute('''CREATE TABLE IF NOT EXISTS rfidTagReads
(ID INTEGER PRIMARY KEY AUTOINCREMENT NOT NULL,
timestamp DATETIME NOT NULL,
readerID INTEGER NOT NULL,
antennaID INTEGER NOT NULL,
tagID TEXT NOT NULL);''')
print "rfidTagReads table created successfully"
conn.commit()
conn.close()
def getLocations(readerID=None):
conn = connectToDB()
if readerID:
        cursor = conn.execute("SELECT ipAddress, locationName, antennaID FROM rfidAntennas INNER JOIN rfidReaders ON "
                              "rfidReaders.ID = readerID WHERE readerID="+str(readerID)+";")
else:
        cursor = conn.execute("SELECT ipAddress, locationName, antennaID FROM rfidAntennas INNER JOIN rfidReaders ON "
                              "rfidReaders.ID = readerID;")
result = cursor.fetchall()
conn.commit()
conn.close()
return result
def getReaders():
conn = connectToDB()
cursor = conn.execute("SELECT ID, ipAddress FROM rfidReaders;")
result = cursor.fetchall()
conn.commit()
conn.close()
return result
def getPatientProfiles():
conn = connectToDB()
cursor = conn.execute("SELECT name, rfidCode FROM patients;")
result = cursor.fetchall()
conn.commit()
conn.close()
return result
def getPatientBiometrics():
conn = connectToDB()
cursor = conn.execute("SELECT name, sparseCode FROM patients;")
result = cursor.fetchall()
conn.commit()
conn.close()
return result
def filterReadsByLocation(location):
conn = connectToDB()
    cursor = conn.execute("SELECT name, timestamp, tagID, locationName FROM rfidTagReads INNER JOIN "
                          "patients ON patients.rfidCode = tagID INNER JOIN "
                          "rfidAntennas ON rfidTagReads.readerID = rfidAntennas.readerID AND "
                          "rfidTagReads.antennaID = rfidAntennas.antennaID WHERE "
                          "locationName='"+location+"' ORDER BY date(timestamp) DESC LIMIT 1;")
result = cursor.fetchall()
conn.commit()
conn.close()
return result
def getAllReads():
conn = connectToDB()
    cursor = conn.execute("SELECT name, timestamp, tagID, locationName FROM rfidTagReads INNER JOIN "
                          "patients ON patients.rfidCode = tagID INNER JOIN "
                          "rfidAntennas ON rfidTagReads.readerID = rfidAntennas.readerID "
                          "AND rfidTagReads.antennaID = rfidAntennas.antennaID;")
result = cursor.fetchall()
conn.commit()
conn.close()
return result
def lastKnownLocation(patientName):
conn = connectToDB()
    cursor = conn.execute("SELECT name, timestamp, tagID, locationName FROM rfidTagReads INNER JOIN "
                          "patients ON patients.rfidCode = tagID INNER JOIN "
                          "rfidAntennas ON rfidTagReads.readerID = rfidAntennas.readerID "
                          "AND rfidTagReads.antennaID = rfidAntennas.antennaID WHERE name='" + patientName + "' "
                          "ORDER BY date(timestamp) DESC LIMIT 1;")
result = cursor.fetchone()
conn.commit()
conn.close()
return result
def filterReadsByTag(tagID):
conn = connectToDB()
    cursor = conn.execute("SELECT name, timestamp, tagID, locationName FROM rfidTagReads INNER JOIN "
                          "patients ON patients.rfidCode = tagID INNER JOIN "
                          "rfidAntennas ON rfidTagReads.readerID = rfidAntennas.readerID "
                          "AND rfidTagReads.antennaID = rfidAntennas.antennaID WHERE tagID='"+tagID+"';")
result = cursor.fetchall()
conn.commit()
conn.close()
return result
def insertPatientDB(name, rfidCode, sparseCode = '×'):
conn = connectToDB()
conn.execute("INSERT INTO patients (name, rfidCode, sparseCode) \
VALUES ('"+name+"', '"+rfidCode+"', '"+sparseCode+"' );")
conn.commit()
conn.close()
def deletePatientDB(name):
conn = connectToDB()
conn.execute("DELETE FROM patients WHERE name='"+name+"';")
conn.commit()
conn.close()
def updatePatientBioDB(name, sparseCode):
conn = connectToDB()
    cursor = conn.execute("SELECT name FROM patients WHERE name = '" + name + "';")
    if cursor.fetchone():
        conn.execute("UPDATE patients set sparseCode='"+sparseCode+"' WHERE name='"+name+"';")
else:
conn.execute("INSERT INTO patients (name, rfidCode, sparseCode) "
"VALUES ('" + name+ "', '', '" + sparseCode + "');")
conn.commit()
conn.close()
def updatePatientRfidDB(name, rfidCode):
conn = connectToDB()
    conn.execute("UPDATE patients set rfidCode='"+rfidCode+"' WHERE name='"+name+"';")
conn.commit()
conn.close()
def insertRfidReaderDB(ipAddress):
conn = connectToDB()
conn.execute("INSERT INTO rfidReaders (ipAddress) \
VALUES ('"+ipAddress+"' )")
conn.commit()
conn.close()
def deleteRfidReaderDB(ipAddress):
conn = connectToDB()
    cursor = conn.execute("SELECT ID FROM rfidReaders WHERE ipAddress='"+ipAddress+"';")
result = cursor.fetchone()
if result is not None:
readerID = result[0]
        conn.execute("DELETE FROM rfidAntennas WHERE readerID="+str(readerID)+";")
conn.execute("DELETE FROM rfidReaders WHERE ipAddress='"+ipAddress+"';")
conn.commit()
conn.close()
def insertRfidAntennaDB(ipAddress, antennaID, locationName):
conn=connectToDB()
    cursor = conn.execute("SELECT ID from rfidReaders WHERE ipAddress='"+ipAddress+"';")
result = cursor.fetchone()
rfidReaderId = result[0]
    conn.execute("INSERT INTO rfidAntennas (readerID, antennaID, locationName) "
                 "VALUES ("+str(rfidReaderId)+","+str(antennaID)+",'"+str(locationName)+"');")
conn.commit()
conn.close()
def deleteRfidAntennaDB(locationName=None, ipAddress=None, antennaID=None):
conn=connectToDB()
if locationName:
        conn.execute("DELETE from rfidAntennas WHERE locationName='"+locationName+"';")
elif ipAddress and antennaID:
        cursor = conn.execute("SELECT ID from rfidReaders WHERE ipAddress='"+ipAddress+"';")
        result = cursor.fetchone()
        rfidReaderId = result[0]
        conn.execute("DELETE from rfidAntennas WHERE readerID="+str(rfidReaderId)+" AND antennaID="+str(antennaID)+";")
conn.commit()
conn.close()
############### END DB SECTION ###############
############### BEGIN SPARSE CODE SECTION ###############
rng = RandomState(0)
def buildSparseDict(trainingDir='/home/dvas0004/Pictures/chime/NEW_RIG/train'):
numberOfTrainingFiles = 0
trainingDir = trainingDir.rstrip('/')
#build sparse code dictionary
print "building dictionary data"
dictionary_data = {}
training_images = glob.glob(trainingDir+'/*.jpg')
biggestImageSize = 0
for training_image in training_images:
print 'Processing: '+training_image
numberOfTrainingFiles += 1
threshold_image = pre_process_dave(training_image, debug=False)
orig_data = np.reshape(threshold_image,(1, -1)).astype(float)
mean = np.mean(orig_data)
std = np.std(orig_data)
data = orig_data
data -= mean
data /= std
imageSize=np.size(data)
if imageSize > biggestImageSize:
biggestImageSize = imageSize
patientIdentifier = getPatientIdentifier(training_image)
try:
dictionary_data[patientIdentifier].append(data)
except KeyError:
dictionary_data[patientIdentifier] = [data]
# define mini batch dictionary
    mbdl = MiniBatchDictionaryLearning(n_components=30, transform_n_nonzero_coefs=30,
                                       transform_alpha=0.01, alpha=0.001,
                                       n_iter=50, batch_size=15,
                                       random_state=100, shuffle=True,
                                       split_sign=False, n_jobs=-1, transform_algorithm='lars')
# fit sparse code dictionary
# fit data to array
mdbl_data= np.zeros((numberOfTrainingFiles, biggestImageSize))
counter = 0
for patient in dictionary_data:
for data in dictionary_data[patient]:
mdbl_data[counter] = data
counter += 1
    print "fitting sparse code dictionary using "+str(numberOfTrainingFiles)+" training images..."
mbdl.fit(mdbl_data)
print "saving dictionary..."
pickle.dump(mbdl, open("sparse_dict.p", "wb"))
def getImageSparseCode(imageFilename):
# build sparse codes for training set, based on the saved dictionary
print "building sparse codes for template image..."
print 'Processing: '+imageFilename
threshold_image = pre_process_dave(imageFilename, debug=False)
orig_data = np.reshape(threshold_image,(1, -1)).astype(float)
mean = np.mean(orig_data)
std = np.std(orig_data)
data = orig_data
data -= mean
data /= std
#load dictionary
mbdl = pickle.load(open("sparse_dict.p", "rb"))
    templateSparseCode = mbdl.transform(data)
    return templateSparseCode
def initTemplateSparseStore(trainingDir='/home/dvas0004/Pictures/chime/NEW_RIG/train', debug=False):
trainingDir = trainingDir.rstrip('/')
#build template sparse code store
print "building template sparse code store"
templateSparseCodeStore = {}
if debug:
import matplotlib.pyplot as plt
pca_data=[]
pca_labels=[]
subjectsDone=[]
training_images = glob.glob(trainingDir+'/*.jpg')
for training_image in training_images:
print 'Processing: '+training_image
patientIdentifier = getPatientIdentifier(training_image)
templateSparseCode = getImageSparseCode(training_image)
try:
templateSparseCodeStore[patientIdentifier].append(templateSparseCode)
except KeyError:
templateSparseCodeStore[patientIdentifier] = [templateSparseCode]
if debug:
#if patientIdentifier not in subjectsDone:
pca_data.append(templateSparseCode[0])
pca_labels.append(patientIdentifier)
subjectsDone.append(patientIdentifier)
updatePatientBioDB(patientIdentifier, '✔')
if debug:
from sklearn.decomposition import PCA
pca = PCA(n_components=2)
pca_data = pca.fit_transform(pca_data)
dist = DistanceMetric.get_metric('euclidean')
print '***************'
print np.mean(dist.pairwise(pca_data))
counter=0
x_d1=[]
x_d2=[]
x_d3=[]
y_d1=[]
y_d2=[]
y_d3=[]
d1 = []
d2 = []
d3 = []
for dp in pca_data:
if pca_labels[counter].endswith('Right'):
d1.append([dp[0],dp[1]])
x_d1.append(dp[0])
y_d1.append(dp[1])
else:
d2.append([dp[0],dp[1]])
x_d2.append(dp[0])
y_d2.append(dp[1])
plt.annotate(
pca_labels[counter],
xy = (dp[0], dp[1]), xytext = (-20, 20),
textcoords = 'offset points', ha = 'right', va = 'bottom',
bbox = dict(boxstyle = 'round,pad=0.5', fc = 'yellow', alpha = 0.5),
arrowprops = dict(arrowstyle = '->', connectionstyle = 'arc3,rad=0'))
counter += 1
plt.plot(x_d1, y_d1, 'ro', x_d2, y_d2, 'bo')#, x_d3, y_d3, 'go')
plt.show()
print "saving template sparse code store..."
pickle.dump(templateSparseCodeStore, open("template_sparse_store.p", "wb"))
def addTemplateCodeToStore(templateFilename, recordToDB=False):
templateSparseCode = getImageSparseCode(templateFilename)
patientIdentifier = getPatientIdentifier(templateFilename)
#load template sparse code store
templateSparseCodeStore = pickle.load(open("template_sparse_store.p", "rb"))
try:
templateSparseCodeStore[patientIdentifier].append(templateSparseCode)
except KeyError:
templateSparseCodeStore[patientIdentifier] = [templateSparseCode]
print "saving updated template sparse code store..."
pickle.dump(templateSparseCodeStore, open("template_sparse_store.p", "wb"))
if recordToDB:
updatePatientBioDB(patientIdentifier,'✔')
def removePatientTemplateCodes(patientIdentifier):
#load template sparse code store
templateSparseCodeStore = pickle.load(open("template_sparse_store.p", "rb"))
try:
del templateSparseCodeStore[patientIdentifier]
except KeyError:
print 'Patient Sparse Codes not present in template store'
print "saving updated template sparse code store..."
pickle.dump(templateSparseCodeStore, open("template_sparse_store.p", "wb"))
############### END SPARSE CODE SECTION ###############
############### BEGIN COMPARISON SECTION ###############
def eucledianDistComparison(sparseCode):
dist = DistanceMetric.get_metric('euclidean')
currentBestDistance = None
guess1 = None
guess2 = None
currentBestGuess = None
#loadTemplate Sparse Code Store
templateSparseCodeStore = pickle.load(open("template_sparse_store.p", "rb"))
for patient in templateSparseCodeStore:
for templateSparseCode in templateSparseCodeStore[patient]:
            euc_dist = dist.pairwise(sparseCode.tolist(), templateSparseCode.tolist())
if currentBestDistance is None:
currentBestDistance = euc_dist
currentBestGuess = patient
else:
if euc_dist < currentBestDistance:
currentBestDistance = euc_dist
guess2 = guess1
guess1 = currentBestGuess
currentBestGuess = patient
    # vote among the top three candidates; on a tie the closest match wins
    guesses = [currentBestGuess, guess1, guess2]
    bestVote = 0
    votedGuess = None
    for guess in guesses:
        vote = guesses.count(guess)
        if vote > bestVote:
            bestVote = vote
            votedGuess = guess
    print "{}/{}/{}".format(currentBestGuess, guess1, guess2)
    return votedGuess
############### END COMPARISON SECTION ###############
############### BEGIN IDENTIFICATION SECTION ###############
def identifyPatient(imageFilename):
subjectSparseCode = getImageSparseCode(imageFilename)
bestGuessID = eucledianDistComparison(subjectSparseCode)
return bestGuessID
############### END IDENTIFICATION SECTION ###############
############### BEGIN WEB API SECTION ###############
@app.route('/identifyPatient', methods=['POST'])
def flaskIdentifyPatient():
if request.method == 'POST':
# check if the post request has the file part
if 'file' not in request.files:
return 'No files present in request'
file = request.files['file']
        # if the user does not select a file, the browser may
        # submit an empty part without a filename
if file.filename == '':
return 'No selected file'
if file and allowed_file(file.filename):
filename = secure_filename(file.filename)
templateAbsoluteLocation = os.path.join(app.config['UPLOAD_FOLDER'],
filename)
file.save(templateAbsoluteLocation)
bestGuessID = identifyPatient(templateAbsoluteLocation)
return bestGuessID
else:
return 'File not in allowed Extensions'
@app.route('/uploadPatientTemplate', methods=['POST'])
def uploadPatientTemplate():
if request.method == 'POST':
# check if the post request has the file part
if 'file' not in request.files:
return 'No files present in request'
file = request.files['file']
        # if the user does not select a file, the browser may
        # submit an empty part without a filename
if file.filename == '':
return 'No selected file'
if file and allowed_file(file.filename):
filename = secure_filename(file.filename)
templateAbsoluteLocation = os.path.join(app.config['UPLOAD_FOLDER'],
filename)
file.save(templateAbsoluteLocation)
addTemplateCodeToStore(templateAbsoluteLocation)
return 'OK'
else:
return 'File not in allowed Extensions'
@app.route('/setUserType', methods=['POST'])
def setUserType():
userType = request.form['userType']
session['userType'] = userType
return 'OK'
@app.route('/addRfidReader', methods=['POST'])
def addRfidReader():
ipAddress = request.form['ipAddress']
insertRfidReaderDB(ipAddress)
return 'OK'
@app.route('/delRfidReader', methods=['POST'])
def delRfidReader():
ipAddress = request.form['ipAddress']
deleteRfidReaderDB(ipAddress)
return 'OK'
@app.route('/addLocation', methods=['POST'])
def addLocation():
locationName = request.form['location']
ipAddress = request.form['ipAddress']
antennaID = request.form['antennaID']
insertRfidAntennaDB(ipAddress,antennaID,locationName)
return 'OK'
@app.route('/addPatientProfile', methods=['POST'])
def addPatientProfile():
patientName = request.form['name']
rfidCode = request.form['rfidCode']
insertPatientDB(patientName,rfidCode)
return 'OK'
@app.route('/updatePatientRfid', methods=['POST'])
def flaskUpdatePatientRfid():
patientName = request.form['patient']
newRFID = request.form['rfid']
updatePatientRfidDB(patientName, newRFID)
return 'OK'
@app.route('/delLocation', methods=['POST'])
def delLocation():
locationName = request.form['location']
deleteRfidAntennaDB(locationName=locationName)
return 'OK'
@app.route('/delPatientProfile', methods=['POST'])
def delPatientProfile():
patientName = request.form['patient']
deletePatientDB(patientName)
removePatientTemplateCodes(patientName)
return 'OK'
@app.route('/getReaders', methods=['POST'])
def flaskGetReaders():
results = getReaders()
resultArray = []
for result in results:
resultArray.append({'id':str(result[0]),'ip':str(result[1])})
return json.dumps(resultArray)
@app.route('/getLocations', methods=['POST'])
def flaskGetLocations():
results = getLocations()
resultArray = []
for result in results:
        resultArray.append({'ip':str(result[0]),'location':str(result[1]),'antenna':str(result[2])})
return json.dumps(resultArray)
@app.route('/getPatientProfiles', methods=['POST'])
def flaskGetPatientProfiles():
results = getPatientProfiles()
resultArray = []
for result in results:
resultArray.append({'name':str(result[0]),'rfid':str(result[1])})
return json.dumps(resultArray)
@app.route('/getPatientBiometrics', methods=['POST'])
def flaskgetPatientBiometrics():
results = getPatientBiometrics()
resultArray = []
for result in results:
resultArray.append({'name':str(result[0]),'sparse':str(result[1])})
return json.dumps(resultArray)
@app.route('/filterLocation', methods=['POST'])
def filterLocation():
locationName = request.form['location']
result = filterReadsByLocation(locationName)
return json.dumps(result)
@app.route('/getAllRecords', methods=['POST'])
def flaskGetAllRecords():
result = getAllReads()
return json.dumps(result)
@app.route('/filterTag', methods=['POST'])
def filterTag():
tagID = request.form['tagID']
result = filterReadsByTag(tagID)
return json.dumps(result)
@app.route('/lastKnown', methods=['POST'])
def lastKnown():
name = request.form['name']
result = lastKnownLocation(name)
return json.dumps(result)
@app.route('/buildSparseDict', methods=['POST'])
def flaskBuildSparseDict():
trainingFolder = request.form['trainingFolder']
buildSparseDict(trainingDir=trainingFolder)
initTemplateSparseStore(trainingDir=trainingFolder)
return 'OK'
@app.route('/echoer', methods=['POST'])
def echoer():
print request.form
return str(request.form)
############### END WEB API SECTION ###############
############### BEGIN WEB FRONTEND SECTION ###############
@app.route('/')
def login():
return render_template('login.html')
@app.route('/admin')
def adminPage():
return render_template('admin.html')
@app.route('/operator')
def operatorPage():
return render_template('operator.html')
############### END WEB FRONTEND SECTION ###############
############### BEGIN TESTING SECTION ###############
def buildConfusionMatrix():
#clear previous runs
    try:
        os.remove("sparse_dict.p")
        os.remove("template_sparse_store.p")
    except OSError:
        pass
#start by building sparse dictionary
print "Building Sparse Dictionary"
    trainingFolder = '/home/dvas0004/Dropbox/Masters/Dissertation/lensPics/samples/train/'
buildSparseDict(trainingDir=trainingFolder)
initTemplateSparseStore(trainingDir=trainingFolder, debug=False)
# print "Adding Templates"
    # templatesFolder = '/home/dvas0004/Dropbox/Masters/Dissertation/Pcitures/train/templates'
# template_images = glob.glob(templatesFolder+'/*.jpg')
# for template_image in template_images:
# addTemplateCodeToStore(template_image)
#start identifying patients
print "Starting Testing"
    testingFolder = '/home/dvas0004/Dropbox/Masters/Dissertation/lensPics/samples/test/'
testing_images = glob.glob(testingFolder+'/*.jpg')
totalNumberTested = 0
correctlyIdentified = 0
incorrectGuesses = []
for testing_image in testing_images:
patientIdentifier = getPatientIdentifier(testing_image)
totalNumberTested += 1
patientGuess = identifyPatient(testing_image)
print "Patient Guess: {}".format(patientGuess)
if patientIdentifier == patientGuess:
correctlyIdentified += 1
else:
incorrectGuesses.append(patientIdentifier)
percentageCorrect = (float(correctlyIdentified)/totalNumberTested)*100
incorrectlyIdentified = totalNumberTested - correctlyIdentified
percentageIncorrect = (float(incorrectlyIdentified)/totalNumberTested)*100
print "--------------------------------------------------"
print "Results: "
print "--------------------------------------------------"
    print "Total number of testing images:   {}".format(totalNumberTested)
    print "Correctly identified images:      {} / {}%".format(correctlyIdentified, percentageCorrect)
    print "Incorrectly identified images:    {} / {}%".format(incorrectlyIdentified, percentageIncorrect)
print incorrectGuesses
print "--------------------------------------------------"
############### END TESTING SECTION ###############
createTablesDB()
try:
if sys.argv[1] == "testing":
buildConfusionMatrix()
exit(0)
else:
app.run(host='0.0.0.0', port=5001, debug=True)
except IndexError:
app.run(host='0.0.0.0', port=5001, debug=True)
raspiClient.py
# setup:
# pip install requests
# pip install
import sys
import RPi.GPIO as GPIO
from picamera import PiCamera
import requests
from PyQt4 import QtGui, QtCore
from PyQt4.QtGui import *
class veinCamera(object):
def __init__(self, patientID, URL):
self.camera = PiCamera()
self.patientID = patientID
self.URL = URL
self.veinPhoto = ''
GPIO.setmode(GPIO.BCM)
GPIO.setup(4, GPIO.IN)
def takePicture(self):
self.veinPhoto = str('/home/pi/Desktop/' + self.patientID + '.jpg')
picOutput = open(self.veinPhoto, 'wb')
print self.veinPhoto
self.camera.start_preview()
while GPIO.input(4) != 0:
pass
self.camera.capture(self.veinPhoto)
self.camera.stop_preview()
picOutput.close()
self.camera.close()
def postPicture(self):
files = {'file': open(self.veinPhoto, 'rb')}
r = requests.post(self.URL, files=files)
return r.text
class Initial_Window(QtGui.QWidget):
def __init__(self):
QtGui.QWidget.__init__(self)
self.button_id = QtGui.QPushButton('Identify', self)
self.button_enroll = QtGui.QPushButton('Enroll', self)
self.button_id.clicked.connect(self.handleButton_id)
self.button_enroll.clicked.connect(self.handleButton_enroll)
layout = QtGui.QVBoxLayout(self)
layout.addWidget(self.button_id)
layout.addWidget(self.button_enroll)
self.setWindowTitle('BioRFID')
self.resize(320,240)
self.patientID = ''
def handleButton_id(self):
url = 'http://192.168.2.233:5001/identifyPatient'
vc = veinCamera('unknown', url)
vc.takePicture()
bestGuess = vc.postPicture()
msg = QMessageBox()
msg.setIcon(QMessageBox.Information)
msg.setText("Patient Best Guess:")
msg.setInformativeText('<strong>'+bestGuess+'</strong>')
msg.setWindowTitle("BioRFID")
if (bestGuess=='s1'):
msg.setIconPixmap(QPixmap("/home/pi/Desktop/1cf11d7.jpg"))
else:
msg.setIconPixmap(QPixmap("/home/pi/Desktop/1cf11d6.jpg"))
msg.exec_()
def handleButton_enroll(self):
print self.patientID
input = QInputDialog()
input.setTextValue(self.patientID)
        text, ok = input.getText(self, 'BioRFID Patient Enroll', 'Enter patient name:', text=self.patientID)
if ok:
print text
self.patientID = text
url = 'http://192.168.2.233:5001/uploadPatientTemplate'
vc = veinCamera(self.patientID,url)
vc.takePicture()
result = vc.postPicture()
msg = QMessageBox()
msg.setIcon(QMessageBox.Information)
msg.setText("Patient Submitted")
msg.setInformativeText(result)
msg.setWindowTitle("BioRFID")
msg.exec_()
app = QtGui.QApplication(sys.argv)
window = Initial_Window()
window.show()
sys.exit(app.exec_())