B.TECH
by
PRINCE KADIWAR(14BIT0089)
April, 2018
DECLARATION
I hereby declare that the project report entitled “FACE RECOGNITION BASED CLASS ATTENDANCE”, submitted by me for the award of the degree of B.Tech. to VIT, is a record of bonafide work carried out by me under the supervision of DR. SWARNA PRIYA R. M. I further declare that the work reported in this project has not been submitted and will not be submitted, either in part or in full, for the award of any other degree or diploma in this institute or any other institute or university.
Place: Vellore
This is to certify that the project report entitled “FACE RECOGNITION BASED CLASS ATTENDANCE”, submitted by PRINCE KADIWAR (14BIT0089), SITE, VIT, Vellore for the award of the degree of B.Tech., is a record of bonafide work carried out by him under my supervision.
The contents of this report have not been submitted and will not be submitted, either in part or in full, for the award of any other degree or diploma in this institute or any other institute or university. The project report fulfills the requirements and regulations of VIT and in my opinion meets the necessary standards for submission.
This project mainly addresses the building of a face recognition system using Principal Component Analysis (PCA). PCA is a statistical approach used to reduce the number of variables in face recognition. In PCA, every image in the training set is represented as a linear combination of weighted eigenvectors called eigenfaces. These eigenvectors are obtained from the covariance matrix of the training image set. The weights are found after selecting a set of the most relevant eigenfaces. Recognition is performed by projecting a test image onto the subspace spanned by the eigenfaces, and classification is then done by measuring the minimum Euclidean distance. A number of experiments were carried out to evaluate the performance of the face recognition based attendance management system.
ACKNOWLEDGEMENT
It is my pleasure to express my deep sense of gratitude to DR. SWARNA PRIYA R M and DR. PARIMALAM M., School of Information Technology and Engineering, Vellore Institute of Technology, for their constant guidance, continual encouragement and understanding; more than all, they taught me patience in my endeavor. My association with them is not confined to academics only; it has been a great opportunity to work with intellectuals and experts in the field of machine learning.
I would like to express my gratitude to the chancellor DR. G. VISWANATHAN, the vice-presidents DR. SANKAR VISWANATHAN, DR. SEKAR VISWANATHAN and MR. G. V. SELVAM, the vice-chancellor DR. ANAND A. SAMUEL, the pro vice-chancellor DR. S. NARAYANAN, and DR. ASWANI KUMAR CHERUKURI, School of Information Technology and Engineering, for providing me with an environment to work in and for their inspiration during the tenure of the course.
In a jubilant mood I express my whole-hearted thanks to Dr. DINAKARAN M., Head of the Department of Information Technology, SITE, and to all the teaching staff and members working as limbs of our university, for their selfless enthusiasm coupled with the timely encouragement showered on me, which prompted the acquisition of the requisite knowledge to complete my course of study successfully.
I would like to thank my parents for their support. It is indeed a pleasure to thank my friends who persuaded and encouraged me to take up and complete this task. Last but not least, I express my gratitude and appreciation to all those who have helped me, directly or indirectly, toward the successful completion of this project.
Place: Vellore
CONTENTS
CHAPTER 1 INTRODUCTION
1.1 CHALLENGES AND OBJECTIVES
CHAPTER 2 LITERATURE REVIEW
CHAPTER 3
CHAPTER 4
CHAPTER 5
5.4 WORKING FORMATS IN MATLAB
CHAPTER 6
CHAPTER 7 SYSTEM DESIGN
CHAPTER 8 IMPLEMENTATION
8.2 SCREENSHOTS
CHAPTER 9 TESTING STRATEGY
CHAPTER 10 CONCLUSION
REFERENCES
LIST OF FIGURES
LIST OF TABLES
4.1 OUTPUT FORMAT
CHAPTER 1
Introduction
1.1 CHALLENGES AND OBJECTIVES
The most creative and challenging phase of system development is system design. It provides the understanding and procedural details necessary for implementing the system recommended in the feasibility study. Design goes through logical and physical stages of development.
In designing a new system, the system analyst must have a clear understanding of the objectives which the design is intended to fulfill. The first step is to determine how the output is to be designed to meet the requirements of the desired output. The operational phases are handled through program construction and testing. Finally, details related to the justification of the system to the user and the organization are documented and evaluated by the management.
The final step prior to the implementation phase includes procedural flowcharts, record and report layouts, and a workable plan for implementing the candidate system.
CHAPTER 2
Literature Review
Year: 2013
The authors propose an Automated Attendance Management System based on a face detection and recognition algorithm: the system automatically detects a student when he enters the classroom and marks his attendance by recognizing him. The technique is intended to handle threats such as spoofing. The problem with this approach is that it captures only one student's image at a time, as each student enters the classroom; it is therefore time-consuming and may distract the students' attention.
Year: 2010
Automatic Control of Students' Attendance in Classrooms Using RFID. In this system students carry an RFID-tag ID card and place it on a card reader to record their attendance. RS-232 is used to connect the reader to a computer, which stores the recorded attendance in a database. This system may give rise to fraudulent use: an unauthorized student may use someone else's RFID card to enter the organization.
Year: 2011
Wireless Fingerprint Attendance Management System. This system uses an iris recognition subsystem that performs capture of the iris image, feature extraction, storage and matching. The difficulty lies in laying transmission lines in places where the topography is poor. In [4] the authors consider a system based on real-time face recognition which is reliable, secure and fast, but which needs improvement under different lighting conditions.
Title: Attendance System Using Face Recognition and Class Monitoring System
Year: 2014
Author: Prof. Arun Katara, Mr. Sudesh V. Kolhe, Mr. Amar P. Zilpe, Mr. Nikhil D. Bhele, Mr. Chetan J. Bele
In this paper, the authors propose a system that takes the attendance of students in a lecture automatically using face recognition. However, it is difficult to estimate the attendance exactly from each result of face recognition independently, because the face detection rate is not sufficiently high. The paper therefore proposes a method for estimating the attendance exactly using all the results of face recognition obtained by continuous observation, which improves the performance of the attendance estimation. The authors constructed an attendance system based on face recognition and applied it to a classroom lecture. The system uses a Raspberry Pi, with the OpenCV library installed on the Pi for face detection and recognition. The camera is connected to the Raspberry Pi, and the student database is stored on the Pi. With the help of this system, time is saved and attendance is marked automatically. The paper first reviews similar work in the field of attendance systems and face recognition, then presents the system structure and plan, and finally reports experiments carried out to support the plan. The results show that continuous observation improved the performance of the attendance estimation.
Year: 2015
Author: Hteik Htar Lwin, Aung Soe Khaing, Hla Myo Tun
Most doors are controlled by persons with the use of keys, security cards, passwords or patterns. The aim of this paper is to help users improve the door security of sensitive locations by using face detection and recognition. A face is a complex multidimensional structure and needs good computing techniques for detection and recognition. The paper comprises mainly three subsystems: face detection, face recognition and automatic door access control. Face detection is the process of detecting the region of a face in an image; here the face is detected using the Viola-Jones method, and face recognition is implemented using Principal Component Analysis (PCA). Face recognition based on PCA is generally referred to as the use of eigenfaces. If a face is recognized, it is known; otherwise it is unknown. The door opens automatically for a known person on a command from the microcontroller, while an alarm rings for an unknown person. Since PCA reduces the dimensions of face images without losing important features, facial images of many persons can be stored in the database; even when many training images are used, computational efficiency does not decrease significantly. Therefore, face recognition using PCA can be more useful for a door security system than other face recognition schemes.
CHAPTER 3
3.1 INTRODUCTION
Digital image processing refers to the processing of images in digital form. Modern cameras may take images directly in digital form, but images generally originate in optical form. They are captured by video cameras and digitized. The digitization process includes sampling and quantization. These images are then processed by at least one of the following five fundamental processes:
Image Enhancement
Image Restoration
Image Analysis
Image Compression
Image Synthesis
3.2.1 IMAGE ENHANCEMENT
Image enhancement operations improve the quality of an image, for example by improving its contrast and brightness characteristics, reducing its noise content, or sharpening its details. Enhancement only makes the same information more understandable; it does not add any information to the image.
3.2.2 IMAGE RESTORATION
Image restoration, like enhancement, improves the quality of an image, but its operations are based on known or measured degradations of the original image. Image restoration is used to restore images with problems such as geometric distortion, improper focus, repetitive noise, and camera motion, i.e. to correct images for known degradations.
3.2.3 IMAGE COMPRESSION
Image compression and decompression reduce the data content necessary to describe the image. Most images contain a lot of redundant information, and compression removes these redundancies. Because compression reduces the size of an image, it can be stored or transported efficiently; the compressed image is decompressed when displayed. Lossless compression preserves the exact data of the original image, whereas lossy compression does not reproduce the original image exactly but provides excellent compression ratios.
3.2.4 IMAGE SYNTHESIS
Image synthesis operations create images from other images or from non-image data. They generally create images that are either physically impossible or impractical to acquire.
3.3 APPLICATIONS OF DIGITAL IMAGE PROCESSING
Digital image processing has a broad spectrum of applications, such as remote sensing via satellites and other spacecraft, image transmission and storage for business applications, medical processing, radar, sonar and acoustic image processing, robotics, and automated inspection of industrial parts.
3.3.3 COMMUNICATION
It is used in document scanning and transmission: converting paper documents to digital image form, compressing the image, and storing it on magnetic tape. It is also used in document reading, for automatically detecting and recognizing printed characters.
3.3.6 DEFENSE/INTELLIGENCE
CHAPTER 4
MATLAB's built-in functions provide excellent tools for linear algebra computations, data analysis, signal processing, optimization, numerical solution of ODEs, quadrature, and many other types of scientific computation. Most of these functions use state-of-the-art algorithms. There are numerous functions for 2-D and 3-D graphics, and MATLAB even provides an external interface to run programs written in other languages from within MATLAB. The user, however, is not limited to the built-in functions; he can write his own functions in the MATLAB language, and once written these functions behave just like the built-in ones. MATLAB's language is very easy to learn and to use.
There are several optional 'toolboxes' available from the developers of MATLAB. These toolboxes are collections of functions written for special applications, such as the Symbolic Computation Toolbox, Image Processing Toolbox, Statistics Toolbox, Neural Networks Toolbox, Communications Toolbox, Signal Processing Toolbox, Filter Design Toolbox, Fuzzy Logic Toolbox, Wavelet Toolbox, Database Toolbox, Control System Toolbox, Bioinformatics Toolbox, and Mapping Toolbox.
4.3 BASICS OF MATLAB
On all Unix systems, Macs, and PCs, MATLAB works through three basic windows:
a. Command window
This is the main window, characterized by the MATLAB command prompt '>>'. When you launch the application, MATLAB puts you in this window. All commands, including those for running user-written programs, are typed in this window at the MATLAB prompt.
b. Graphics window
The output of all graphics commands typed in the command window is flushed to the graphics or figure window, a separate gray window with a (default) white background colour. The user can create as many figure windows as the system memory will allow.
c. Edit window
This is where you write, edit, create, and save your own programs, in files called M-files. Any text editor can be used to carry out these tasks. On most systems, such as PCs and Macs, MATLAB provides its own built-in editor. On other systems, you can invoke the edit window by typing the standard file-editing command that you normally use on your system at the MATLAB prompt, following the special character '!'. After editing is completed, control is returned to MATLAB.
4.3.2 On-Line Help
a. On-line documentation
MATLAB provides on-line help for all its built-in functions and programming language constructs. The commands lookfor, help, helpwin, and helpdesk provide on-line help.
b. Demo
MATLAB has a demonstration program that shows many of its features, including a tutorial introduction that is worth trying. Type demo at the MATLAB prompt to invoke the demonstration program, and follow the instructions on the screen.
4.3.3 Input-Output
MATLAB supports interactive computation, taking input from the screen and flushing output to the screen. In addition, it can read input files and write output files. The following features hold for all forms of input-output.
a. Data type
The fundamental data type in MATLAB is the array. It encompasses several distinct data objects: integers, doubles, matrices, character strings, and cells. In most cases, however, we never have to worry about the data type or data object declarations. For example, there is no need to declare variables as real or complex; when a real number is entered as a variable, MATLAB automatically sets the variable to be real.
b. Dimensioning
c. Case sensitivity
MATLAB is case sensitive, i.e. it differentiates between lowercase and uppercase letters; thus a and A are different variables. Most MATLAB commands and built-in function calls are typed in lowercase letters. Case sensitivity can be turned on and off with the casesen command.
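A two-line illustration of case sensitivity (the variable names are chosen just for this example):

```matlab
a = 1;  A = 2;   % two distinct variables, since MATLAB is case sensitive
a + A            % displays ans = 3
```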
d. Output display
The output of every command is displayed on the screen unless MATLAB is directed otherwise. A semicolon at the end of a command suppresses the screen output, except for graphics and on-line help commands. The following facilities are provided for controlling the screen output.
i. Paged output: To direct MATLAB to show one screen of output at a time, type more on at the MATLAB prompt. Without it, MATLAB flushes the entire output at once, without regard to the speed at which we read.
ii. Output format: Though computations inside MATLAB are performed in double precision, the appearance of floating-point numbers on the screen is controlled by the output format in use. There are several different screen output formats. The following table shows the printed value of 10*pi in different formats.
Table 4.1
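The effect of the common output formats can also be tried directly at the prompt (the digits shown are indicative; the exact display can vary between MATLAB versions):

```matlab
x = 10*pi;
format short    % default: 5 significant digits
disp(x)         % 31.4159
format long     % about 15 significant digits
disp(x)         % 31.415926535897931
format short e  % scientific notation, 5 digits
disp(x)         % 3.1416e+01
format long e   % scientific notation, 15 digits
disp(x)         % 3.141592653589793e+01
format short    % restore the default
```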
e. Command History
MATLAB saves previously typed commands in a buffer. These commands can be recalled with the up-arrow key, which helps in editing previous commands. You can also recall a previous command by typing its first few characters and then pressing the up-arrow key. On most Unix systems, MATLAB's command-line editor also understands the standard Emacs keybindings.
M-files: M-files are standard ASCII text files with a .m extension to the file name. There are two types of these files: script files and function files. Most programs we write in MATLAB are saved as M-files. All built-in functions in MATLAB are M-files, most of which reside on the computer in precompiled format. Some built-in functions are provided with source code in readable M-files so that they can be copied and modified.
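As a minimal sketch of a function M-file (the file and function names here are invented for illustration), save the following as circlearea.m; it then behaves just like a built-in function:

```matlab
function a = circlearea(r)
% CIRCLEAREA  Area of a circle of radius r (r may also be a vector).
a = pi * r.^2;   % elementwise square, so vector inputs work too
```

Calling circlearea(2) at the prompt then returns 4*pi, about 12.566.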
Mat-files: Mat-files are binary data files with a .mat extension to the file name. Mat-files are created by MATLAB when we save data with the save command. The data is written in a special format that only MATLAB can read. Mat-files can be loaded into MATLAB with the load command.
Mex-files: Mex-files are MATLAB-callable Fortran and C programs, with a .mex extension to the file name. Use of these files requires some experience with MATLAB and a lot of patience.
On PCs and Macs, menus are used for opening, writing, editing, saving and printing files, whereas on Unix machines such as Sun workstations these tasks are usually performed with Unix commands.
The project has involved handling image data in MATLAB, so below is a brief review of how images are handled. Indexed images are represented by two matrices, a colormap matrix and an image matrix:
(i) the colormap is a matrix of values representing all the colours in the image;
(ii) the image matrix contains indexes corresponding to the colormap.
A colormap matrix is of size N*3, where N is the number of different colours in the image. Each row represents the red, green, and blue components of one colour; for example, the 2*3 matrix [r1 g1 b1; r2 g2 b2] represents two colours, the first having components r1, g1, b1 and the second r2, g2, b2.
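A concrete (hypothetical) two-colour example of an indexed image and its colormap:

```matlab
% Each colormap row is [red green blue], with components in [0,1].
map = [1 0 0;     % colour 1: red   (r1 g1 b1)
       0 0 1];    % colour 2: blue  (r2 g2 b2)
X = [1 2;         % image matrix: entries are row indexes into map
     2 1];
image(X)          % display the 2-by-2 indexed image
colormap(map)    % using the colormap above
```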
The Wavelet Toolbox only supports indexed images that have linear, monotonic colormaps. Often colour images need to be pre-processed into grey-scale images before using wavelet decomposition. The Wavelet Toolbox User's Guide provides some sample code to convert colour images into grey scale, which is useful if any images need to be put into MATLAB.
This chapter dealt with an introduction to the MATLAB software which we are using for our project. The 2-D wavelet analysis, the decomposition of an image into approximations and details, and the properties of different types of wavelets will be discussed in the next chapter.
4.4 MATLAB
Matlab is a high-performance language for technical computing. It integrates computation, programming and visualization in a user-friendly environment where problems and solutions are expressed in an easy-to-understand mathematical notation. Matlab is an interactive system whose basic data element is an array that does not require dimensioning. This allows the user to solve many technical computing problems, especially those with matrix and vector operations, in less time than it would take to write a program in a scalar non-interactive language such as C or Fortran.
Matlab features a family of application-specific solutions called toolboxes. Toolboxes, which allow users to learn and apply specialized technology, are very important to most users of Matlab. These toolboxes are comprehensive collections of Matlab functions (so-called M-files) that extend the Matlab environment to solve particular classes of problems.
Matlab is a matrix-based programming tool. Although matrices often need not be dimensioned explicitly, the user always has to look carefully at matrix dimensions. If not defined otherwise, the standard matrix has two dimensions, n × m. Column vectors and row vectors are represented consistently by n × 1 and 1 × n matrices, respectively.
4.4.2 EXPRESSIONS
Matlab expressions are built from:
Variables
Numbers
Operators
Functions
4.4.3 VARIABLES
4.4.4 NUMBERS
Some examples of legal numbers are:
7 -55 0.0041 9.657838 6.10220e-10 7.03352e21 2i -2.71828j 2e3i 2.5+1.7j
4.4.5 OPERATORS
Expressions use familiar arithmetic operators and precedence rules. Some examples are:
+ Addition
- Subtraction
* Multiplication
/ Division
’ Complex conjugate transpose
( ) Parentheses to specify the evaluation order.
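Two small examples of these operators at work (the values are chosen arbitrarily):

```matlab
(1 + 2) * 3       % parentheses force the addition first: ans = 9
v = [1+2i; 3];    % a complex column vector
v'                % complex conjugate transpose: [1-2i 3]
```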
4.4.6 FUNCTIONS
CHAPTER 5
5.1 INTRODUCTION
When working with images in Matlab, there are many things to keep in mind, such as loading an image, using the right format, saving the data as different data types, how to display an image, conversion between different image formats, etc. This worksheet presents some of the commands designed for these operations. Most of these commands require the Image Processing Toolbox to be installed with Matlab; to find out whether it is installed, type ver at the Matlab prompt. This gives you a list of the toolboxes installed on your system.
For further reference on image handling in Matlab you are recommended to use Matlab's help browser, which gives access to an extensive (and quite good) on-line manual for the Image Processing Toolbox.
The first sections of this worksheet are quite heavy. The only way to understand how the presented commands work is to carefully work through the examples given at the end of the worksheet. Once you can get these examples to work, experiment on your own using your favorite image!
5.2 FUNDAMENTALS
A digital image is composed of pixels, which can be thought of as small dots on the screen. A digital image is an instruction for how to colour each pixel; we will see in detail later how this is done in practice. A typical size of an image is 512-by-512 pixels. Later in the course you will see that it is convenient to let the dimensions of the image be a power of 2; for example, 2^9 = 512. In the general case we say that an image is of size m-by-n if it is composed of m pixels in the vertical direction and n pixels in the horizontal direction.
Let us say that we have an image of the format 512-by-1024 pixels. This means that the data for the image must contain information about 524288 pixels, which requires a lot of memory! Hence, compressing images is essential for efficient image processing. You will later see how
Fourier analysis and wavelet analysis can help us to compress an image significantly. There are also a few "computer scientific" tricks (for example entropy coding) to reduce the amount of data required to store an image.
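The pixel count quoted above can be checked with a few lines (the byte counts assume 1 byte per uint8 pixel and 8 bytes per double pixel):

```matlab
m = 512; n = 1024;
npixels = m * n            % 524288 pixels
bytes_uint8  = npixels     % about 0.5 MB as a uint8 grey-scale image
bytes_double = 8*npixels   % about 4 MB as a double matrix
```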
5.3 IMAGE FORMATS SUPPORTED BY MATLAB
The following image file formats are supported by Matlab:
BMP
HDF
JPEG
PCX
TIFF
XWD
Most images you find on the Internet are JPEG images; JPEG is the name of one of the most widely used compression standards for images. If you have stored an image, you can usually see from the suffix what format it is stored in. For example, an image named myimage.jpg is stored in the JPEG format, and we will see later that we can load an image of this format into Matlab.
5.4 WORKING FORMATS IN MATLAB
If an image is stored as a JPEG image on your disc, we first read it into Matlab. However, in order to start working with the image, for example to perform a wavelet transform on it, we must convert it into a different format. This section explains four common formats.
5.4.1 INTENSITY IMAGE
This is the equivalent of a "grey-scale image" and this is the image we will mostly work with in this course. It represents an image as a matrix where every element has a value corresponding to how bright/dark the pixel at the corresponding position should be coloured. There are two ways to represent the number that represents the brightness of the pixel. The first is the double class
(or data type), which assigns a floating-point number ("a number with decimals") between 0 and 1 to each pixel. The value 0 corresponds to black and the value 1 to white. The other class is called uint8, which assigns an integer between 0 and 255 to represent the brightness of a pixel; here 0 corresponds to black and 255 to white. The class uint8 requires only roughly 1/8 of the storage of the class double. On the other hand, many mathematical functions can only be applied to the double class. We will see later how to convert between double and uint8.
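A small sketch of the two classes side by side (im2uint8 is an Image Processing Toolbox function; the sample values are chosen for illustration):

```matlab
d = [0 0.5 1];     % double class: 0 = black, 1 = white
u = im2uint8(d)    % uint8 class: u = [0 128 255], 0 = black, 255 = white
whos d u           % d uses 8 bytes per element, u uses 1 byte
```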
5.4.5 MULTIFRAME IMAGE
In some applications we want to study a sequence of images. This is very common in
biological and medical imaging where you might study a sequence of slices of a cell. For these
cases, the multiframe format is a convenient way of working with a sequence of images. In case
you choose to work with biological imaging later on in this course, you may use this format.
The command mat2gray is useful if you have a matrix representing an image whose grey-scale values range between, let's say, 0 and 1000. The command mat2gray automatically rescales all entries so that they fall between 0 and 1 (the result is of class double).
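For instance, with made-up values in the range 0 to 1000:

```matlab
M = [0 250 500;
     750 900 1000];   % matrix with grey values between 0 and 1000
I = mat2gray(M)       % rescaled to [0,1]: I = [0 0.25 0.5; 0.75 0.9 1]
```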
When you store an image, you should store it as a uint8 image, since this requires far less memory than double. When you are processing an image (that is, performing mathematical operations on it) you should convert it into a double. Converting back and forth between these classes is easy:
I = im2double(I);
I = im2uint8(I);
When you encounter an image you want to work with, it is usually in the form of a file (for example, if you download an image from the web, it is usually stored as a JPEG file). Once we are done processing an image, we may want to write it back to a JPEG file so that we can, for example, post the processed image on the web. This is done using the imread and imwrite commands. These commands require the Image Processing Toolbox!
Reading and writing image files
imread( ): reads an image. Within the parentheses you type the name of the image file you wish to read, within single quotes ' '.
imwrite( , ): writes an image to a file. As the first argument within the parentheses you type the name of the image you have worked with; as the second argument you type the name of the file and the format that you want to write the image to, within single quotes ' '.
Make sure to use a semicolon ; after these commands, otherwise you will get LOTS of numbers scrolling on your screen. The commands imread and imwrite support the formats given in the section "Image formats supported by Matlab" above.
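A typical read-process-write round trip might look as follows (myimage.jpg and result.jpg are placeholder file names):

```matlab
I = imread('myimage.jpg');    % read the JPEG file into a uint8 array
I = im2double(I);             % convert to double for processing
J = sqrt(I);                  % some processing: a simple brightening
imwrite(im2uint8(J), 'result.jpg');   % write the result back as a JPEG
```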
This section explains how to load and save variables in Matlab. Once you have read a file, you will probably convert it into an intensity image (a matrix) and work with this matrix. Once you are done, you may want to save the matrix representing the image in order to continue working with it at another time. This is easily done using the commands save and load. Note that save and load are commonly used Matlab commands that work independently of which toolboxes are installed.
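For example (mymatrix is a file name chosen for the illustration):

```matlab
I = rand(64);      % stands in for an intensity-image matrix
save mymatrix I    % writes I to the binary file mymatrix.mat
clear I            % remove I from the workspace
load mymatrix      % restores the variable I from mymatrix.mat
```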
Table 5.3: Loading and saving variables
save filename var: saves the variable var to the file filename.mat.
load filename: loads the variables stored in filename.mat into the workspace.
CHAPTER 6
Face recognition systems have been attracting high attention both from the commercial market and from the pattern recognition field, and face recognition has received substantial attention from researchers in biometrics, pattern recognition and computer vision. A face recognition system extracts the features of a face and compares them with an existing database; the faces considered here for comparison are still faces. Machine recognition of faces from still and video images is emerging as an active research area. The present work is based on still or video images captured either by a digital camera or by a webcam. The face recognition system detects only the faces in the image scene and extracts their descriptive features. It then compares them with the database of faces, which is a collection of faces in different poses. The present system is trained with a database where the images are taken in different poses, with glasses, and with and without a beard.
One of the simplest and most effective PCA approaches used in face recognition systems is the so-called eigenface approach. This approach transforms faces into a small set of essential characteristics, the eigenfaces, which are the principal components of the initial set of training images. Recognition is done by projecting a new image into the eigenface subspace, after which the person is classified by comparing its position in eigenface space with the positions of known individuals. The advantage of this approach over other face recognition systems lies in its simplicity, speed and insensitivity to small or gradual changes in the face. The limitation lies in the images that can be used to recognize a face: they must be vertical frontal views of human faces.
The whole recognition process involves two steps:
A. Initialization process
B. Recognition process
The initialization process involves the following operations:
i. Acquire the initial set of face images, called the training set.
ii. Calculate the eigenfaces from the training set, keeping only the M images corresponding to the highest eigenvalues. These M images define the face space. As new faces are encountered, the eigenfaces can be updated or recalculated.
iii. Calculate the distribution in this M-dimensional space for each known person by projecting his or her face images onto the face space. These operations can be performed from time to time whenever there is free excess operational capacity, and the data can be cached and used in the later steps, eliminating the overhead of re-initializing, decreasing execution time and thereby increasing the performance of the entire system.
Having initialized the system, the recognition process involves the following steps:
i. Calculate a set of weights based on the input image and the M eigenfaces by projecting the input image onto each of the eigenfaces.
ii. Determine whether the image is a face at all (known or unknown) by checking whether the image is sufficiently close to the face space.
iii. If it is a face, classify the weight pattern as either a known person or unknown.
iv. Update the eigenfaces or weights; if the same unknown face is seen several times, calculate its characteristic weight pattern and incorporate it into the known faces. This last step is not usually a requirement of every system and is therefore optional, to be implemented when there is a requirement.
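The initialization and recognition steps above can be sketched in MATLAB. This is a minimal illustration, not the project's full implementation; the variable names (X for the training matrix, t for the test image) and the choice M = 10 are assumptions made for the example:

```matlab
% --- Initialization ---
% X: d-by-N matrix whose N columns are vectorized training face images.
X   = double(X);
psi = mean(X, 2);                  % the mean face
A   = X - psi;                     % mean-centred training faces
% Eigenvectors of the small N-by-N matrix A'*A instead of the huge
% d-by-d covariance matrix A*A' (they share the nonzero eigenvalues):
[V, D]     = eig(A' * A);
[~, order] = sort(diag(D), 'descend');
M = 10;                            % keep the M largest eigenvalues
U = A * V(:, order(1:M));          % eigenfaces, one per column
U = U ./ vecnorm(U);               % normalise each eigenface
W = U' * A;                        % M-by-N weights of the training set

% --- Recognition ---
% t: a vectorized test image, d-by-1.
w         = U' * (double(t) - psi);   % project onto the face space
dist      = vecnorm(W - w);           % Euclidean distance to each face
[dmin, k] = min(dist);                % k = index of the best match
```

Note that vecnorm requires a relatively recent MATLAB release; sqrt(sum(.^2)) is the equivalent column-wise norm otherwise. Classification then thresholds dmin to decide between a known person and an unknown face.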
6.3 MODULES
USER
1. Login
2. Set Dataset path
3. Set Database path
4. Comparison process (PCA based Algorithm)
5. Show results.
6.5 SYSTEM REQUIREMENTS
CHAPTER 7
System Design
7.1.2 CLASS DIAGRAM
7.1.4 ACTIVITY DIAGRAM
7.1.5 DATA FLOW DIAGRAM
7.1.6 ER DIAGRAM
CHAPTER 8
Implementation
Main
function varargout = Main_Face(varargin)
% MAIN_FACE MATLAB code for Main_Face.fig
% MAIN_FACE, by itself, creates a new MAIN_FACE or raises the existing
% singleton*.
%
% H = MAIN_FACE returns the handle to a new MAIN_FACE or the handle to
% the existing singleton*.
%
% MAIN_FACE('CALLBACK',hObject,eventData,handles,...) calls the local
% function named CALLBACK in MAIN_FACE.M with the given input arguments.
%
% MAIN_FACE('Property','Value',...) creates a new MAIN_FACE or raises the
% existing singleton*. Starting from the left, property value pairs are
% applied to the GUI before Main_Face_OpeningFcn gets called. An
% unrecognized property name or invalid value makes property application
% stop. All inputs are passed to Main_Face_OpeningFcn via varargin.
%
% *See GUI Options on GUIDE's Tools menu. Choose "GUI allows only one
% instance to run (singleton)".
%
% See also: GUIDE, GUIDATA, GUIHANDLES
% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name', mfilename, ...
    'gui_Singleton', gui_Singleton, ...
    'gui_OpeningFcn', @Main_Face_OpeningFcn, ...
    'gui_OutputFcn', @Main_Face_OutputFcn, ...
    'gui_LayoutFcn', [], ...
    'gui_Callback', []);
if nargin && ischar(varargin{1})
    gui_State.gui_Callback = str2func(varargin{1});
end
if nargout
    [varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
    gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
% --- Outputs from this function are returned to the command line.
function varargout = Main_Face_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
varargout{1} = handles.output;
prompt = {'Enter test image name (a number between 1 to 5):'};
dlg_title = 'Input of PCA-Based Face Recognition System';
num_lines= 1;
def = {' '};
TestImage = inputdlg(prompt,dlg_title,num_lines,def);
TestImage = strcat(testpath,'\',char(TestImage),'.jpg');
recog_img = facerecog(datapath,TestImage);
selected_img = strcat(datapath,'\',recog_img);
test_img = imread(TestImage);
axes(handles.axes1);
imshow(test_img);
title('Test Image');
select_img = imread(selected_img);
axes(handles.axes2)
imshow(select_img); title('Recognized Image');
INP1 = im2bw(test_img);
INPUT1 = im2bw(select_img);
S = corr2(INPUT1, INP1)   % 2-D correlation coefficient between the binarized images, echoed for inspection
if S>0.5
uiwait(msgbox('Matched'));
i = sscanf(recog_img, '%d');   % recover the matched image index from the recognized file name
if i ==1 || i==16 ||i==31
name='Rajesh Mishra';
Add='No:12 Anna Nagar,Chennai';
elseif i ==2 || i==17||i==32
name='Alok Nath';
Add='No:1/24 Anna Salai,Vellore';
elseif i ==3 || i==18 ||i==33
name='Brijesh Kumar';
Add='No: 32/10, Abdul Aziz Street,Chennai';
elseif i ==4 || i==19 ||i==34
name='Anurag Kesawar';
Add='No.41, Rangaiyya Street,Ayanavaram, Chennai';
elseif i ==5 || i==20 ||i==35
name='Shambhu Nath';
Add='Plot No. 160,1" Cross Street, Vellore';
elseif i ==6 || i==21 ||i==36
name='Ritesh Kumar';
Add='No:1/4 KK Nagar, Chennai';
elseif i ==7 || i==22 ||i==37
name='D K Mishra';
Add='F2 2/2, E.V.R Periyar Nagar,Chennai';
elseif i ==8 || i==23 ||i==38
name='Shivakanth Pathak';
Add='No. 6/50G, Shanti Path, Chanakyapuri, New Delhi';
elseif i ==9 || i==24 ||i==39
name='O P Tiwari';
Add='No:14/1 Shantipath, Chanakyapuri,New Delhi';
elseif i ==10 || i==25 ||i==40
name='Bijoy Jha';
Add='24, Kasturba Gandhi Marg,New Delhi';
elseif i ==11 || i==26 ||i==41
name='P K Mishra';
Add='38/A, Jawahar Lal Nehru Road,Kolkata';
elseif i ==12 || i==27 ||i==42
name='Dojer Kera';
Add='C-49, G-Block, Bandra East, Mumbai ';
elseif i ==13 || i==28 ||i==43
name='Amit Kumar';
Add='M-26/6,Ashok Nagar, Chennai';
elseif i ==14 || i==29 ||i==44
name='Andrew Rendou';
Add='1st Avenue,Ambattur, Chennai';
elseif i ==15 || i==30 ||i==45
name='Tigmanshu Mishra';
Add='No:24/53 Manimegalai Street, Madipakkam, Chennai';
end
na=name;
ad=Add;
s1=strcat('Name : ',name);
s2=strcat('Address : ',Add);
s={s1;s2};
set(handles.edit2,'String',s);
else
uiwait(msgbox('Not matched'));
end
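The match check above relies on MATLAB's corr2, the 2-D correlation coefficient of the two binarized images. For reference, a NumPy equivalent (the function name is mine) looks like this:

```python
import numpy as np

def corr2(a, b):
    """2-D correlation coefficient of two equally sized arrays, as MATLAB's corr2 computes it."""
    a = a.astype(float) - a.mean()
    b = b.astype(float) - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom)
```

The script accepts the pair as a match when this coefficient exceeds 0.5; identical images score 1 and inverted images score -1.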
global img
n = 2;
interval = 3;
outputFolder = fullfile('C:\Users\Matlab\Desktop\Face recognition real time\Dataset');
photosave = inputdlg(' Do you want to save the files (y/n): ', 'Registration Form');
photosave = photosave{1};   % keep the answer as text; str2num would turn 'y'/'n' into []
if ~exist(outputFolder, 'dir')
    mkdir(outputFolder);    % create the output folder on first use
end
obj = videoinput('winvideo', 1);
set(obj, 'ReturnedColorSpace', 'RGB');
disp('First shot will be taken after 1 second');
pause(1);
for i = 1:n
    preview(obj)
    img = getsnapshot(obj);
    closepreview(obj);
    if strcmpi(photosave, 'y')
        outputBaseFileName = sprintf('%d.jpg', i);
        outputFullFileName = fullfile(outputFolder, outputBaseFileName);
        imwrite(img, outputFullFileName, 'jpg');
    end
    pause(interval);
end
n = 5;
for i = 1:n
    outputBaseFileName = sprintf('%d.jpg', i);
    outputFullFileName = fullfile(outputFolder, outputBaseFileName);
    imwrite(img, outputFullFileName, 'jpg');
    axes(handles.axes1);
    imshow(img), title('Login Image')
end
% handles structure with handles and user data (see GUIDATA)
global img1
n = 2;
interval = 3;
outputFolder = fullfile('C:\Users\Matlab\Desktop\Face recognition real time\Database');
photosave = inputdlg(' Do you want to save the files (y/n): ', 'Registration Form');
photosave = photosave{1};   % keep the answer as text; str2num would turn 'y'/'n' into []
if ~exist(outputFolder, 'dir')
    mkdir(outputFolder);    % create the output folder on first use
end
obj = videoinput('winvideo', 1);
set(obj, 'ReturnedColorSpace', 'RGB');
disp('First shot will be taken after 1 second');
pause(1);
for i = 1:n
    preview(obj)
    img1 = getsnapshot(obj);
    closepreview(obj);
    if strcmpi(photosave, 'y')
        outputBaseFileName = sprintf('%d.jpg', i);
        outputFullFileName = fullfile(outputFolder, outputBaseFileName);
        imwrite(img1, outputFullFileName, 'jpg');
    end
    pause(interval);
end
n = 5;
for i = 1:n
    outputBaseFileName = sprintf('%d.jpg', i);
    outputFullFileName = fullfile(outputFolder, outputBaseFileName);
    imwrite(img1, outputFullFileName, 'jpg');
    axes(handles.axes1);
    imshow(img1), title('Registered Image')
end
facerecog:
function [recognized_img]=facerecog(datapath,testimg)
% In this part of function, we align a set of face images (the training set x1, x2, ... , xM )
%
% This means we reshape all 2D images of the training database
% into 1D column vectors. Then, it puts these 1D column vectors in a row to
% construct 2D matrix 'X'.
%
%
% datapath - path of the data images used for training
% X - A 2D matrix, containing all 1D image vectors.
% Suppose all P images in the training database
% have the same size of MxN. So the length of 1D
% column vectors is MxN and 'X' will be a (MxN)xP 2D matrix.
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%%%% finding number of training images in the data path specified as argument %%%%%%%%%%
filenames = dir(fullfile(datapath, '*.jpg'));
imgcount = numel(filenames);   %% number of .jpg training images in datapath
X = [];
for i = 1 : imgcount
    str = strcat(datapath,'\',int2str(i),'.jpg');
    img = imread(str);
    img = rgb2gray(img);
    [r, c] = size(img);
    temp = reshape(img', r*c, 1);   %% reshaping each 2D image into a 1D column vector
    %% img' is used because reshape(A,M,N) reads the matrix A column-wise,
    %% whereas an image matrix stores its first N pixels as the first row,
    %% the next N as the second row, and so on
    X = [X temp];   %% X, the image matrix, gains one column per image
end
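The transpose-then-reshape idiom above exists because MATLAB's reshape reads column-wise. As a side-by-side illustration with a tiny hypothetical 2x3 example, NumPy's default (row-major) ravel produces the same row-by-row column vectors directly:

```python
import numpy as np

# MATLAB's reshape(img', r*c, 1) reads the transposed image column-wise,
# which is the same as reading the original image row by row.
# NumPy's default C-order ravel already reads row by row.
imgs = [np.arange(6).reshape(2, 3), np.arange(6, 12).reshape(2, 3)]
X = np.column_stack([im.ravel() for im in imgs])  # (M*N) x P matrix, one column per image
```

Here X has shape (6, 2): each column is one flattened 2x3 image, matching the (MxN)xP layout the listing describes.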
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
% Now we calculate m, A and eigenfaces. The descriptions are below:
%
% m          - (MxN)x1  mean of the training images
% A          - (MxN)xP  matrix of image vectors after subtracting the mean vector m from each
% eigenfaces - (MxN)xP' the P' eigenvectors of the covariance matrix (C) of the training database X,
%              where P' is the number of eigenvalues of C that best represent the feature set
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
m = mean(double(X), 2);   %% mean image vector
%%%%%%%% calculating the A matrix, i.e. subtracting the mean image vector from every image vector %%%%%%
A = [];
for i = 1 : imgcount
    temp = double(X(:,i)) - m;
    A = [A temp];
end
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%% CALCULATION OF EIGENFACES %%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%% We know that for an MxN matrix, the maximum number of non-zero eigenvalues its covariance
%%% matrix can have is min[M-1, N-1]. As the number of dimensions (pixels) of each image vector
%%% is very high compared to the number of training images here, the number of non-zero
%%% eigenvalues of C will be at most P-1 (P being the number of training images).
%%% Calculating the eigenvalues and eigenvectors of C = A*A' directly would be very costly in
%%% both time and memory, so we instead calculate the eigenvalues and eigenvectors of
%%% L = A'*A, whose eigenvectors are linearly related to the eigenvectors of C. Being derived
%%% from the non-zero eigenvalues of C, these eigenvectors represent the best feature sets.
L = A' * A;
[V, D] = eig(L);   %% V : eigenvector matrix,  D : eigenvalue matrix
%%%% Kaiser's rule decides how many principal components (eigenvectors) to keep:
%%%% an eigenvector is chosen for creating an eigenface only if its eigenvalue is greater than 1
L_eig_vec = [];
for i = 1 : size(V,2)
    if D(i,i) > 1
        L_eig_vec = [L_eig_vec V(:,i)];
    end
end
eigenfaces = A * L_eig_vec;   %% lift the selected eigenvectors of L to eigenfaces of C
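The L = A'*A trick used here can be verified independently: if L v = lambda v, then C (A v) = A (A'A) v = lambda (A v), so every eigenvector v of the small PxP matrix lifts to an eigenvector A v of the full covariance matrix. A NumPy sketch of that argument, with Kaiser's rule deciding which components survive (the function name is mine):

```python
import numpy as np

def eigenfaces_via_small_matrix(A):
    """Eigenfaces from L = A.T @ A instead of the huge C = A @ A.T.

    Each eigenvector v of the small P x P matrix L lifts to an eigenvector
    A @ v of C with the same eigenvalue.  Kaiser's rule keeps only the
    eigenvectors whose eigenvalue exceeds 1.
    """
    L = A.T @ A                    # P x P instead of (M*N) x (M*N)
    lam, V = np.linalg.eigh(L)     # symmetric eigendecomposition
    keep = lam > 1.0               # Kaiser's rule
    return A @ V[:, keep]          # lift the kept eigenvectors to eigenfaces
```

Each returned column u satisfies C u = lambda u for the full covariance matrix C = A A', which is why working with the small matrix loses nothing.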
%In this part of recognition, we compare two faces by projecting the images into facespace and
% measuring the Euclidean distance between them.
%
% recogimg - the recognized image name
% testimg - the path of test image
% m - mean image vector
% A - mean subtracted image vector matrix
% eigenfaces - eigenfaces that are calculated from eigenface function
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%
%%%%%%% finding the projection of each image vector on the facespace (where the eigenfaces are the co-ordinates or dimensions) %%%%%
projectimg = eigenfaces' * A;   %% projections of all training images onto the facespace
test_image = imread(testimg);
test_image = test_image(:,:,1);
[r, c] = size(test_image);
temp = reshape(test_image', r*c, 1);   % creating an (MxN)x1 image vector from the 2D image
temp = double(temp) - m;               % mean-subtracted vector
projtestimg = eigenfaces' * temp;      % projection of the test image onto the facespace
%%%%% calculating & comparing the Euclidean distance of each projected training image from the projected test image %%%%%
euclide_dist = [];
for i = 1 : size(projectimg,2)         % one distance per training image
    temp = (norm(projtestimg - projectimg(:,i)))^2;
    euclide_dist = [euclide_dist temp];
end
[euclide_dist_min, recognized_index] = min(euclide_dist);
recognized_img = strcat(int2str(recognized_index), '.jpg');
attendance:
% Begin initialization code - DO NOT EDIT
gui_Singleton = 1;
gui_State = struct('gui_Name', mfilename, ...
'gui_Singleton', gui_Singleton, ...
'gui_OpeningFcn', @facerecog_OpeningFcn, ...
'gui_OutputFcn', @facerecog_OutputFcn, ...
'gui_LayoutFcn', [] , ...
'gui_Callback', []);
if nargin && ischar(varargin{1})
gui_State.gui_Callback = str2func(varargin{1});
end
if nargout
[varargout{1:nargout}] = gui_mainfcn(gui_State, varargin{:});
else
gui_mainfcn(gui_State, varargin{:});
end
% End initialization code - DO NOT EDIT
% --- Outputs from this function are returned to the command line.
function varargout = facerecog_OutputFcn(hObject, eventdata, handles)
% varargout cell array for returning output args (see VARARGOUT);
% hObject handle to figure
% eventdata reserved - to be defined in a future version of MATLAB
% handles structure with handles and user data (see GUIDATA)
varargout{1} = handles.output;
for i = 1:75
    a(:,:,i) = imresize(rgb2gray(imread(strcat('C:\Users\Matlab\Desktop\Source code\TestData\', num2str(i), '.jpg'))), [200,200]);
end
% imshow(uint8(a(:,:,8)));
%data read
%eigen face vectors
b=zeros(40000,1,75);
for j=1:75
c=a(:,:,j)';
d=c(:);
b(:,1,j)=d;
waitbar(j/300);
end
% avg face vector
m=mean(b(:,1,:),3);
% normalized vectors
for i=1:75
n(:,1,i)=b(:,1,i)-m;
end
c=zeros(40000,75);
for i=1:75
c(:,i)=n(:,1,i);
waitbar((75/300)+(i/300));
end
comat=c'*c;
%size(comat)
%eigen faces
for i=1:75
u(:,i)=c(:,:)*comat(:,i);
waitbar(0.5+(i/300));
end
for i=1:75
w(:,i)=inv(u'*u)*(u'*n(:,1,i));
waitbar(0.75+(i/300));
end
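The expression inv(u'*u)*(u'*n(:,1,i)) above solves the normal equations for the least-squares weights of a face vector in the (not necessarily orthonormal) basis u. The same computation in NumPy, expressed with a numerically safer least-squares solver instead of an explicit inverse (the function name is mine):

```python
import numpy as np

def basis_weights(u, x):
    """Least-squares weights w with u @ w ~= x, i.e. w = inv(u'u) @ u' @ x."""
    w, *_ = np.linalg.lstsq(u, x, rcond=None)   # solves the normal equations stably
    return w
```

When x lies exactly in the span of u's columns, the recovered weights reproduce x exactly, which is the property the distance comparison that follows depends on.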
newim=handles.data';
newimag=newim(:);
w(:,76)=inv(u'*u)*(u'*(double(newimag)-m));
%distance computations
d=zeros(75,1);
% v=zeros(13,1);
for i=1:75
v=w(:,76)-w(:,i);
d(i)=sqrt(v'*v);
end
[a,index]=sort(d);
axes(handles.axes1);
imshow(imresize(rgb2gray(imread(strcat('C:\Users\Matlab\Desktop\Source code\Database\', num2str(index(1)), '.jpg'))), [200,200]));
i=index(1);
if i ==1 || i==2 ||i==31 ||i==32 || i==33
name='Rajesh Mishra';
Add='No:12 Anna Nagar,Chennai';
elseif i ==3 || i==4 ||i==34 ||i==35 || i==36
name='Alok Nath';
Add='No:1/24 Anna Salai,Vellore';
elseif i ==5 || i==6 ||i==37 ||i==38 || i==39
name='Brijesh Kumar';
Add='No: 32/10, Abdul Aziz Street,Chennai';
elseif i ==7 || i==8 ||i==40 ||i==41 || i==42
name='Anurag Kesawar';
Add='No.41, Rangaiyya Street,Ayanavaram, Chennai';
elseif i ==9 || i==10 ||i==43 ||i==44 || i==45
name='Shambhu Nath';
Add='Plot No. 160,1" Cross Street, Vellore';
elseif i ==11 || i==12 ||i==46 ||i==47 || i==48
name='Ritesh Kumar';
Add='No:1/4 KK Nagar, Chennai';
elseif i ==13 || i==14 ||i==49 ||i==50 || i==51
name='D K Mishra';
Add='F2 2/2, E.V.R Periyar Nagar,Chennai';
elseif i ==15 || i==16 ||i==52 ||i==53 || i==54
name='Shivakanth Pathak';
Add='No. 6/50G, Shanti Path, Chanakyapuri, New Delhi';
elseif i ==17 || i==18 ||i==55 ||i==56 || i==57
name='O P Tiwari';
Add='No:14/1 Shantipath, Chanakyapuri,New Delhi';
elseif i ==19 || i==20 ||i==58 ||i==59 || i==60
name='Bijoy Jha';
Add='24, Kasturba Gandhi Marg,New Delhi';
elseif i ==21 || i==22 ||i==61 ||i==62 || i==63
name='P K Mishra';
Add='38/A, Jawahar Lal Nehru Road,Kolkata';
elseif i ==23 || i==24 ||i==64 ||i==65 || i==66
name='Dojer Kera';
Add='C-49, G-Block, Bandra East, Mumbai ';
elseif i ==25 || i==26 ||i==67 ||i==68 || i==69
name='Amit Kumar';
Add='M-26/6,Ashok Nagar, Chennai';
elseif i ==27 || i==28 ||i==70 ||i==71 || i==72
name='Andrew Rendou';
Add='1st Avenue,Ambattur, Chennai';
elseif i ==29 || i==30 ||i==73 ||i==74 || i==75
name='Tigmanshu Mishra';
Add='No:24/53 Manimegalai Street, Madipakkam, Chennai';
end
% tym=datestr(now,'HH:MM:SS');
na=name;
ad=Add;
dt=date;
s1=strcat('Name : ',name);
s2=strcat('Address : ',Add);
s={s1;s2};
set(handles.text2,'String',s);
filename = 'Studentdata.xlsx';
N = na; A = ad; Att = 'Present'; dd = dt;
fileExist = exist(filename, 'file');
if fileExist == 0
    header = {'Name', 'Address', 'Attendance', 'Date'};
    xlswrite(filename, [header; {N, A, Att, dd}]);   % write the header together with the first record
else
    [~,~,input] = xlsread(filename);   % read the existing sheet into a cell array
    new_data = {N, A, Att, dd};        % cell array holding the new record
    output = cat(1, input, new_data);  % concatenate the new record below the existing rows
    xlswrite(filename, output);        % write the updated sheet back
end
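The spreadsheet logic above writes a header on the first run and appends one row per recognized student thereafter. The same append-or-create pattern, sketched in Python with a plain CSV file instead of xlswrite (the function name and file handling are illustrative, not part of the report):

```python
import csv
import os
from datetime import date

def mark_attendance(filename, name, address):
    """Append one attendance record, writing the header only when the file is new."""
    new_file = not os.path.exists(filename)
    with open(filename, 'a', newline='') as f:
        w = csv.writer(f)
        if new_file:
            w.writerow(['Name', 'Address', 'Attendance', 'Date'])  # header on first run
        w.writerow([name, address, 'Present', date.today().isoformat()])
```

Appending rather than rewriting the whole sheet also avoids the read-concatenate-rewrite cycle the xlsread/xlswrite pair requires.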
8.2 SCREEN SHOT
8.2.1 Registration
8.2.2 Login
8.2.3 Matching
8.2.4 Store Student Details
CHAPTER 9
Testing Strategy
The purpose of testing is to discover errors. Testing is the process of trying to discover
every conceivable fault or weakness in a work product. It provides a way to check the functionality
of components, sub-assemblies, assemblies, and the finished product. It is the process of exercising
software with the intent of ensuring that the software system meets its requirements and user
expectations and does not fail in an unacceptable manner. There are various types of tests, and
each type addresses a specific testing requirement.
9.2.2 Integration testing
Integration tests are designed to test integrated software components to determine whether they
actually run as one program. Testing is event driven and is more concerned with the basic outcome
of screens or fields. Integration tests demonstrate that although the components were individually
satisfactory, as shown by successful unit testing, their combination is correct and consistent.
Integration testing is specifically aimed at exposing the problems that arise from the
combination of components.
Business process flows, data fields, predefined processes, and successive processes must
all be considered for testing. Before functional testing is complete, additional tests are
identified and the effective value of the current tests is determined.
System testing is based on process descriptions and flows, emphasizing pre-driven process links
and integration points; an example is the configuration-oriented system integration test.
Unit testing is usually conducted as part of a combined code-and-unit-test phase of the
software lifecycle, although it is not uncommon for coding and unit testing to be conducted as
two distinct phases.
Test objectives
All field entries must work properly.
Pages must be activated from the identified link.
The entry screen, messages and responses must not be delayed.
Features to be tested
Verify that the entries are of the correct format
No duplicate entries should be allowed
All links should take the user to the correct page.
Acceptance Testing
User Acceptance Testing is a critical phase of any project and requires significant
participation by the end user. It also ensures that the system meets the functional requirements.
Test Results: All the test cases mentioned above passed successfully. No defects were encountered.
CHAPTER 10
Conclusion
Experimental results have shown that the proposed face recognition method is very sensitive
to face background and head orientation. Changes in illumination did not cause a major problem
for the system, and the presence of small details such as dark glasses or masks was far from
being a real challenge to it. There is a trade-off between the correct recognition rate and the
threshold value: as the threshold value increases, the number of misses begins to decrease,
possibly at the cost of more misclassifications. Conversely, as the number of eigenfaces involved
in the recognition process increases, the misclassification rate begins to decrease, possibly at
the cost of more misses. The eigenface method is very sensitive to head orientation, and most
of the mismatches occur for images with large head orientations.
The current recognition system has been designed for frontal views of face images. A
neural network architecture (possibly combined with a feature-based approach) could be
implemented in which the orientation of the face is first determined and the most suitable
recognition method is then selected. The current system also acquires face images only from
files stored on magnetic media; camera and scanner support should be implemented for greater
flexibility.
REFERENCES
[1] M. A. Turk and A. P. Pentland, “Face Recognition Using Eigenfaces,” in Proc. IEEE
Conference on Computer Vision and Pattern Recognition, pp. 586-591, 1991.
[2] Nirmalya Kar, Mrinal Kanti Debbarma, Ashim Saha, and Dwijen Rudra Pal, “Study of
Implementing Automated Attendance System Using Face Recognition Technique,” International
Journal of Computer and Communication Engineering, Vol. 1, No. 2, July 2012.
[3] A. J. Goldstein, L. D. Harmon, and A. B. Lesk, “Identification of Human Faces,” Proc. IEEE
59, pp. 748-760, 1971.
[4] M. N. Shah Zainudin, H. R. Radi, S. Muniroh Abdullah, Rosman Abd. Rahim, and M. Muzafar
Ismail, “Face Recognition using PCA and LDA,” International Journal of Electrical & Computer
Sciences.
[5] Yohei Kawaguchi, Tetsuo Shoji, Weijane Lin, and Koh Kakusho, “Face Recognition-based
Lecture Attendance System,” Department of Intelligence Science and Technology, Graduate
School of Informatics, Kyoto University.