Seminar Report On
NEURAL NETWORK
Submitted By
Suraj Maurya - 111P004
Sanjeev Vishawakarma - 111P019
Sandeep Warang - 111P006
Affiliated to
University of Mumbai
Rizvi College of Engineering
Department of Computer Engineering
New Rizvi Educational Complex, Off-Carter Road,
Bandra(w), Mumbai - 400050
CERTIFICATE
This is to certify that
Suraj Maurya
Sanjeev Vishwakarma
Sandeep Warang
of Third Year Computer Engineering have completed the seminar work entitled Seminar Topic Title under my supervision at Rizvi College of Engineering, Mumbai, under the University of Mumbai.
Date:
Acknowledgements
I am profoundly grateful to Prof. Dinesh B. Deore for his expert guidance and continuous encouragement, which ensured that this report reached its target from its commencement to its completion.
I would like to express my deepest appreciation towards Dr. Varsha Shah, Principal, RCOE, Mumbai, and Prof. Dinesh B. Deore, HOD, Computer Department, whose invaluable guidance supported me in completing this report.
Finally, I must express my sincere heartfelt gratitude to all the staff members of the Computer Engineering Department who helped me directly or indirectly during this course of work.
Suraj Maurya
Sanjeev Vishwakarma
Sandeep Warang
ABSTRACT
This report presents the emergence of the Artificial Neural Network (ANN) as a tool for analysing the parameters of a system. An ANN is an information-processing paradigm inspired by the way biological nervous systems, such as the brain, process information. An ANN consists of multiple layers of simple processing elements called neurons. Each neuron performs two functions: collection of inputs and generation of an output. The report provides an overview of the theory, learning rules, and applications of the most important neural network models, along with definitions and styles of computation. The mathematical model of a network throws light on the concepts of inputs, weights, summing function, activation function, and outputs. The ANN then helps decide the type of learning used to adjust the weights as parameters change. Finally, the analysis of a system is completed by ANN implementation, ANN training, and assessment of prediction quality.
1 Introduction
1.1 Evolution of Data Mining
1.2 Data mining functionality
1.3 Data mining techniques
2 Neural Network
2.1 Introduction
2.2 Neuron in brain
2.3 Characteristics of ANN
2.4 Activation Function
3 Perceptron
3.1 Introduction
3.2 Perceptron Learning
3.3 Classes of learning algorithms
3.4 Learning algorithms for neural networks
3.4.1 Supervised Learning
3.4.2 Unsupervised Learning
3.4.3 Reinforcement
3.5 Perceptron Training
5 Back-propagation
5.1 Introduction
5.2 Back-propagation network
5.2.1 ANN Development & Implementation
5.2.2 ANN Training & Prediction quality
References
APPENDICES
A Project Hosting
List of Figures
Chapter 1
Introduction
Data mining is the semi-automatic discovery of patterns, associations, changes, anomalies, and statistically significant structures and events in data. Traditional data analysis is assumption-driven, in the sense that a hypothesis is formed and validated against the data. Data mining, in contrast, is data-driven, in the sense that patterns are automatically extracted from data. The goal of this tutorial is to provide an introduction to data mining techniques. The focus will be on methods appropriate for mining massive datasets using techniques from scalable and high-performance computing. The techniques covered include association rules, sequence mining, decision tree classification, and clustering. Some aspects of pre-processing and post-processing are also covered. The problem of predicting contact maps for protein sequences is used as a detailed case study.
Data mining is the process of identifying patterns and establishing relationships. Data mining is defined as the nontrivial extraction of implicit, previously unknown, and potentially useful information from data. It is the process of analysing large amounts of data stored in a data warehouse for useful information, making use of artificial intelligence techniques, neural networks, and advanced statistical tools (such as cluster analysis) to reveal trends, patterns, and relationships which might otherwise remain undetected. [3]
Chapter 2
Neural Network
2.1 Introduction
The first wave of interest in neural networks (also known as connectionist models or parallel distributed processing) emerged after the introduction of simplified neurons by McCulloch and Pitts in 1943 (McCulloch & Pitts, 1943). These neurons were presented as models of biological neurons and as conceptual components for circuits that could perform computational tasks. When Minsky and Papert published their book Perceptrons in 1969 (Minsky and Papert, 1969), in which they showed the deficiencies of perceptron models, most neural network funding was redirected and researchers left the field. Only a few researchers continued their efforts, most notably Teuvo Kohonen, Stephen Grossberg, James Anderson, and Kunihiko Fukushima. Interest in neural networks re-emerged only after some important theoretical results were attained in the early eighties (most notably the discovery of error back-propagation) and new hardware developments increased processing capacities. This renewed interest is reflected in the number of scientists, the amounts of funding, the number of large conferences, and the number of journals associated with neural networks. Nowadays most universities have a neural networks group within their psychology, physics, computer science, or biology departments.
Artificial neural networks can be most adequately characterized as computational models with particular properties, such as the ability to adapt or learn, to generalize, or to cluster or organize data, whose operation is based on parallel processing. However, since many of the above-mentioned properties can also be attributed to existing (non-neural) models, the intriguing question is to what extent the neural approach proves better suited for certain applications than existing models. To date, an unequivocal answer to this question has not been found. Parallels with biological systems are often described. However, so little is still known about biological systems (even at the lowest cell level) that the models we use for our artificial neural systems seem to be an oversimplification of the biological models.
3. When the input exceeds a threshold, the neuron sends an electrical spike that travels from the cell body, down the axon, to the next neuron(s).
Brains learn by:
1. Altering the strength of connections between neurons
2. Creating or deleting connections
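The threshold behaviour described above can be sketched as a simple McCulloch-Pitts-style threshold unit; the weights and threshold values below are arbitrary illustrative choices, not taken from the report.

```python
def threshold_neuron(inputs, weights, threshold):
    """Return 1 (spike) if the weighted input sum exceeds the threshold."""
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total > threshold else 0

# Two excitatory inputs with equal weight model a logical AND:
# the unit only fires when both inputs are active.
print(threshold_neuron([1, 1], [1, 1], 1.5))  # 1 (fires)
print(threshold_neuron([1, 0], [1, 1], 1.5))  # 0 (stays silent)
```

Changing the weights changes what the unit computes, which mirrors the first learning mechanism above: altering connection strengths.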
A neuron (also known as a neurone or nerve cell) is an electrically excitable cell that processes and transmits information through electrical and chemical signals. These signals between neurons occur via synapses, specialized connections with other cells. Neurons can connect to each other to form neural networks. Neurons are the core components of the nervous system, which includes the brain and spinal cord of the central nervous system (CNS), and the ganglia of the peripheral nervous system (PNS). Specialized types of neurons include: sensory neurons, which respond to touch, sound, light, and all other stimuli affecting the cells of the sensory organs, and then send signals to the spinal cord and brain; motor neurons, which receive signals from the brain and spinal cord, cause muscle contractions, and affect glandular outputs; and interneurons, which connect neurons to other neurons within the same region of the brain or spinal cord in neural networks.
A typical neuron possesses a cell body (soma), dendrites, and an axon. The term neurite is used to describe either a dendrite or an axon, particularly in its undifferentiated stage. Dendrites are thin structures that arise from the cell body, often extending for hundreds of micrometres and branching multiple times, giving rise to a complex dendritic tree. An axon is a special cellular extension that arises from the cell body at a site called the axon hillock and travels for a distance, as far as 1 metre in humans or even more in other species. The cell body of a neuron frequently gives rise to multiple dendrites, but never to more than one axon, although the axon may branch hundreds of times before it terminates. At the majority of synapses, signals are sent from the axon of one neuron to a dendrite of another. There are, however, many exceptions to these rules: neurons that lack dendrites, neurons that have no axon, synapses that connect an axon to another axon or a dendrite to another dendrite, and so on.
All neurons are electrically excitable, maintaining voltage gradients across their membranes by means of metabolically driven ion pumps, which combine with ion channels embedded in the membrane to generate intracellular-versus-extracellular concentration differences of ions such as sodium, potassium, chloride, and calcium. Changes in the cross-membrane voltage can alter the function of voltage-dependent ion channels. If the voltage changes by a large enough amount, an all-or-none electrochemical pulse called an action potential is generated, which travels rapidly along the cell's axon and activates synaptic connections with other cells when it arrives. Neurons do not undergo cell division. In most cases, neurons are generated by special types of stem cells. A type of glial cell called an astrocyte (named for being somewhat star-shaped) has also been observed to turn into neurons by virtue of the stem-cell characteristic of pluripotency. In humans, neurogenesis largely ceases during adulthood, but in two brain areas, the hippocampus and olfactory bulb, there is strong evidence for the generation of substantial numbers of new neurons.
Since the 1960s, database and information technology has been evolving systematically from primitive file processing systems to sophisticated and powerful database systems. Research and development in database systems since the 1970s has led to the development of relational database systems, data modelling tools, and indexing and data organization techniques. In addition, users gained convenient and flexible data access through query languages, query processing, and user interfaces. Efficient methods for on-line transaction processing (OLTP), where a query is viewed as a read-only transaction, have contributed substantially to the evolution and wide acceptance of relational technology as a major tool for efficient storage, retrieval, and management of large amounts of data.
Chapter 3
Perceptron
3.1 Introduction
In machine learning, the perceptron is an algorithm for the supervised classification of an input into one of two possible outputs; that is, it is a binary classifier. It is a type of linear classifier, i.e. a classification algorithm that makes its predictions based on a linear predictor function combining a set of weights with the feature vector. The algorithm allows for online learning, in that it processes elements in the training set one at a time.
The perceptron algorithm was invented in 1957 at the Cornell Aeronautical Laboratory by Frank
Rosenblatt.
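The linear predictor and online weight update can be sketched as follows; the zero threshold, learning rate, and toy AND dataset are illustrative assumptions, not details from the report.

```python
def predict(weights, bias, x):
    """Linear predictor: threshold the weighted feature sum plus bias."""
    s = sum(w * xi for w, xi in zip(weights, x))
    return 1 if s + bias > 0 else 0

def train(samples, n_features, lr=0.1, epochs=20):
    """Classic perceptron rule: nudge weights by the prediction error."""
    w, b = [0.0] * n_features, 0.0
    for _ in range(epochs):
        for x, target in samples:          # online: one example at a time
            err = target - predict(w, b, x)
            w = [wi + lr * err * xi for wi, xi in zip(w, x)]
            b += lr * err
    return w, b

# Learn logical AND, which is linearly separable.
data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
w, b = train(data, 2)
print([predict(w, b, x) for x, _ in data])  # [0, 0, 0, 1]
```

Because AND is linearly separable, the perceptron convergence theorem guarantees this loop finds a separating weight vector in finitely many updates.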
In some simple cases the weights for the computing units can be found through a sequential test of
stochastically generated numerical combinations. However, such algorithms which look blindly for a
solution do not qualify as learning. A learning algorithm must adapt the network parameters according
to previous experience until a solution is found, if it exists.
Supervised learning is further divided into methods which use reinforcement or error correction. Reinforcement learning is used when, after each presentation of an input-output example, we only know whether the network produces the desired result or not. The weights are updated based on this information (that is, the Boolean values true or false), so that only the input vector can be used for weight correction.
The perceptron learning algorithm is an example of supervised learning with reinforcement. Some
of its variants use supervised learning with error correction (corrective learning).
3.4.3 Reinforcement
Reinforcement learning may be considered an intermediate form of the above two types of learning. Here the learning machine performs some action on the environment and gets a feedback response from the environment. The learning system grades its action as good (rewarding) or bad (punishable) based on the environmental response, and adjusts its parameters accordingly. Generally, parameter adjustment is continued until an equilibrium state occurs, after which there will be no more changes in its parameters. Self-organizing neural learning may be categorized under this type of learning.
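The act-grade-adjust loop described above can be sketched with a toy environment; the hidden optimum, step size, and reward rule are purely illustrative assumptions.

```python
import random

random.seed(0)
target = 0.7            # hidden environment optimum the learner must find
param, step = 0.0, 0.1  # single adjustable parameter and exploration range

for _ in range(200):
    trial = param + random.uniform(-step, step)    # act on the environment
    # Environment response: the action is "good" if it moved closer
    # to the optimum, "bad" otherwise.
    if abs(trial - target) < abs(param - target):
        param = trial                              # rewarded -> keep change
    # punished -> the trial adjustment is discarded

print(round(abs(param - target), 3))  # distance to the optimum after learning
```

The parameter settles near the optimum and stops changing, which is the equilibrium state mentioned above.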
Chapter 4
4.1 Definition
Feedforward neural networks (FF networks) are the most popular and most widely used models in many
practical applications. They are known by many different names, such as multi-layer perceptrons.
The network output is formed by another weighted summation of the outputs of the neurons in the
hidden layer. This summation on the output is called the output layer. In Figure 2.5 there is only one
output in the output layer since it is a single-output problem. Generally, the number of output neurons
equals the number of outputs of the approximation problem. The neurons in the hidden layer of the network in Figure 2.5 are similar in structure to those of the perceptron, with the exception that their activation functions can be any differentiable function. The output of this network is given by
yhat = sum_{j=1..nh} w2_j * g( sum_{i=1..n} w1_{ji} * x_i + b1_j ) + b2,
where n is the number of inputs and nh is the number of neurons in the hidden layer.
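The output computation above can be sketched directly; the sizes, tanh activation, and random weights are illustrative assumptions rather than values from the report.

```python
import numpy as np

rng = np.random.default_rng(0)
n, nh = 3, 4                       # n inputs, nh hidden neurons

W1 = rng.standard_normal((nh, n))  # input -> hidden weights (w1_ji)
b1 = rng.standard_normal(nh)       # hidden biases (b1_j)
w2 = rng.standard_normal(nh)       # hidden -> output weights (w2_j)
b2 = 0.5                           # output bias (b2)

def forward(x):
    """yhat = w2 . g(W1 x + b1) + b2, with g = tanh."""
    h = np.tanh(W1 @ x + b1)       # hidden-layer outputs
    return w2 @ h + b2             # output layer: weighted summation

x = np.array([0.2, -0.1, 0.4])
print(np.isfinite(forward(x)))     # True: a single finite scalar output
```

With one output neuron the result is a scalar, matching the single-output problem in Figure 2.5; more outputs would simply add rows to the output summation.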
Chapter 5
Back-propagation
5.1 Introduction
The back-propagation algorithm (Rumelhart and McClelland, 1986) is used in layered feed-forward ANNs. This means that the artificial neurons are organized in layers and send their signals forward, and the errors are then propagated backwards. The network receives inputs through the neurons in the input layer, and the output of the network is given by the neurons in the output layer. There may be one or more intermediate hidden layers.
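The forward-then-backward flow described above can be sketched for one hidden layer; the toy data, network sizes, and learning rate are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(1)
X = rng.uniform(-1, 1, (64, 2))            # toy inputs
y = X[:, 0] * X[:, 1]                      # toy target function to learn

W1, b1 = 0.5 * rng.standard_normal((4, 2)), np.zeros(4)  # input -> hidden
w2, b2 = 0.5 * rng.standard_normal(4), 0.0               # hidden -> output
lr = 0.1

def mse_of():
    pred = np.tanh(X @ W1.T + b1) @ w2 + b2
    return float(np.mean((pred - y) ** 2))

mse_before = mse_of()
for _ in range(2000):
    # Forward pass: signals travel from the input layer to the output.
    H = np.tanh(X @ W1.T + b1)             # hidden-layer outputs
    out = H @ w2 + b2                      # network outputs

    # Backward pass: the output error is propagated back through layers.
    err = out - y                          # gradient of MSE w.r.t. output
    dH = np.outer(err, w2) * (1 - H ** 2)  # tanh'(a) = 1 - tanh(a)^2
    w2 -= lr * (H.T @ err) / len(X)
    b2 -= lr * err.mean()
    W1 -= lr * (dH.T @ X) / len(X)
    b1 -= lr * dH.mean(axis=0)

print(mse_of() < mse_before)               # training reduced the error
```

Each iteration performs exactly the two phases named above: a forward pass producing outputs, then a backward pass distributing the error to every weight.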
It is possible to improve generalization by modifying the performance function, adding a term that consists of the mean of the sum of the squares of the network weights and biases:
msereg = gamma * mse + (1 - gamma) * msw,
where gamma is the performance ratio and msw is the mean of the squared network weights and biases.
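The regularized performance function msereg = gamma * mse + (1 - gamma) * msw can be computed as follows; the error values, weight vector, and performance ratio below are illustrative assumptions.

```python
import numpy as np

errors = np.array([0.2, -0.1, 0.4])        # network output errors
weights = np.array([0.5, -1.2, 0.8, 0.1])  # all weights and biases, flattened
gamma = 0.9                                 # performance ratio

mse = np.mean(errors ** 2)                  # mean squared error
msw = np.mean(weights ** 2)                 # mean squared weights and biases
msereg = gamma * mse + (1 - gamma) * msw

print(round(float(msereg), 4))              # 0.1215
```

The msw term penalizes large weights, which tends to produce a smoother network response and hence better generalization.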
Chapter 6
Analysing trends found in the human genome to aid in the understanding of the data compiled by the Human Genome Project.
Self-diagnosis of medical problems using neural networks.
6.2 Conclusion
As the ANN is an emerging technology, it can be used for data analysis in applications such as pattern recognition, prediction, system identification, and control. From the above theory it can be seen that the ANN considered here is a radial basis function back-propagation network. The network is capable of predicting the parameters of an experimental system. The network has a parallel structure and fast learning capacity. Collected experimental data, such as speed, load, and values of pressure distribution, are also employed as training and testing data for the artificial neural network.
The neural network is a feed-forward three-layered network. A quick-propagation algorithm is used to update the weights of the network during training. The ANN has a superior ability to follow the desired results of the system and is employed to analyze such systems' parameters in practical applications.
References
Appendix A
Project Hosting
The report is shared at Academia.edu. The complete report about the seminar is uploaded here for future
reference.
QR CODE: