Eng. Ismail El-Gayar
Under the Supervision of
Prof. Dr. Sheren Youssef
• Introduction
• Understanding the Brain
• Neural Networks as a Paradigm for Parallel Processing
• The Perceptron
• Training a Perceptron
• Multilayer Perceptrons
• Backpropagation Algorithm
• Two-Class
• Training Procedures
  - Improving Convergence
  - Momentum
  - Adaptive Learning Rate
• Learning Time
  - Time Delay Neural Networks
  - Recurrent Networks
Parallel Processing:-

How does our brain manipulate patterns? The human brain contains a massively interconnected net of 10^10 - 10^11 (tens of billions of) neurons. A process of pattern recognition and pattern manipulation is based on:
Massive parallelism: The brain, as an information or signal processing system, is composed of a large number of simple processing elements, called neurons. These neurons are interconnected by numerous direct links, called connections, and cooperate with each other to perform parallel distributed processing.

Connectionism: The brain is a highly interconnected system of neurons, in which the state of one neuron affects the potential of the large number of other neurons it is connected to, according to connection weights or strengths. The key idea of this principle is the functional capacity of biological neural nets.

Associative distributed memory: Storage of information in the brain is supposed to be concentrated in the synaptic connections of the brain's neural network, or, more precisely, in the pattern of these connections and the strengths (weights) of the synaptic connections.
The Biological Neuron:-

[Figure: the schematic model of a biological neuron, showing the soma, axon, dendrites, and synapses from other neurons]

1. The soma, or cell body, is a large, round central body in which almost all the logical functions of the neuron are realized.
2. The axon (output) is a nerve fibre attached to the soma which serves as the final output channel of the neuron. An axon is usually highly branched.
3. The dendrites (inputs) represent a highly branching tree of fibres. These long, irregularly shaped nerve fibres (processes) are attached to the soma.
4. Synapses are specialized contacts on a neuron which are the termination points for the axons from other neurons.
Brain-like Computer

Artificial Neural Network - Mathematical Paradigms of the Brain-Like Computer

The new paradigm of computing mathematics consists of the combination of such artificial neurons into an artificial neuron net.
Brain-like computer - Artificial Intellect with Neural Networks. Application areas:

• Technical Diagnostics
• Robotics
• Machine Vision
• Intelligent Data Analysis and Signal Processing
• Image & Pattern Recognition
• Intelligent Expert Systems
• Intelligent Security Systems
• Intelligent Medicine Devices
Artificial Neural Networks

Perceptrons

• Multiple input nodes
• Single output node
• Takes a weighted sum of the inputs; call this S
• A unit function calculates the output for the network

Useful to study because:
• We can use perceptrons to build larger networks
• Perceptrons have limited representational abilities; we will look at concepts they can't learn later
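The structure described above (a weighted sum S passed through a unit function) can be sketched as follows; the weights and inputs here are illustrative assumptions, not values from the slides:

```python
# Sketch of a single perceptron: weighted sum S of the inputs,
# then a threshold unit function produces the output.
def perceptron(inputs, weights, bias=0.0):
    s = bias + sum(w * x for w, x in zip(weights, inputs))  # weighted sum S
    return 1 if s > 0 else -1                               # unit function

print(perceptron([1.0, 0.5], [0.4, -0.2]))  # S = 0.3 > 0, so output is 1
```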
Why neural networks?

f(x1, ..., xn) is an unknown multi-factor decision rule. A neural network approximates it by a unit function applied to a weighted sum:

f(x1, ..., xn) ≈ φ(z), where z = w0 + w1·x1 + ... + wn·xn
Perceptrons:-

Output: computed using the hardlims (symmetric hard-limit) function
Simple Example: Categorising Vehicles

Input to function: pixel data from vehicle images
Output: numbers: 1 for a car; 2 for a bus; 3 for a tank

General Idea

Input numbers are propagated through the network, producing one output number per category; the category with the largest output value wins. [Figure: example inputs yield outputs 7.1 for Cat A, 0.2 for Cat B, and 0.3 for Cat C, so Cat A is chosen.]
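The "largest output value" rule can be sketched directly; the category labels and output values below are taken from the illustrative figure, not real network outputs:

```python
# Pick the category whose network output value is largest.
outputs = {"Cat A": 7.1, "Cat B": 0.2, "Cat C": 0.3}

def categorise(outputs):
    """Return the category with the largest output value."""
    return max(outputs, key=outputs.get)

print(categorise(outputs))  # Cat A
```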
Unit Functions

• Linear functions: simply output the weighted sum.
• Threshold functions: output low values until the weighted sum gets over a threshold, then output high values; the equivalent of the "firing" of neurons.
  - Step function: output +1 if S > threshold T; output -1 otherwise.
  - Sigma (sigmoid) function: similar to the step function, but differentiable.
Learning in a Perceptron

Learning Process of an ANN

An ANN learns from experience: its learning algorithms recognize patterns of activity. Learning involves three tasks:

1. Compute the outputs.
2. Compare the outputs with the desired targets.
3. Adjust the weights and repeat the process.

[Flowchart: compute output; if the desired output is not achieved, adjust the weights and recompute; otherwise stop.]
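The three-task loop above can be sketched with the classic perceptron learning rule; the learning rate, epoch limit, and AND-gate training data are illustrative assumptions:

```python
# Minimal sketch of the compute / compare / adjust loop.
def train_perceptron(samples, lr=0.1, epochs=20):
    """samples: list of (inputs, target) pairs with target in {-1, +1}."""
    n = len(samples[0][0])
    w = [0.0] * n
    b = 0.0
    for _ in range(epochs):
        converged = True
        for x, target in samples:
            s = b + sum(wi * xi for wi, xi in zip(w, x))
            output = 1 if s > 0 else -1          # 1. compute the output
            if output != target:                 # 2. compare with the target
                converged = False
                for i in range(n):               # 3. adjust the weights
                    w[i] += lr * target * x[i]
                b += lr * target
        if converged:                            # desired outputs achieved: stop
            break
    return w, b

# Learn the logical AND function (with -1/+1 encoding of the output)
data = [((0, 0), -1), ((0, 1), -1), ((1, 0), -1), ((1, 1), 1)]
w, b = train_perceptron(data)
```

Note that this loop only converges when the two classes are linearly separable, which connects to the representational limits mentioned earlier.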
Training a Perceptron:-
4.1 Feed-forward networks: Feed-forward ANNs allow signals to travel one way only, from input to output. There is no feedback (no loops); i.e., the output of any layer does not affect that same layer. Feed-forward ANNs tend to be straightforward networks that associate inputs with outputs. They are extensively used in pattern recognition.

4.2 Feedback networks: Feedback networks can have signals travelling in both directions by introducing loops into the network. Feedback networks are very powerful and can get extremely complicated. Feedback networks are dynamic.
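The one-way signal flow of a feed-forward network can be sketched with a tiny two-layer example; all weights here are illustrative assumptions:

```python
import math

# Signals travel one way only: input -> hidden layer -> output layer, no loops.
def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

def layer(inputs, weights):
    """weights: one row of input weights per neuron in the layer."""
    return [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in weights]

def feed_forward(inputs, hidden_w, output_w):
    hidden = layer(inputs, hidden_w)   # the hidden layer's outputs...
    return layer(hidden, output_w)     # ...feed only the next layer, never back

out = feed_forward([1.0, 0.0],
                   hidden_w=[[0.5, -0.5], [0.3, 0.8]],
                   output_w=[[1.0, -1.0]])
```

A feedback (recurrent) network would instead route some outputs back as inputs on the next step, giving the network an evolving internal state.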
Some Topologies of ANN:-