
x2 1 x1  1 x2  1.

5  0
An Example:
2 x1  x2  1.5  0

 x1 
8
1 1    1.5  0
t

1
 x2 
Sl. x1 x2 y
No
7 linearly separable
1 0.7 0.7 0 w

2 0.8 0.9 1 0 1 2
x1
3 0.8 0.25 0
4 1.2 0.8 1 x1
x1
5 0.6 0.4 0 1 1
6 1.3 0.5 1 y y
x2 1
1 =1.5
7 0.9 0.5 0 x2 -1.5
8 0.9 1.1 1 1

wt x  b  0 w  1 b  1.5 b  


1 0
% Classify an input point with the weights above (w = [1 1], b = -1.5).
x = input('x = ');          % two-element vector [x1 x2]
s = x(1) + x(2);            % net input w'*x (renamed from sum, a built-in)
if s >= 1.5                 % equivalent to w'*x + b >= 0
    out = 1;
else
    out = 0;
end
if out == 1
    disp('diseased')
else
    disp('no disease')
end

x = [.6 .7]   % example input: 0.6 + 0.7 = 1.3 < 1.5, so the output is 'no disease'
Neural Networks
1. Neural Networks (NNs) are networks of neurons, for example, as found in
real (i.e. biological) brains.

2. Artificial Neurons are crude approximations of the neurons found in
brains. They may be physical devices, or purely mathematical constructs.

3. Artificial Neural Networks (ANNs) are networks of Artificial Neurons,
and hence constitute crude approximations to parts of real brains. They
may be physical devices, or simulated on conventional computers.

4. From a practical point of view, an ANN is just a parallel computational
system consisting of many simple processing elements connected together
in a specific way in order to perform a particular task. ANNs are more
accurately described as a class of parallel algorithms.

5. One should never lose sight of how crude the approximations are, and
how over-simplified our ANNs are compared to real brains.
A neural network consists of four main parts:
1. Processing units.
2. Weighted interconnections between the various processing units.
3. An activation rule, which acts on the set of input signals at a unit to
produce a new output signal, or activation.
4. Optionally, a learning rule that specifies how to adjust the weights for
a given input/output pair (all four parts appear in the sketch below).
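
As a concrete illustration, here is a minimal MATLAB sketch that puts the
four parts together, reusing the eight samples from the example above; the
learning rate, epoch count, and zero initialization are arbitrary choices,
not values from the slides.

% Minimal sketch: a single processing unit trained with the perceptron
% learning rule on the example data.
X = [0.7 0.7; 0.8 0.9; 0.8 0.25; 1.2 0.8; 0.6 0.4; 1.3 0.5; 0.9 0.5; 0.9 1.1];
y = [0; 1; 0; 1; 0; 1; 0; 1];
w = [0; 0];                        % weighted interconnections
b = 0;                             % bias
eta = 0.1;                         % learning rate (arbitrary choice)
for epoch = 1:50
    for i = 1:size(X, 1)
        % activation rule: hard-limit (step) function on the net input
        out = double(X(i,:) * w + b >= 0);
        % learning rule: adjust weights for this input/output pair
        err = y(i) - out;
        w = w + eta * err * X(i,:)';
        b = b + eta * err;
    end
end
disp([w' b])                       % learned weights and bias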
Definitions

1. According to Haykin (1994), p. 2:
A neural network is a massively parallel distributed processor that has a
natural propensity for storing experiential knowledge and making it available
for use. It resembles the brain in two respects:
– Knowledge is acquired by the network through a learning process.
– Interneuron connection strengths known as synaptic weights are used to
store the knowledge.

2. According to Zurada (1992), p. xv:
Artificial neural systems, or neural networks, are physical cellular systems
which can acquire, store, and utilize experiential knowledge.

Neural networks are neural in the sense that they may have been inspired
by neuroscience, but not necessarily because they are faithful models of
biological neural or cognitive phenomena. – Mohamad H. Hassoun

It is not absolutely necessary to believe that neural network models have
anything to do with the nervous system, but it helps, because we are able
to use a large body of ideas and facts from …… – J.A. Anderson
Importance of ANN
• They are extremely powerful computational devices
• Massive parallelism makes them very efficient.
• They can learn and generalize from training data – so there is
no need for enormous feats of programming.
• They are particularly fault tolerant – this is equivalent to the
graceful degradation* found in biological systems: 'you could
shoot every tenth neuron in the brain and not even notice it'.
• They are very noise tolerant – so they can cope with situations
where normal systems would have difficulty.

These properties are clearly inherent in the human brain.

* The property that enables a system to continue operating properly in
the event of the failure of some of its components.
Brains versus Computers
1. There are approximately 10 billion neurons in the human cortex, compared
with tens of thousands of processors in the most powerful parallel computers.
2. Each biological neuron is connected to several thousands of other neurons,
similar to the connectivity in powerful parallel computers.
3. The lack of processing units can be compensated for by speed. The typical
operating speed of biological neurons is measured in milliseconds (10^-3 s),
while a silicon chip can operate in nanoseconds (10^-9 s).
4. The human brain is extremely energy efficient, using approximately 10^-16
joules per operation per second, whereas the best computers today use
around 10^-6 joules per operation per second.
5. Brains have been evolving for tens of millions of years, computers have been
evolving for tens of decades.
6. The brain is capable of adaptation by changing its connectivity, but a
computer is hard to make adaptive.
• The brain uses massively parallel computation:
– 10^11 neurons in the brain
– 10^4 connections per neuron
The fundamental processing element of a neural network is a neuron.

[Figure: biological neuron]

A biological neuron:
• receives input signals generated by other neurons through its dendrites,
• integrates these signals in its cell body,
• then generates its own signal (a series of electric pulses) that travels
along the axon, which in turn makes contact with the dendrites of other
neurons.
• The points of contact between
neurons are called synapses.
• Incoming impulses can be
excitatory if they cause firing, or
inhibitory if they hinder the firing
of the response.
After carrying a pulse, an axon is in a state of non-excitability for a certain time
called the refractory period.
McCulloch-Pitts neuron model (1943)
(McCulloch & Pitts, Bulletin of Mathematical Biophysics, 5, 115-133)

MATLAB block

Activation function
0 
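
A minimal MATLAB sketch of this rule; the weights and threshold in the
usage comment are example values, not values from the slides.

% Sketch of a McCulloch-Pitts neuron: inputs x, weights w, threshold theta.
% Usage: mp_neuron([1 1], [1 1], 2) returns 1 (both inputs active).
function out = mp_neuron(x, w, theta)
    % fire (1) when the weighted sum of the inputs reaches the threshold
    out = double(sum(w .* x) >= theta);
end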
Networks of McCulloch-Pitts Neurons
Perceptron
Implementation of Logical NOT, AND, and OR Gates using a perceptron
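
One standard choice of weights and biases for these gates is sketched below
in MATLAB; these particular values are a common textbook choice, not the
only possibility.

% Perceptron gates with the hard-limit rule out = step(w*x' + b).
step = @(v) double(v >= 0);

AND = @(x) step([1 1]*x' - 1.5);   % fires only when both inputs are 1
OR  = @(x) step([1 1]*x' - 0.5);   % fires when at least one input is 1
NOT = @(x) step(-x + 0.5);         % inverts a single binary input

AND([1 1])   % 1
OR([0 1])    % 1
NOT(1)       % 0

Any weights giving the same sign pattern of the net input on the four
input combinations realize the same gate.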
Finding Weights Analytically
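
For instance, the boundary $x_1 + x_2 - 1.5 = 0$ from the example above can
be read off directly as $w = [1\ 1]^T$, $b = -1.5$; a quick MATLAB check
(data copied from the earlier table) confirms these weights classify all
eight samples correctly.

% Check the analytically chosen weights against the example data.
w = [1; 1];  b = -1.5;
X = [0.7 0.7; 0.8 0.9; 0.8 0.25; 1.2 0.8; 0.6 0.4; 1.3 0.5; 0.9 0.5; 0.9 1.1];
out = double(X*w + b >= 0)'   % matches y = [0 1 0 1 0 1 0 1]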
XOR Gate
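
XOR is not linearly separable, so no single perceptron can realize it; a
two-layer construction works. Below is a minimal MATLAB sketch of one such
construction (the specific weights are one standard choice, built from the
identity x1 XOR x2 = (x1 OR x2) AND NOT (x1 AND x2)).

% Two-layer XOR: an OR hidden unit minus an inhibitory AND hidden unit.
step = @(v) double(v >= 0);
xor_net = @(x) step( step([1 1]*x' - 0.5) ...   % hidden unit: OR
                   - step([1 1]*x' - 1.5) ...   % hidden unit: AND (inhibitory)
                   - 0.5 );
arrayfun(@(a,b) xor_net([a b]), [0 0 1 1], [0 1 0 1])   % returns 0 1 1 0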
Homework
Is it possible to realize the following truth table using a perceptron?
If yes, realize it.