Why Neural Networks (ANNs)?
- Nonlinearity
- Input/Output Mapping
- Adaptivity
- Contextual information (local-global interactions)
- Massively parallel architecture
- Neurobiological analogy
Difficulties?
- Need for large (‘big’) ground-truth datasets
- Sensitive hyper-parameters
Structural organization
Brain & NN
Neural network → Network of neurons
A neuron is an information-processing unit
McCulloch–Pitts model
Mathematically, the activation potential of neuron k is
v_k = \sum_{j=0}^{m} w_{kj} x_j, with the fixed input x_0 = +1 at j = 0
(so that w_{k0} plays the role of the bias), and the output is y_k = \varphi(v_k).
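A minimal sketch of the McCulloch–Pitts neuron: a weighted sum (activation potential) followed by a hard threshold. The AND-gate weights and bias below are illustrative choices, not from the slides.

```python
import numpy as np

def mcculloch_pitts(x, w, b):
    """Output 1 if the activation potential v = w.x + b is nonnegative, else 0."""
    v = np.dot(w, x) + b  # bias b plays the role of w_k0 * x_0 with x_0 = +1
    return 1 if v >= 0 else 0

# Illustrative: a 2-input AND gate (weights and threshold chosen by hand)
w = np.array([1.0, 1.0])
b = -1.5
print([mcculloch_pitts(np.array(x), w, b) for x in [(0, 0), (0, 1), (1, 0), (1, 1)]])
# -> [0, 0, 0, 1]
```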
Nonlinear → due to the activation function
Yes or No — the all-or-none property of the McCulloch–Pitts neuron:
a hard threshold, \varphi(v) = 1 if v ≥ 0 and 0 otherwise.
Hyperbolic tangent function: a smooth S-shaped alternative with range [-1, 1],
\varphi(v) = \tanh(v).
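The two activation functions above can be sketched side by side; the sample points are illustrative:

```python
import numpy as np

def threshold(v):
    """All-or-none (McCulloch-Pitts): fires (1) iff v >= 0, else 0."""
    return np.where(v >= 0, 1.0, 0.0)

def tanh_act(v):
    """Hyperbolic tangent: smooth S-shaped curve with range (-1, 1)."""
    return np.tanh(v)

v = np.linspace(-3, 3, 7)
print(threshold(v))  # hard 0/1 jump at v = 0
print(tanh_act(v))   # saturates smoothly toward -1 and +1
```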
Probabilistic / stochastic neuron (a noisy McCulloch–Pitts model):
the neuron state is firing vs. not-firing (output y = +1 or -1), decided
probabilistically rather than deterministically from the activation potential.
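A sketch of one common form of the stochastic neuron, assuming the sigmoidal firing probability P(v) = 1/(1 + exp(-v/T)) with a pseudo-temperature T (this specific rule is an assumption, standard in the literature but not spelled out on the slide):

```python
import numpy as np

rng = np.random.default_rng(0)

def stochastic_neuron(v, T=1.0, rng=rng):
    """Fire (+1) with probability P(v) = 1 / (1 + exp(-v / T)); else -1.
    T is a pseudo-temperature controlling the noise level."""
    p = 1.0 / (1.0 + np.exp(-v / T))
    return 1 if rng.random() < p else -1

# With a strongly positive potential the neuron almost always fires:
samples = [stochastic_neuron(v=3.0) for _ in range(1000)]
freq = sum(s == 1 for s in samples) / 1000
print(freq)  # close to 1 / (1 + e**-3), about 0.95
```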
Graphs representing networks of neurons
Multilayer feedforward network
One or more hidden layers, fully connected.
Hidden layer: each neuron is a computational node.
Extracting a higher-order representation?
The input is called the activation pattern.
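A minimal forward pass through such a network (layer sizes and random weights are illustrative):

```python
import numpy as np

rng = np.random.default_rng(1)

def layer(x, W, b):
    """One fully connected layer: tanh(Wx + b)."""
    return np.tanh(W @ x + b)

# Fully connected 3-4-2 network: the activation pattern feeds a hidden
# layer of computational nodes, whose outputs feed the output layer.
W1, b1 = rng.standard_normal((4, 3)), rng.standard_normal(4)  # hidden layer
W2, b2 = rng.standard_normal((2, 4)), rng.standard_normal(2)  # output layer

x = np.array([0.5, -1.0, 2.0])   # activation pattern
hidden = layer(x, W1, b1)        # higher-order representation of the input
y = layer(hidden, W2, b2)
print(y.shape)  # (2,)
```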
Recurrent network: at least one feedback loop. Two example architectures:
- Without self-feedback loops and without hidden neurons
- With self-feedback loops and with hidden neurons
For a single feedback loop with weight w: operation on the unit circle of the
z-plane is usually considered of interest, as the Fourier transform is defined
there. Further, for |w| < 1 the loop is convergent (stable), and since
w^N ≪ 1 for large N, the influence of past inputs fades away.
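Assuming the standard single-loop feedback system whose impulse response at step n is w^(n+1), the fading-memory behaviour for |w| < 1 can be checked numerically (the value w = 0.5 is illustrative):

```python
import numpy as np

w = 0.5   # feedback weight; |w| < 1 gives a convergent (stable) loop
N = 50

# Impulse response of the single-loop feedback system: y(n) = w**(n+1),
# so the contribution of an input from n steps ago decays geometrically.
y = np.array([w ** (n + 1) for n in range(N)])
print(y[0], y[9])            # 0.5, then ~0.001: rapid geometric decay
print(abs(w) ** N < 1e-12)   # w**N << 1: the distant past is forgotten
```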
Knowledge Representation
Knowledge – stored information
It is goal directed!
• The possible forms in which information can be represented, from the
inputs through the internal network parameters, are highly diverse.
Hence, difficult!
Two parts:
• Prior information
• Observation / Measurements using Sensors
Examples to train the network
Examples: Labeled or Unlabeled
k-fold Cross-Validation
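A minimal sketch of how k-fold cross-validation partitions the training examples: each example appears in exactly one validation fold, and each fold serves once as the validation set while the rest train the network.

```python
import numpy as np

def k_fold_indices(n_examples, k, seed=0):
    """Shuffle example indices and split them into k (nearly) equal folds.
    Yields (train_indices, val_indices) pairs, one per fold."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(n_examples)
    folds = np.array_split(idx, k)
    for i in range(k):
        val = folds[i]
        train = np.concatenate([folds[j] for j in range(k) if j != i])
        yield train, val

# Every example appears exactly once across the validation folds:
splits = list(k_fold_indices(10, k=5))
all_val = sorted(np.concatenate([v for _, v in splits]).tolist())
print(all_val)  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
```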
• Learning can be through positive or negative examples
• Knowledge is represented by free parameters such as synaptic weights
(and biases)
The subject of how knowledge is actually represented inside an artificial
network is, however, very complicated.
Correlation! How?
Embedding prior information
Ad hoc!
Example: a partially connected feedforward network with the same set of
weights shared by every neuron in a layer (weight sharing).
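Weight sharing over local connections amounts to a 1-D convolution: every neuron applies the same kernel to its own window of the input. A minimal sketch (input and kernel values are illustrative):

```python
import numpy as np

def shared_weight_layer(x, w):
    """Partially connected layer: each neuron sees only a local window of
    the input, and all neurons share the SAME weight vector w
    (equivalent to a valid-mode 1-D correlation)."""
    k = len(w)
    return np.array([np.dot(w, x[i:i + k]) for i in range(len(x) - k + 1)])

x = np.arange(6, dtype=float)     # the input field
w = np.array([1.0, 0.0, -1.0])    # one kernel shared by all neurons
print(shared_weight_layer(x, w))  # each output taps a 3-sample window
# -> [-2. -2. -2. -2.]
```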
Receptive field of a neuron:
the region of the input (field) that influences the neuron's output.
Embedding invariances
From an image perspective:
- Rotation-invariant recognition
- Scale-invariant recognition
From a speech perspective:
- Loudness-invariant word recognition from voice
Invariance by structure
Restrictions on the free parameters through subsidiary computations.
*VERY DIFFICULT!
Invariance by training
While training, present a number of examples of the same entity, differing in
the attribute(s) whose invariance is required.
*More computation; and how does it affect classification capability?
Invariant feature space
Inputs are feature values extracted from the signal that are invariant to the
required signal transformations. But how do we know which features?
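One classic illustration of an invariant feature space (an example choice of mine, not from the slides): the magnitude of the DFT of a signal is unchanged by any circular shift, so it gives features invariant to (circular) translation.

```python
import numpy as np

def shift_invariant_features(x):
    """Magnitude of the DFT: identical for every circular shift of x,
    hence an invariant feature space for circular translation."""
    return np.abs(np.fft.fft(x))

x = np.array([1.0, 2.0, 0.0, -1.0, 0.5, 3.0])
shifted = np.roll(x, 2)  # same signal, circularly shifted
print(np.allclose(shift_invariant_features(x),
                  shift_invariant_features(shifted)))  # True
```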
Learning Process
Learning with a teacher
Supervised Learning
Teacher’s knowledge: Input – Output examples
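A minimal sketch of learning with a teacher via error-correction (the delta/LMS rule; the target function and learning rate below are illustrative assumptions):

```python
import numpy as np

def delta_rule_step(w, x, d, eta=0.1):
    """One supervised update: the teacher supplies the desired output d;
    the error e = d - y corrects the weights:
    w(n+1) = w(n) + eta * e(n) * x(n)."""
    y = np.dot(w, x)          # linear neuron output
    e = d - y                 # error signal relative to the teacher
    return w + eta * e * x

# Learn the mapping y = 2*x0 - x1 from input-output examples:
rng = np.random.default_rng(2)
w = np.zeros(2)
for _ in range(2000):
    x = rng.standard_normal(2)
    d = 2.0 * x[0] - 1.0 * x[1]   # the teacher's knowledge
    w = delta_rule_step(w, x, d)
print(np.round(w, 3))  # approaches [2, -1]
```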
Unsupervised Learning
Self-organized!