Assignment Topic:
Discuss the advantages and disadvantages of supervised Hebbian learning.

Group Members: Chivhayo K, Sibanda M, Kunaka J
Student Numbers: R135583A, R0644164, R133830Q, R0645449

Module Code: MIM803
Module Name: Neural Networks
Lecturer: Mrs Ruvinga
Year: 2014
Level: 2.1
Program: Face recognition
Advantages

A neural network learns from data and does not need to be reprogrammed. Once it has been trained, the network is available around the clock and does not forget what it has learnt. Knowledge learnt by one network can be propagated to other neural networks, giving scope for cost reduction.
Using SHL, a neural network can be trained to perform nonlinear analysis and tasks.
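As a concrete illustration of how such training works, the following sketch applies the supervised Hebbian rule to a simple linear associator: each weight update is the outer product of the teacher (target) signal with the input pattern. This is a minimal illustrative example, not the module's prescribed implementation, and all names are ours:

```python
import numpy as np

def train_shl(inputs, targets, lr=1.0):
    """Supervised Hebbian learning for a linear associator.

    For each training pair (x, t), the update is W += lr * outer(t, x),
    i.e. the Hebb rule with the desired output t as the teacher signal.
    """
    n_out = targets.shape[1]
    n_in = inputs.shape[1]
    W = np.zeros((n_out, n_in))
    for x, t in zip(inputs, targets):
        W += lr * np.outer(t, x)  # Hebb rule driven by the teacher signal
    return W

# With orthonormal input patterns, the trained network recalls each
# target exactly when shown the corresponding input.
X = np.eye(3)                                  # three orthonormal inputs
T = np.array([[1., 0.], [0., 1.], [1., 1.]])   # desired outputs
W = train_shl(X, T)
print(W @ X[0])  # recalls the first target: [1. 0.]
```

Because the rule is a single accumulation pass over the data, training is fast; the price is that recall degrades when input patterns are correlated rather than orthogonal.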
Parallelism can be built into SHL-trained neural networks to enhance disaster tolerance. This is achieved by installing more than one similarly trained unit in a single system: if one of the redundant units fails, the remaining unit(s) take over, so the network can continue operating without problems when an element fails.
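The redundancy idea above can be sketched as follows: two identically trained units are queried in turn, and if the primary fails, the backup answers. The failure simulation and all names here are illustrative assumptions, not part of any standard API:

```python
import numpy as np

class Unit:
    """One trained linear unit; `healthy` simulates hardware status."""
    def __init__(self, W, healthy=True):
        self.W = W
        self.healthy = healthy

    def predict(self, x):
        if not self.healthy:
            raise RuntimeError("unit failure")
        return self.W @ x

def fault_tolerant_predict(units, x):
    """Try each redundant unit in turn; the first healthy one answers."""
    for u in units:
        try:
            return u.predict(x)
        except RuntimeError:
            continue  # fail over to the next redundant unit
    raise RuntimeError("all redundant units failed")

W = np.array([[1., 2.], [3., 4.]])   # the same trained weights in both units
primary = Unit(W, healthy=False)     # simulate a failed primary unit
backup = Unit(W)
x = np.array([1., 1.])
print(fault_tolerant_predict([primary, backup], x))  # backup answers: [3. 7.]
```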
Multiple training algorithms are readily available for use.
SHL can be combined with the back-propagation (BP) learning algorithm to solve various classification and forecasting problems. Although BP converges slowly, it converges reliably for many problems.
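To make the BP reference concrete, here is a minimal generic back-propagation sketch on the XOR problem, which is not linearly separable and so needs a hidden layer. This is our own illustrative example under stated assumptions (sigmoid units, squared error, full-batch updates), not taken from the module materials:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy XOR task: inputs and their targets.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

W1 = rng.normal(0.0, 1.0, (2, 4))   # input -> hidden weights
W2 = rng.normal(0.0, 1.0, (4, 1))   # hidden -> output weights

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(X):
    h = sigmoid(X @ W1)
    return h, sigmoid(h @ W2)

_, out = forward(X)
initial_mse = np.mean((out - y) ** 2)

for _ in range(5000):
    h, out = forward(X)
    # Backward pass: gradient of the squared error through the sigmoids.
    d_out = (out - y) * out * (1 - out)
    d_h = (d_out @ W2.T) * h * (1 - h)
    W2 -= h.T @ d_out                # full-batch gradient step, lr = 1
    W1 -= X.T @ d_h

_, out = forward(X)
final_mse = np.mean((out - y) ** 2)
print(f"MSE before {initial_mse:.3f} -> after {final_mse:.3f}")
```

The slow convergence the text mentions is visible here: the loop runs thousands of full passes over four training points before the error becomes small.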
A major advantage of neural networks is that they are data driven and do not require restrictive assumptions about the form of the underlying model.
Disadvantages
Neural networks that use supervised Hebbian learning need training before they can operate, and this training is an additional cost. However, the possible reuse of samples for certain types of training helps to reduce training costs.
SHL adopts a black-box approach, which exposes dependent neural networks to a greater computational burden, proneness to over-fitting, and an empirical style of model development. To overcome this, several techniques, such as feature selection, have been combined with ANNs.
Despite the potential of SHL-based neural networks, we can only get the best out of them when they are integrated with other techniques, such as fuzzy logic.
SHL's black-box learning approach cannot interpret the relationships between inputs and outputs, and cannot deal with uncertainty.
SHL neural networks cannot easily be retrained. Even if new data becomes available later, it is almost impossible to add functionality to an existing SHL neural network.
Although SHL-trained neural networks often exhibit patterns similar to those exhibited by humans, this is of more interest in the cognitive sciences than in practical applications.
Conclusion
Despite its significant drawbacks, the advantages of SHL far outweigh its disadvantages, which will ensure SHL's continued relevance in neural network applications.