Proceedings of the Second IEEE International Conference on Cognitive Informatics (ICCI’03)
0-7695-1986-5/03 $17.00 © 2003 IEEE
what the magnitude of the LTM capacity is, it is believed empirically that the capacity of LTM is, for all intents and purposes, unlimited [3, 4, 7].

Model 1. The functional model of LTM can be described as a set of hierarchical neural clusters with partially connected neurons via synapses.

The LTM model can be illustrated as shown in Fig. 1, where the LTM consists of dynamic and partially interconnected neural networks, and a connection between a pair of neurons by a synapse represents a relation between two cognitive objects. The hierarchical and partially connected neural clusters are the foundation for information and knowledge representation in LTM.

[Fig. 1. An illustration of the LTM model as hierarchical, partially interconnected neural clusters]

• Objects: The abstraction of external entities and internal concepts. There are also sub-objects, known as attributes, which are used to denote detailed properties and characteristics of an object.

• Relations: Connections and relationships between object-object, object-attribute, and attribute-attribute.

Based on the above discussion, an Object-Attribute-Relation (OAR) model of memory is derived as shown below.

Model 2. The OAR model of LTM can be described as a triple, i.e.:

$$OAR \triangleq (O, A, R)$$

where O is a set of objects, A a set of attributes, and R a set of relations.
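The OAR triple lends itself to a direct data-structure reading. The following is a minimal sketch, assuming set-based containers in Python; all concrete names ("apple", "red", and so on) are hypothetical illustrations, not drawn from the paper:

```python
# Minimal sketch of the OAR (Object-Attribute-Relation) model of LTM.
# The concrete names below are illustrative placeholders only.
objects = {"apple", "fruit"}                        # O: abstractions of entities and concepts
attributes = {"apple": {"red", "round"}}            # A: sub-objects denoting properties of an object
relations = {("apple", "fruit"), ("apple", "red")}  # R: object-object and object-attribute connections

OAR = (objects, attributes, relations)

# In this reading, learning adds relations (synapses) between existing
# objects (neurons) rather than creating new objects.
relations.add(("fruit", "round"))
```

Under this reading, the capacity question becomes combinatorial: how many distinct relation sets can a fixed population of objects support?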
On the basis of the OAR model, assuming there are n neurons in the brain, and on average there are m connections between a given neuron and a subset of the rest of them, the magnitude of the brain memory capacity can be expressed by the following mathematical model, the human memory capacity model, as given below:

$$C_n^m = \binom{n}{m} = \frac{n!}{m!\,(n-m)!} \qquad (2)$$

where n is the total number of neurons and m is the average number of partial connections between neurons.

Eq. 2 shows that the memory capacity problem in cognitive science and neuropsychology can be reduced to a classical combinatorial problem: the total number of potential relational combinations, $C_n^m$, among all neurons ($n = 10^{11}$) and their average synapses ($m = 10^3$). Therefore, the parameters of Eq. 2 can be determined as shown in Eq. 3:

$$C_{10^{11}}^{10^{3}} = \binom{10^{11}}{10^{3}} = \frac{10^{11}!}{10^{3}!\,(10^{11} - 10^{3})!} \qquad (3)$$

Eqs. 2 and 3 provide a mathematical explanation of the OAR model, which shows that the number of connections among neurons in the brain can be derived by the combination $C_n^m$.

4. A Computational Solution to the Human Memory Capacity Problem

In the previous section the mathematical model of the human memory capacity was established, and the magnitude of the capacity was estimated to be on the order of $10^{11{,}000}$. This section describes a numerical algorithm for accurately determining the capacity of human memory.

The first difficulty in solving Eq. 3 is its space complexity. To enable the large combination to be processed, a logarithm-based algorithm is developed for calculating the factorials in the memory capacity model by the following equation, i.e.:

$$\ln(n!) = \ln\left(\prod_{i=1}^{n} i\right) = \sum_{i=1}^{n} \ln i \qquad (5)$$

Although the memory overflow problem can be solved by using Eq. 5, the time complexity of computing the sum of logarithms is still another obstacle to be overcome. To reduce the time complexity required to resolve Eq. 3, Eq. 2 can be rewritten as follows:

$$C_n^m = \frac{n!}{m!\,(n-m)!} = \frac{1}{m!}\prod_{i=n-m+1}^{n} i \qquad (6)$$
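As a numerical check, Eqs. 5 and 6 can be combined in a few lines of Python. This is a sketch assuming ordinary double-precision arithmetic, and it reproduces the order of magnitude ($10^{8{,}432}$) reported later in this paper:

```python
import math

def log10_comb(n, m):
    """log10 of C(n, m), computed via the log-sum form of Eqs. 5 and 6."""
    # Numerator of Eq. 6: the product of i from n-m+1 to n, in log space (Eq. 5)
    log_num = sum(math.log(i) for i in range(n - m + 1, n + 1))
    # Denominator of Eq. 6: ln(m!) as a sum of logarithms
    log_den = sum(math.log(i) for i in range(1, m + 1))
    return (log_num - log_den) / math.log(10)

magnitude = log10_comb(10**11, 10**3)  # n = 10^11 neurons, m = 10^3 synapses
print(round(magnitude))                # prints 8432, i.e. C is on the order of 10^8432
```

Because Eq. 6 cancels the huge factor (n-m)!, only about 2m = 2,000 logarithms are summed, instead of the roughly $10^{11}$ terms a direct use of Eq. 5 would require.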
Comparing the above result with the upper-limit estimation given in Eq. 4, the accuracy of this result can be justified.

When m and n in the combination grow larger than those given in Eq. 3, for example $C_{10^{11}}^{10^{10}}$, it will be very difficult to find a direct numerical solution. Therefore, an additional algorithm is developed in this section based on the Trapezoidal rule [19], by which the solution of a huge combination can be regarded as a numerical integration problem. If a specific function can be derived to calculate the sums of Eq. 6, the computational complexity can then be simplified greatly. To create such an analytic function, linear approximation is adopted to calculate the natural logarithm in the estimation, i.e.:

$$\int_a^b f(x)\,dx \approx \frac{b-a}{2}\,\big[f(a) + f(b)\big] \qquad (9)$$

where $f(x)$ is a given function for integration, and a and b are the beginning and end of the integration interval.

By using linear approximation, the remainder can be estimated by the following method:

$$E \approx \frac{1}{2}\, m \max_{1 \le i \le m} E_i \qquad (10)$$

Then, the remainder, E, can be used to compensate the solution to Eq. 10 in order to improve its accuracy. After considering the effect of E, an improved version of Eq. 10 is derived below:

The previous sections of this paper unveil that the magnitude of the memory capacity of the brain may reach an order as high as $10^{8{,}432}$ bits. The magnitude of the brain capacity is believed to be one of the profound advantages of human beings, which forms the quantitative foundation of natural intelligence. The other advantage of human beings is the qualitative foundation of the brain, which possesses an abstract thinking layer built on the extremely large memory capacity available in the brain.

The finding on the magnitude of the human memory capacity reveals an interesting mechanism of the brain: the brain does not create new neurons to represent new information; instead, it generates new synapses between existing neurons in order to represent new information. The observation in neurophysiology that the number of neurons is kept stable, rather than continuously increasing, in adult brains [1-3] is indirect evidence for the relational cognitive model of information representation in human memory as described in this paper.

It is interesting to contrast the memory capacities of modern computers and human beings. The capacity of computer memory (mainly hard disks) has increased dramatically in the last few decades, from a few kB to several GB ($10^9$ bytes) and even TB ($10^{12}$ bytes). Therefore, with an intuitive metaphor that 1 neuron = 1 bit, optimistic vendors of computers and memory chips have perceived that the capacity of computer memory will, sooner or later, reach or even exceed the capacity of human memory [7, 15, 18]. However, according to the finding reported in this paper, the ratio, r, between the brain memory capacity (Cb) and the projected computer memory capacity (Cc) in the next ten years is as follows:
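The trapezoidal approximation of Eq. 9 can be illustrated on the log-sum itself: with unit-step trapezoids, $\sum_{i=1}^{n} \ln i \approx \int_1^n \ln x\,dx + \tfrac{1}{2}\ln n = n\ln n - n + 1 + \tfrac{1}{2}\ln n$. The closed form below is a sketch of this idea, not the paper's exact derivation:

```python
import math

def ln_factorial_trapezoid(n):
    """Approximate ln(n!) = sum_{i=1}^{n} ln i by the trapezoidal rule.

    Unit-step trapezoids over [1, n] give integral(ln x) dx = n ln n - n + 1,
    and the half-endpoint correction restores ln(n)/2. This is an illustrative
    sketch of the numerical-integration idea, not the paper's exact formulas.
    """
    return n * math.log(n) - n + 1 + 0.5 * math.log(n)

exact = math.lgamma(1000 + 1)           # ln(1000!) from the standard library
approx = ln_factorial_trapezoid(1000)   # trapezoidal closed form
```

For n = 1000 the approximation differs from math.lgamma(1001) by less than 0.1; a residual of this kind is what the remainder term E of Eq. 10 is meant to compensate.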
Investigation into the cognitive models of information and knowledge representation in the brain, and into the capacity of the memory, has been perceived to be one of the fundamental research areas that help to unveil the mechanisms and the potential of the brain. The OAR model developed in this paper provides a reference model of information representation and storage for computing and information sciences, as well as the IT industry.

This paper has explored the magnitude of human memory capacity based on the OAR model and a set of mathematical and computational algorithms. The computational solution to the memory capacity of the brain has been obtained as on the order of $10^{8{,}432}$ bits, a magnitude very much higher than the total memory capacity of all computers ever available in the world. The discovery of this paper demonstrates that the magnitude of human memory capacity exceeds that of computers by an order never previously realized. This new factor reveals the tremendous quantitative gap between natural and machine intelligence. The finding of this paper also indicates that the next generation of computer memory systems may be built according to the relational (OAR) model rather than the traditional container metaphor, because the former is more powerful,

[4] Smith, R.E. (1993), Psychology, West Publishing Co., St. Paul, MN.

[5] Sternberg, R.J. (1998), In Search of the Human Mind, 2nd ed., Harcourt Brace & Co., Orlando, FL.

[6] Gabrieli, J.D.E. (1998), Cognitive Neuroscience of Human Memory, Annual Review of Psychology, Vol. 49, pp. 87-115.

[7] Harnish, R.M. (2002), Minds, Brains, Computers: An Historical Introduction to the Foundations of Cognitive Science, Blackwell Publishers Ltd., Oxford, UK.

[8] Kotulak, R. (1997), Inside the Brain, Andrews McMeel Publishing Co., Kansas City, MO.

[9] Leahey, T.H. (1997), A History of Psychology: Main Currents in Psychological Thought, 4th ed., Prentice-Hall Inc., Upper Saddle River, NJ.

[10] Matlin, M.W. (1998), Cognition, 4th ed., Harcourt Brace College Publishers, Orlando, FL.

[11] Payne, D.G. and Wenger, M.J. (1998), Cognitive Psychology, Houghton Mifflin Co., New York.

[12] Turing, A.M. (1936), On Computable Numbers, with an Application to the Entscheidungsproblem,