
On Information and Knowledge Representation in the Brain

Yingxu Wang and Dong Liu


Theoretical and Empirical Software Engineering Research Centre
Dept. of Electrical and Computer Engineering
University of Calgary
2500 University Drive, NW, Calgary, Alberta, Canada T2N 1N4
Tel: (403) 220 6141, Fax: (403) 282 6855
Email: {wangyx, liud}@enel.ucalgary.ca

Abstract

The cognitive models of information representation and the capacity of human memory are fundamental research areas in cognitive informatics, which help to reveal the mechanism and potential of the brain. This paper develops the object-attribute-relation (OAR) model for describing information representation and storage in the brain. According to the OAR model, human memory and knowledge are represented by relations, i.e. connections of synapses between neurons, rather than by the neurons themselves, as the traditional container metaphor suggests. Based on the OAR model, the memory capacity of the human brain is calculated to be in the order of 10^{8,432} bits. Determining the magnitude of human memory capacity is not only theoretically significant in cognitive informatics, but also practically useful for estimating human potential, as well as the gap between natural and machine intelligence.

Keywords: Cognitive informatics, software engineering, knowledge representation, OAR model, memory capacity

1. Introduction

The study of the memory capacity of human brains is a fundamental issue in cognitive science, neuropsychology, and cognitive informatics. The number of neurons in an adult brain has been identified to be in the order of 100 billion (10^11), and each neuron is connected to a large number of other neurons via several hundred to a few thousand synapses [1-5]. However, the magnitude of the memory capacity of human brains is still a mystery. This is mainly because the estimation of this factor is highly dependent on suitable cognitive and mathematic models of the brain, particularly of how information and knowledge are represented and stored in memory.

It is commonly understood that memory is the foundation of any natural intelligence. Cognitive scientists believe that the elementary function and mechanism of the brain are quite simple; however, the neural networks and their concurrent behaviors are extremely powerful as a whole [6-12]. Comparing the human brain with those of other animals, the magnitude of human memory shows a significant advantage. Therefore, accurately determining the magnitude of human memory capacity is not only theoretically significant in cognitive informatics, but also practically useful for unveiling human potential. It is also helpful for perceiving the status and limitations of current memory and computing technologies in computer science and artificial intelligence.

This paper explores the magnitude of human memory capacity based on a cognitive model of the brain and a set of mathematic and computational algorithms. Section 2 introduces the cognitive model of human memory and explains the mechanism of internal knowledge representation. Section 3 establishes a mathematic model of the memory capacity of the brain and gives an initial estimation of its magnitude. Section 4 develops a computational solution for calculating the memory capacity of the brain, by which the memory capacity is obtained to be in the order of 10^{8,432} bits. Additional mathematic evaluation of the estimation is also provided. Section 5 explains the physical and physiological meanings of this discovery. Section 6 draws conclusions based on the finding and discusses its impact and applications in cognitive science, neuropsychology, cognitive informatics, and computing science.

2. The Cognitive Model of Knowledge Representation in the Brain

Human memory encompasses the sensory buffer memory, the short-term memory, the long-term memory [3-5, 13, 14], and the action buffer memory [15]. Among these memories, the long-term memory (LTM) is the permanent memory that human beings rely on for storing acquired information such as facts, knowledge, and skills. Although cognitive science, neurophysiology, and neuropsychology have so far not been able to determine

Proceedings of the Second IEEE International Conference on Cognitive Informatics (ICCI’03)
0-7695-1986-5/03 $17.00 © 2003 IEEE
what the magnitude of the LTM capacity is, it is believed empirically that the LTM is, for all intents and purposes, unlimited [3, 4, 7].

Model 1. The functional model of LTM can be described as a set of hierarchical neural clusters with partially connected neurons via synapses.

The LTM model can be illustrated as shown in Fig. 1, where the LTM consists of dynamic and partially interconnected neural networks, in which a connection between a pair of neurons by a synapse represents a relation between two cognitive objects. The hierarchical and partially connected neural clusters are the foundation for information and knowledge representation in LTM.

Figure 1. LTM: hierarchical and partially connected neural clusters

Conventionally, LTM is perceived as static and fixed in adult brains [3-5, 13, 16]. This perception is based on the observation that the capacity of adult brains has already reached a stable state and does not grow continuously. However, the latest discoveries in neuroscience and cognitive informatics indicate that LTM is dynamically reconfiguring, particularly at the lower levels of the neural clusters [3, 14, 15]. Otherwise, we could not explain the mechanisms of memory establishment, enhancement, and evolution that function every day in the brain.

Actually, the above perceptions are not contradictory. The former notes that the macro-level number of neurons in adult brains does not increase significantly. The latter recognizes that information and knowledge must be physically and physiologically represented in LTM by something and in some way. Based on the latter, a cognitive model of LTM is developed below to explain how information and knowledge are represented in LTM.

In contrast to the traditional container metaphor, the human memory mechanism can be described by a relational metaphor. The new metaphor perceives that memory and knowledge are represented by the connections between neurons in the brain, rather than by the neurons themselves as information containers. Therefore, the cognitive model of human memory, particularly LTM, can be described by two fundamental artefacts:

• Objects: The abstractions of external entities and internal concepts. There are also sub-objects, known as attributes, which are used to denote detailed properties and characteristics of an object.

• Relations: Connections and relationships between object-object, object-attribute, and attribute-attribute pairs.

Based on the above discussion, an object-attribute-relation (OAR) model of memory is derived as shown below.

Model 2. The OAR model of LTM can be described as a triple, i.e.:

    OAR ≜ (o, A, R)    (1)

where o is a given object identified by an abstract name, A is a set of attributes for characterizing the object, and R is a set of relations between the object and other objects or their attributes.

An illustration of the OAR model between two objects is shown in Fig. 2. The relations between objects can be established via pairs of object-object, object-attribute, and/or attribute-attribute. The connections can be highly complicated, while the mechanism is so simple that it can be reduced to the physiological links of neurons via synapses in LTM.

It is noteworthy that, in the OAR model, the relations themselves represent information and knowledge in the brain. The relational metaphor is totally different from the traditional container metaphor in neuropsychology and computer science, because the latter perceives that memory and knowledge are stored in individual neurons, and that the neurons function as containers.

[Figure 2 depicts two objects, Object1 (O1) with attributes a11, a12, ..., a1m and Object2 (O2) with attributes a21, a22, ..., a2n, linked by relations such as r(O1, O2), r(a11, a21), r(O1, a2n), and r(O2, a1m).]

Figure 2. The OAR model of memory architecture
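The OAR triple of Model 2 maps naturally onto a small data structure. The sketch below is illustrative only (it is not from the paper, and the class, object, and attribute names are invented for the example); it shows how knowledge lives in the relation set R, mirroring synaptic connections, rather than inside the objects themselves:

```python
# A minimal sketch of the OAR triple in Eq. 1: OAR = (o, A, R).
# Illustrative names only; not part of the original paper.

class OAR:
    def __init__(self, name):
        self.o = name      # o: the object's abstract name
        self.A = set()     # A: attributes characterizing the object
        self.R = set()     # R: relations to other objects or attributes

    def relate(self, target):
        # Knowledge is carried by the connection itself (the relational
        # metaphor), analogous to a synapse between two neurons.
        self.R.add((self.o, target))

# Two objects connected as in Fig. 2
o1, o2 = OAR("Object1"), OAR("Object2")
o1.A.update({"a11", "a12"})
o2.A.update({"a21", "a22"})
o1.relate(o2.o)      # object-object relation r(O1, O2)
o1.relate("a21")     # object-attribute relation r(O1, a21)
print(len(o1.R))     # 2
```

Under the container metaphor, content would be stored inside each object; here, what Object1 "knows" resides entirely in the (object, target) pairs of R.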

Although the number of neurons in the brain is limited, the possible relations between them may result in an explosive number of combinations that represent knowledge in human memory. Therefore, the OAR model is capable of explaining the fundamental mechanisms of human memory creation, retention, and processing.

3. The Mathematical Model of Memory Capacity of the Brain

According to the OAR model shown in Fig. 2, information is represented in the brain by relations, a conceptual model of synapses. Hence, the capacity of human memory depends not only on the number of neurons, but also on the connections among them. This mechanism may result in an exponential number of combinations for representing and storing information in the LTM of the brain. It also explains why the number of neurons in an adult brain seems stable, yet a huge amount of information can be remembered throughout the entire life of a person.

On the basis of the OAR model, assuming there are n neurons in the brain and, on average, m connections between a given neuron and a subset of the rest, the magnitude of the brain's memory capacity can be expressed by the following mathematical model, the human memory capacity model:

    C_n^m = n! / (m!(n − m)!)    (2)

where n is the total number of neurons and m is the average number of partial connections between neurons.

Eq. 2 shows that the memory capacity problem in cognitive science and neuropsychology can be reduced to a classical combinatorial problem: counting the total potential relational combinations, C_n^m, among all neurons (n = 10^11) and their average synapses (m = 10^3). Therefore, the parameters of Eq. 2 can be determined as shown in Eq. 3:

    C_{10^11}^{10^3} = 10^11! / (10^3! (10^11 − 10^3)!)    (3)

Eqs. 2 and 3 provide a mathematical explanation of the OAR model, showing that the number of connections among neurons in the brain is a combination with a huge base and a large number of choices.

This seems a simple problem intuitively. However, it turns out to be extremely hard to calculate and is almost intractable on a modern computer, because of the exponential complexity and the recursive computational costs for such large n and m. Using approximation theory, however, the upper limit of C_n^m, when n = 10^11 and m = 10^3, can be estimated to be in the following order:

    C_n^m = O(n^m) ≈ n^m = (10^11)^{10^3} = 10^{11,000}    (4)

Eq. 4 demonstrates that the potential capacity of human memory is bounded by 10^{11,000} bits. This is an initial, rough estimation of the magnitude of the human memory capacity.

4. A Computational Solution to the Human Memory Capacity Problem

In the previous section, the mathematical model of human memory capacity was established and the magnitude of the capacity was estimated to be bounded by the order of 10^{11,000}. This section describes a numerical algorithm for accurately determining the capacity of human memory.

The first difficulty in solving Eq. 3 is the space complexity. To enable the large combination to be processed, a logarithm-based algorithm is developed for calculating the factorials in the memory capacity model by the following equation:

    ln(n!) = ln(∏_{i=1}^{n} i) = Σ_{i=1}^{n} ln i    (5)

Although the memory overflow problem can be solved by using Eq. 5, the time complexity of computing the sum of logarithms is still an obstacle to be overcome. To reduce the time complexity required to resolve Eq. 3, Eq. 2 can be rewritten as follows:

    C_n^m = n! / (m!(n − m)!) = (∏_{i=n-m+1}^{n} i) / m!    (6)
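The log-space scheme of Eqs. 5 and 6 can be sketched directly in code. This is an illustrative reconstruction rather than the paper's own program; the log-gamma cross-check at the end is an equivalent closed form of the same quantity (ln C(n, m) = ln Γ(n+1) − ln Γ(m+1) − ln Γ(n−m+1)) that avoids the O(m) summation entirely:

```python
import math

def ln_factorial(n):
    # Eq. 5: ln(n!) as a sum of logarithms, so n! itself is never formed
    return sum(math.log(i) for i in range(1, n + 1))

def ln_comb(n, m):
    # Eq. 6: ln C(n, m) from the top m factors of n!, minus ln(m!)
    return (sum(math.log(i) for i in range(n - m + 1, n + 1))
            - ln_factorial(m))

# Sanity check on a small case: C(10, 3) = 120
print(round(math.exp(ln_comb(10, 3))))  # 120

# Full-size case via the log-gamma function, converted to base 10
n, m = 10**11, 10**3
log10_c = (math.lgamma(n + 1) - math.lgamma(m + 1)
           - math.lgamma(n - m + 1)) / math.log(10)
print(round(log10_c))  # 8432, i.e. C ≈ 10^{8,432}
```

Run as-is, the full-size case reproduces the order of magnitude 10^{8,432} reported in Section 4, well under the 10^{11,000} upper bound of Eq. 4.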

Taking the logarithm of both sides of Eq. 6, we obtain:

    ln C_n^m = Σ_{i=n-m+1}^{n} ln i − Σ_{i=1}^{m} ln i    (7)

Assuming that a natural logarithm plus an addition operation is one unit of computation, the time complexity of Eq. 7 is reduced by 2 × (10^11 − 10^3) compared with that of Eq. 2. By this algorithm, a numeric solution of Eq. 6 is successfully and efficiently obtained as C_{10^11}^{10^3} = 10^{8,432}, i.e.:

    C_{10^11}^{10^3} = 10^11! / (10^3! (10^11 − 10^3)!) = 10^{8,432}    (8)

Comparing this result with the upper-limit estimation given in Eq. 4, the accuracy of the result can be justified.

When m and n in the combination become larger than those given in Eq. 3, for example C_{10^11}^{10^10}, it is very difficult to find a direct numerical solution. Therefore, an additional algorithm is developed in this section based on the trapezoidal rule [19], by which the solution of a huge combination can be regarded as a numerical integration problem. If a specific function can be derived to calculate the sums of Eq. 6, the computational complexity can be simplified greatly. To create such an analytic function, linear approximation is adopted to calculate the natural logarithms in the estimation, i.e.:

    ∫_a^b f(x) dx ≈ ((b − a) / 2) [f(a) + f(b)]    (9)

where f(x) is a given function for integration, and a and b are the beginning and end of the integration interval.

By using linear approximation, the remainder can be estimated by the following method:

    E ≈ −(1/2) m max_{1≤i≤m} E_i    (10)

Then the remainder, E, can be used to compensate the approximate solution in order to improve its accuracy. After considering the effect of E, an improved version of the estimation is derived below:

    ln C_n^m ≈ (m/2) ln(n(n − m + 1)/m) − (1/2) m max_{1≤i≤m} E_i    (11)

Applying Eq. 11 to the case of C_{10^11}^{10^3}, it is obtained that max E_i = 3.98. Thus, the improved estimation of Eq. 11 after compensation is 10^{8,600}, which is closer to the result, 10^{8,432}, obtained by Eq. 8. This is an indirect proof of the accuracy of the solution derived by Eq. 8. Moreover, this algorithm is powerful enough to solve even larger combinatorial problems. For example, C_{10^11}^{10^10} can be efficiently calculated by this algorithm, resulting in 10^{5.9×10^10}.

5. The Physical and Physiological Meaning of the Finding

The previous sections of this paper unveil that the magnitude of the memory capacity of the brain may reach an order as high as 10^{8,432} bits. The magnitude of the brain capacity is believed to be one of the profound advantages of human beings, and it forms the quantitative foundation of natural intelligence. The other advantage of human beings is the qualitative foundation of the brain: the abstract thinking layer built on the extremely large memory capacity available in the brain.

The finding on the magnitude of human memory capacity reveals an interesting mechanism of the brain: the brain does not create new neurons to represent new information; instead, it generates new synapses between the existing neurons. The observation in neurophysiology that the number of neurons remains stable, rather than continuously increasing, in adult brains [1-3] is indirect evidence for the relational cognitive model of information representation in human memory described in this paper.

It is interesting to contrast the memory capacities of modern computers and human beings. The capacity of computer memory (mainly hard disks) has increased dramatically in the last few decades, from a few kB to several GB (10^9 bytes), and even TB (10^12 bytes). Therefore, with the intuitive metaphor that 1 neuron = 1 bit, optimistic vendors of computers and memory chips perceived that the capacity of computer memory will, sooner or later, reach or even exceed the capacity of human memory [7, 15, 18]. However, according to the finding reported in this paper, the ratio, r, between the brain memory capacity (Cb) and the projected computer memory capacity (Cc) over the next ten years is as follows:

    r = Cb / Cc = 10^{8,432} / (8 × 10^12) ≈ 10^{8,432} / 10^13 = 10^{8,419}    (12)

Eq. 12 indicates that the memory capacity of a human brain is equivalent to at least 10^{8,419} modern computers. In other words, the total memory capacity of all the computers in the world is far less than that of a single human brain. Eq. 12 also shows the power of the OAR mechanism and configuration of the brain, which uses only 100 billion neurons and their relational combinations to represent and store up to 10^{8,432} bits of information and knowledge.

The tremendous difference in memory magnitude between human beings and computers demonstrates the efficiency of information representation, storage, and processing in the human brain. Computers store data in a direct and uncompressed manner, while the brain stores information in relational neural clusters. The former can be accessed directly by explicit addresses and can be sorted; the latter may only be retrieved by content-sensitive search and matching among neural clusters, where the spatial connections and configurations themselves represent the information.

6. Conclusions

Investigation into the cognitive models of information and knowledge representation in the brain, and into the capacity of memory, has been perceived as one of the fundamental research areas that help to unveil the mechanisms and potential of the brain. The OAR model developed in this paper provides a reference model of information representation and storage for computing and information sciences, as well as for the IT industry.

This paper has explored the magnitude of human memory capacity based on the OAR model and a set of mathematic and computational algorithms. The computational solution for the memory capacity of the brain has been obtained as in the order of 10^{8,432} bits, a magnitude far higher than the total memory capacity of all computers ever available in the world. This discovery demonstrates that human memory capacity exceeds that of computers by an order we had not previously realized, revealing the tremendous quantitative gap between natural and machine intelligence. The finding also indicates that next-generation computer memory systems may be built according to the relational (OAR) model rather than the traditional container metaphor, because the former is more powerful, flexible, and efficient, and is capable of generating a mathematically unlimited memory capacity using a limited number of neurons in the brain or hardware cells in next-generation computers.

Acknowledgements

This work is sponsored by the Natural Sciences and Engineering Research Council of Canada (NSERC). The authors would like to acknowledge the support of NSERC, and to thank Drs. Josh Leon, Ed Nowicki, Ronald Johnston, and the anonymous reviewers for their valuable comments and suggestions.

References

[1] Marieb, E.N. (1992), Human Anatomy and Physiology, 2nd ed., The Benjamin/Cummings Publishing Co., Inc., Redwood City, CA.

[2] Pinel, J.P.J. (1997), Biopsychology, 3rd ed., Allyn and Bacon, Needham Heights, MA.

[3] Rosenzweig, M.R., Leiman, A.L., and Breedlove, S.M. (1999), Biological Psychology: An Introduction to Behavioral, Cognitive, and Clinical Neuroscience, 2nd ed., Sinauer Associates, Inc., Sunderland, MA.

[4] Smith, R.E. (1993), Psychology, West Publishing Co., St. Paul, MN.

[5] Sternberg, R.J. (1998), In Search of the Human Mind, 2nd ed., Harcourt Brace & Co., Orlando, FL.

[6] Gabrieli, J.D.E. (1998), Cognitive Neuroscience of Human Memory, Annual Review of Psychology, Vol. 49, pp. 87-115.

[7] Harnish, R.M. (2002), Minds, Brains, Computers: An Historical Introduction to the Foundations of Cognitive Science, Blackwell Publishers, Ltd., Oxford, UK.

[8] Kotulak, R. (1997), Inside the Brain, Andrews McMeel Publishing Co., Kansas City, MO.

[9] Leahey, T.H. (1997), A History of Psychology: Main Currents in Psychological Thought, 4th ed., Prentice-Hall, Inc., Upper Saddle River, NJ.

[10] Matlin, M.W. (1998), Cognition, 4th ed., Harcourt Brace College Publishers, Orlando, FL.

[11] Payne, D.G. and Wenger, M.J. (1998), Cognitive Psychology, Houghton Mifflin Co., New York, NY.

[12] Turing, A.M. (1936), On Computable Numbers, with an Application to the Entscheidungsproblem,

Proceedings of the London Mathematical Society, Vol. 2, No. 42, pp. 230-265.

[13] Baddeley, A. (1990), Human Memory: Theory and Practice, Allyn and Bacon, Needham Heights, MA.

[14] Squire, L.R., Knowlton, B., and Musen, G. (1993), The Structure and Organization of Memory, Annual Review of Psychology, Vol. 44, pp. 453-459.

[15] Wang, Y., Johnston, R.H., and Smith, M.R. (eds.) (2002), Proceedings of the 1st IEEE International Conference on Cognitive Informatics (ICCI'02), Calgary, Canada, IEEE CS Press, August.

[16] James, W. (1890), Principles of Psychology, Holt, New York, NY.

[17] Rabin, M.O. and Scott, D. (1959), Finite Automata and Their Decision Problems, IBM Journal of Research and Development, Vol. 3, pp. 114-125.

[18] Sabloniere, P. (2002), GRID in E-Business, Proceedings of the 8th International Conference on Object-Oriented Information Systems (OOIS'02), Montpellier, France, Sept., p. 4.

[19] Jordan, D.W. and Smith, P. (1997), Mathematical Techniques: An Introduction for the Engineering, Physical, and Mathematical Sciences, 2nd ed., Oxford University Press, Oxford, UK.
