
Response of the Brain to Enrichment

by Marian Cleeves Diamond

Abstract:
Before 1960, the brain was considered by scientists to be immutable, subject only to genetic
control. In the early sixties, however, investigators were seriously speculating that
environmental influences might be capable of altering brain structure. By 1964, two research
laboratories proved that the morphology and chemistry of the brain could be experientially
altered (Bennett et al. 1964; Hubel and Wiesel 1965). Since then, the capacity of the brain to
respond to environmental input, specifically "enrichment," has become an accepted fact among
neuroscientists, educators and others. In fact, the demonstration that environmental
enrichment can modify structural components of the rat brain at any age altered prevailing
presumptions about the brain's plasticity (Diamond et al. 1964; Diamond et al. 1985). The
cerebral cortex, the area associated with higher cognitive processing, is more receptive than
other parts of the brain to environmental enrichment. The message is clear: Although the brain
possesses a relatively constant macrostructural organization, the ever-changing cerebral cortex,
with its complex microarchitecture of unknown potential, is powerfully shaped by experiences
before birth, during youth and, in fact, throughout life. It is essential to note that enrichment
effects on the brain have consequences on behavior. Parents, educators, policy makers, and
individuals can all benefit from such knowledge.

Introduction
Can experience produce measurable changes in the brain? The hypothesis that
changes occur in brain morphology as a result of experience is an old one. In
1815 Spurzheim asked whether organ size could be increased by exercise. He
reported that the brain as well as muscles could increase with exercise "because
the blood is carried in greater abundance to the parts which are excited and
nutrition is performed by the blood." In 1874 Charles Darwin mentioned that
the brains of domestic rabbits were considerably reduced in bulk in comparison
with those from the wild because, as he concluded, these animals did not exert
their intellect, instincts, and senses as much as did animals in the wild.
However, it was not until the 1960s that the first controlled studies in animals
demonstrated that enriching the environmental condition in which they were
confined could alter both the chemistry and anatomy of the cerebral cortex and,
in turn, improve the animals' memory and learning ability. In these early
experiments only the brains of young animals were studied. Although many
were impressed to learn that the cerebral cortex could increase its thickness in
response to enriched living conditions, the question arose whether enrichment
might similarly affect older animals. Once middle-aged rats' brains showed
positive responses to enrichment, the next step was to experiment with very old
animals. Once again, increases in cortical thickness were found. It then
became important to discover what was responsible for these changes. One step
at a time, the level of morphological changes -- from neuronal soma size, to
number and length of dendrites, to types and numbers of dendritic spines, to
synaptic thickening, to capillary diameter, and to glial types and numbers -- was
examined. Age, gender, duration of exposure, etc. were critical variables that
had to be tested in new experiments.
Most of the basic data reported on the enrichment paradigm and its impact on
brain and behavior have accumulated through studies on the rat. Effects of
enriched and impoverished environments on the nerve cells and their
neurotransmitters in the cerebral cortex have now been generalized to several
mammalian and avian species (Rosenzweig and Bennett, 1996). Some
corroborating studies mentioned herein involved cats and monkeys, as well as
isolated studies in human subjects. For example, Jacobs et al. (1993) using an
isolated portion of the human cerebral cortex responsible for word
understanding, Wernicke's area, compared the effects of enrichment in tissue
from deceased individuals who had had a college education and from those who
had had only a high school education. They demonstrated that the nerve cells of
the college-educated individuals showed more dendrites than those of the
high-school-educated individuals. (Tissue was obtained from the Veterans
Hospital in West Los Angeles.) Experiments on human tissue frequently support
the data obtained from studies in the rat and, in turn, benefit from these animal
studies. We can now safely say that the basic concept of brain changes in
response to enrichment holds true for a wide variety
of animals and for humans.
The effects of enrichment on the cerebral cortex
What do we mean by "enrichment" for the rats who have served as the animal of
choice for most of these studies? Thirty-six Long-Evans rats were sorted into
three experimental conditions, with 12 animals in each group: 1) enriched,
2) standard, or 3) impoverished environments. All animals had free access to
food and water and similar lighting conditions. Eventually, it was determined
that animals maintained in their respective environments from the age of 30
days to 60 days developed the most extensive cerebral cortical changes. For the
enriched environment, the 12 animals lived together in a large cage (70 x 70 x
46 cm) and were provided 5-6 objects to explore and climb upon (e.g., wheels,
ladders, small mazes). The objects were changed two to three times a week to
provide newness and challenge; the frequent replacement of objects is an
essential component of the enriched condition. The combination of "friends"
and "toys" was established early on by Krech as vital to qualify the experiential
environment as "enriched" (Krech et al. 1960). For the standard environment,
the animals were housed 3 to a small cage (20 x 20 x 32 cm) with no exploratory
objects. For the impoverished environment, one animal remained alone in a
small cage with no exploratory objects. The numbers of animals placed in these
separate conditions were based on the manner in which the routine housing was
established in the rat colony. Housing three rats to a cage has been considered
standard for all experimental work over the decades. Since prior to these
experiments no one had designed studies to examine brain changes in response
to different environmental conditions, the decisions about what represented
"impoverishment" and what represented "enrichment" were more arbitrary than
scientifically reasoned.
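
To keep the three housing conditions straight, the following short sketch simply encodes the group sizes, cage dimensions, and object rotation described above. It is purely illustrative: the data structure and field names are hypothetical and not part of the original protocol; only the numerical parameters are taken from the text.

```python
# Illustrative sketch: parameters are taken from the description above;
# the Python structure itself is hypothetical, not part of the original studies.
from dataclasses import dataclass

@dataclass
class HousingCondition:
    name: str
    rats_per_cage: int
    cage_cm: tuple            # (length, width, height) in centimeters
    exploratory_objects: int  # approximate number of objects present
    object_changes_per_week: int

CONDITIONS = [
    HousingCondition("enriched", 12, (70, 70, 46), 6, 3),   # 5-6 objects, changed 2-3 times/week
    HousingCondition("standard", 3, (20, 20, 32), 0, 0),
    HousingCondition("impoverished", 1, (20, 20, 32), 0, 0),
]

for c in CONDITIONS:
    print(f"{c.name}: {c.rats_per_cage} rat(s), cage {c.cage_cm} cm, "
          f"{c.exploratory_objects} objects changed {c.object_changes_per_week}x/week")
```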
After 30 days in their respective environments, all animals were anesthetized
before the brains were removed for comparison among the three groups.
Twenty-micron frozen sections were cut and stained, and the thickness of the
frontal, parietal and occipital cortices was measured. Results indicated clearly that the
cortex from the enriched group had increased in thickness compared with that
of animals living in standard conditions, whereas the brains from the
impoverished group decreased compared to the standard. Because the nerve
cells were farther apart in the enriched versus the impoverished brains, it was
thought that the major component of the brain changes due to enrichment had
to do with alterations in the dendritic branching. With more detailed studies, the cortical thickness
increases were found to be due to several factors, including increased nerve cell
size, number and length of dendrites, dendritic spines, and length of
postsynaptic thickening as measured on electron microscopic pictures of
synapses (Diamond et al. 1964, 1988).
In the initial experiments designed to explore the impact of an enriched
environment on the brain of post-weaned rats, only enriched and impoverished
groups were used. Rats were maintained in their respective environments from
25 to 105 days of age because there were no available data on how long it would
take to create chemical or structural changes in the cortex. Chemical and
anatomical measurements taken from these animals showed significant
differences between the two groups -- in cortical thickness, cortical weight,
acetylcholinesterase, cholinesterase, protein and hexokinase levels (Bennett et
al. 1964; Diamond et al. 1964). In these initial experiments, however, it was not clear
if the changes were due to enrichment or impoverishment because there were
no standard conditions established as controls.
Nonetheless, the differences in cortical thickness with this 80-day exposure to
the two environmental conditions were not as great as during the 30-day
exposure. Consequently, in subsequent experiments, the period of exposure to
the experimental conditions was reduced from 80 days to 30 days, then 15 days,
7 days and finally to 4 days. At each of these intervals, animals from the
enriched environment showed increases in cerebral cortical thickness in some
areas but not in others. For example, in the male animals exposed for 80 days to
enriched conditions, the somatosensory cortex did not show significant changes,
whereas male animals exposed for 30 days did develop significant differences in
the somatosensory cortex. The occipital cortex showed significant changes for
both the 80- and the 30-day experiments, but, again, the differences were
greater at 30 days than at 80 days. It is possible that the longer exposure served
to increase cortical thickness in the early days of enrichment but that over time
the environmental condition became monotonous and this effect decreased.
In later experiments the experimental conditions were modified to try to
establish what the major factors were that created the observed cortical changes.
For example, was the effect associated with the number of rats exposed or with the
presence of stimulus objects? The new conditions included one rat living alone
in the large enrichment cage with the objects that were changed several times
each week. The cortex of these rats did not show a significant effect of
enrichment. Twelve rats living together in the large cage without the stimulus
objects did not show as great an effect as 12 rats living with the stimulus objects.
In other words, the combination of social conditions and frequent exposure to
new stimulus objects was necessary for the animals to gain the full effect of
enrichment.

Establishing what constitutes "enrichment" for human beings is more
problematic. Not only are controlled experiments not feasible, but no two
human brains are identical. Individuals differ in their genetic backgrounds and
environmental inputs. Furthermore, what is considered enrichment for one
individual may be quite different for another. Yet, as mentioned earlier, the
enrichment effect was evident in Wernicke's area from measurements of the
amount of dendritic branching in brain tissue from college-educated individuals
versus that from high school-educated people. The basic finding of dendritic
growth in response to environmental stimulation appears in all brains studied
to date. It would appear that newness and challenge are important for the
human cortex as well as for that of animals.

Figure 1. Two possible patterns of age-related alterations in cortical pyramidal
cells. The normal mature neuron (A) may show regressive dendritic changes
characterized by loss of basilar dendritic branches and eventual loss of the
entire dendritic tree (D, E, F). Other neurons (B, C) may show a progressive
increase in dendritic branching. Drawing based on Golgi impregnations.

Independent variables: age and gender


Among the many variables researchers must consider as they seek to
understand and accurately interpret the effects of enrichment on the brain, age
and gender are important considerations. Enrichment has been shown to
enhance many aspects of cortical structure at any age--from prenatal to
extremely old rats (904 days of age). The amount of change varies with the age
of the animal. For example, when a 30-day-old rat is put in an enriched
environment for four days, the effects are not as pronounced as they are in the
60-day-old rat maintained in enriched conditions for four days. Is four days too
short a time for the very young animal to adjust and benefit from enrichment? A
young animal maintained for 30 days in an impoverished environment shows
reduced morphological development of its cortex when compared to that of an
adult animal maintained in impoverished conditions for 30 days. In further
age-related experiments, another component was added to the enrichment
conditions of old rats. Despite significant increases in the length of the dendrites
in the brains of 600-day-old rats that had been placed in an enriched
environment for 30 days (600 to 630 days), several of the old rats in this
population died. To determine whether the enrichment conditions could be
modified to extend the animals' life span, the investigators added a new
component: hand-holding the rats each day for several minutes while the cages
were cleaned. In an attempt to increase the life span of the rats, the animals were placed
three to a cage after weaning at 25 days of age, and maintained in these
standard conditions until they reached 766 days, at which time half went into
enriched conditions until they reached 904 days of age and half stayed in the
standard conditions. The only variable added was the daily hand-holding of the
rats as they aged. Is it possible that handling the rats had extended their life
span? Indeed, many investigators have been amazed that these rats survived to
904 days of age. The 904-day-old rats in enriched conditions developed a cortex
significantly thicker than the cortex of rats living in the standard conditions
(Diamond 1988). These experiments offered support to the thesis that the
cerebral cortex is capable of responding positively to an enriched environment
at any age.
Experiments comparing the effects of enrichment on male and female brains are
few. Most enrichment studies have been carried out on male brains to avoid the
confounding factors associated with the estrous cycle. In one study focused on
gender, the female neocortex was found to respond differently from the male
neocortex exposed to the same type of enrichment conditions (Diamond 1988).
The male showed significant changes in cortical thickness in the occipital cortex,
but no significant changes in the somatosensory cortex. (Although the right
cerebral cortex in the brain of the male rat is thicker than the left, especially in
the visual or occipital region, an enriched environment appears to alter both the
right and left cortex similarly.) In the female, the thickness of the occipital
cortex increased significantly in response to enrichment, although not as much
as in the male, but the thickness of the somatosensory cortex increased
significantly more in the female than in the male. In a follow-up experiment,
however, in which obstacles were piled up in front of the female food cup to
provide a greater challenge to her already enriched environment, the thickness
of the occipital cortex increased as much as did that of the male without the
additional challenge. In rats whose testes were removed either at birth or at 30
days of age before the rats were placed in an enriched environment for 30 days,
the increases observed in cortical thickness were similar to those of their
littermates with intact testes (Diamond 1988). These findings suggested that
testosterone is not implicated in the increases in cortical thickness observed in
the brains of rats living in enriched environments. Since sex differences were
evident in the responses of the animals to enrichment, interest was now focused
on the brains of pregnant rats, in which sex steroid hormone concentrations are
greatly altered. The brains of female rats that lived in the enriched environment
from 60 to 90 days of age, then became pregnant and returned to enrichment
until 116 days of age, were compared with those of pregnant animals living in
an impoverished environment for the same time period. When animals from the two groups were autopsied at 116
days, no significant differences in cortical thickness were found. Evidently,
pregnancy has an effect on the cerebral cortex regardless of whether the
environment is impoverished or enriched. These initial experiments, all of
which were replicated, clearly indicate gender differences in the brain's
response to enrichment.
Dependent variables
Having dealt with the independent variables, we turn to the impact of
dependent variables in the enrichment paradigm. For these studies, one must
look at: duration of exposure, brain anatomy and chemistry, presence of lesions
or fetal neocortical grafts, negative air ions, stress, physical activity and
nutrition, as well as behavioral effects. These are discussed in turn below.

Duration
The duration of exposure to the enriched environment is clearly a
significant dependent variable that must be factored into research in this
area. As short a period as 40 minutes of enrichment has been found to
produce significant changes in RNA and in the wet weight of cerebral
cortical tissue sampled. One day of enrichment was insufficient to
produce measurable changes in cortical thickness, whereas four
consecutive days of exposure (from 60 to 64 days of age) to an enriched
environment did produce significant increases in cortical thickness, but
only in the visual association cortex (area 18) (Diamond 1988). When
young adult rats were exposed to 30 days of enrichment, however, the
entire dorsal cortex, including frontal, parietal and occipital cortices,
increased in thickness. Extending the duration of the stay in enriched
conditions to 80 days did not produce any greater increase in cortical
thickness than that seen at 30 days (in fact, it was often even less);
however, the longer the rat remained in the enriched conditions, the
longer the cortex retained its increased dimensions following return to
the standard environment (Bennett et al. 1974). When we looked at age-related
differences in the context of duration of stay in the enriched
environment, we found that old rats (766 days of age) placed in enriched
conditions for 138 days showed an increase in cortical thickness that was
quite similar to that observed in young adult rats (60 days of age) that
had lived in enriched conditions for 30 days.
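
As a purely illustrative recap of this subsection, the dose-response pattern for young adult rats can be written as a small lookup table; nothing below comes from data not already stated in the text, and the code itself is only a sketch.

```python
# Illustrative only: restates the exposure durations and reported outcomes described above.
ENRICHMENT_DURATION_EFFECTS = {
    1:  "no measurable change in cortical thickness",
    4:  "significant increase, but only in visual association cortex (area 18)",
    30: "significant increases across frontal, parietal and occipital cortices",
    80: "no greater increase than at 30 days, but longer retention after return to standard housing",
}

for days, effect in sorted(ENRICHMENT_DURATION_EFFECTS.items()):
    print(f"{days:>2} days of enrichment: {effect}")
```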

Anatomical and chemical components
Early experiments, and those to follow in subsequent years, again
demonstrated significant differences in brain chemistry and anatomy
associated with enriched living conditions. Anatomical increases include
all of the structural constituents measured in the cerebral cortex to date,
such as cortical thickness (Diamond et al. 1964), nerve cell soma size,
nerve cell nuclear size (Diamond, 1988), dendritic dimensions (Holloway
1966; Greenough et al. 1973), dendritic spines, synaptic size and number
(Mollgaard et al 1971; Black et al 1990), number of glia, capillary
diameter (Diamond 1988), dendritic number after lesions (McKenzie
1990), and successful tissue grafts (Mattsson et al. 1997). Chemical
increases include total protein, RNA-to-DNA ratio, cholinesterase-to-acetylcholine
ratio, nerve growth factor mRNA, cyclic AMP, choline acetyltransferase,
cortical polyamines, NMDA (N-methyl-D-aspartate) receptors, and hexokinase,
among others.

Lesions
Another variable has to do with the impact of enriched conditions on
purposefully incurred brain lesions. In a 1990 study, 60-day-old rodents
were exposed for 30 days to either an enriched or standard environment
two days after having received a lesion in the left frontal cortex that
created a motor dysfunction in the right forepaw. Animals living in the
enriched condition showed significant increases in cortical dendritic
branching in both hemispheres, the lesioned and the non-lesioned sides,
along with a significant return of motor function in the right forepaw
compared to those animals living in standard conditions (McKenzie 1990).

Fetal neocortical grafts
Similarly, providing an enriched environment to rats that had undergone
fetal neocortical grafts one week after lesioning was found to improve
behavioral outcome and to reduce atrophy in the thalamus, a major structure
beneath the cortex that supplies neural input to the cortex (Mattsson et
al. 1997). The fact that the fetal neocortical graft, when placed in the
lesioned cerebral cortex, could prevent atrophy in the underlying
thalamus as a consequence of enrichment is of great interest to
researchers considering the future possibility of using such grafts for
brain-damaged individuals.

Air ions
The possibility that physical environmental stimuli other than those
classically regarded as "sensory" could have an effect on the brain was
tested experimentally by exposing rats living in enriched or standard
environments to high concentrations of negative air ions. The
experiments were undertaken to determine whether the effects of negative
ions on serotonin, on the putative second messenger cyclic AMP, and on
cyclic GMP in the cerebral cortex differ depending on whether the
animals lived in enriched or standard conditions. Studies demonstrated
that rats placed in the enriched environment in the presence of enhanced
negative air ions (ion density of 1 × 10^5) showed a significant decrease in
serotonin, an effect not found in the brains of animals living in standard
conditions (Diamond et al. 1980). Cyclic AMP levels
decreased as well in the brains of the animals living in the enriched
conditions, but cyclic GMP did not. These results indicate the importance
of considering air quality and atmospheric conditions in determining the
brain's response to enrichment.

Stress
The presence or absence of stress represents yet another variable to be
taken into consideration in such studies, certainly so in any extrapolation
of these findings to humans. Stress is a major factor in contemporary,
fast-moving urban life. Crowding, for example, is deemed stressful under
conditions where competition for space or food is likely. Experiments
were set up to assess the effect of crowding on the brains of rats
maintained in an enriched environment. To create a condition in which
crowding would be experienced as stressful, 36 rats were placed in an
enrichment cage usually housing only 12 rats, and kept there for 30 days.
The results indicated that, compared with rats living in standard
conditions, the thickness of the medial occipital cortex increased
significantly whether the enrichment cage housed 12 or 36 animals
(Diamond et al. 1987). One hypothesis to come from this study was that
the animals' interaction with the toys might be diverting their attention
or entertaining them sufficiently to mitigate the stress of the crowded
condition. Chronic stress has been reported by Meaney et al. (1988) to
produce excess glucocorticoids, which are toxic to neurons--especially
those of the hippocampus. Aged rats are particularly vulnerable to
chronic stress. The investigations of Meaney showed that enriching the
living conditions of old rats, or handling them in their infancy, helps to
prevent stress-related hippocampal damage. It is possible that stress can
be produced by increasing the frequency with which the various objects
in the enrichment cage are changed. In all previous studies, objects had
been replaced daily or at least several times each week. Then the question
was asked whether increasing the frequency of changing the objects
would further increase cortical thickness or, alternatively,
whether it would be experienced as a stressor, given that the
animals were inhibited from interacting with the objects in the more leisurely
manner to which they were accustomed. For these experiments, rats 60
to 90 days of age found their objects changed every hour for three hours
on four nights of each week for four consecutive weeks. Under this
regime, cerebral cortical thickness did not increase significantly
compared with that of rats whose objects were changed several times
each week for four weeks (Diamond, unpublished). Corticosteroids,
released under stress, have been shown to reduce cortical thickness and
future experiments would be necessary to compare differences in
corticosteroid levels in animals exposed to these differing conditions.

Behavior
Psychologists have known for a long time that early experience influences
the adult performance of an animal. In experiments in the 1950s
(Bingham et al 1952 and Forgays and Forgays 1952) investigators were
interested in determining how much experience in complex
environments was necessary to produce a highly intelligent adult animal
and when, specifically, during early life these experiences had to occur.
These studies showed that all of the animals maintained in enriched
conditions were better problem solvers than those with no enrichment;
however, on other occasions, using other tests, enriched rats did not
perform significantly better than controls. One of the most robust effects
of environmental enrichment on the behavior of rats appears in the areas
of learning and memory. Investigators (York et al 1989 and Kempermann
et al 1997) studying the effects of enrichment on the rodent brain have
reported that new nerve cells develop in the adult dentate gyrus, an area
dealing with recent memory processing. In the York experiments the rats
were 60 to 90 days of age (truly adult animals) during the enrichment
experience, whereas in the Kempermann experiments the mice were 21
to 40 days of age. These findings are significant because neurogenesis had
not previously been found in the cerebral cortex of the mammalian adult.
Earlier studies had found that enriched environments stimulate the
growth of dendrites in the dentate gyrus, but only in female rats
(Juraska et al. 1985).

Physical activity
One component of enrichment is the physical exercise involved in the
animals' having to move about the cage, interacting with and climbing
upon the novel objects. These activities appear to influence the motor
cortex as well as the hippocampus. Olsson et al. (1994) showed that rats
living in enriched environments at 50 days of age had higher
hippocampal expression of the genes encoding the glucocorticoid receptor
and NGFI-A (nerve growth factor-induced gene A).

Nutrition
Nutrition is clearly an important variable to consider in all studies
dealing with brain and behavior. Environmental enrichment and
impoverishment have pronounced effects on nutritionally deficient
animals. One study compared the effects of environmental enrichment
on the offspring of mother rats living on protein-rich or protein-deficient
diets during pregnancy (Carughi et al. 1990). The protein-rich diet
proved beneficial for the healthy development of the cerebral cortical
dendrites in young rats and even more so when combined with an
enriched environment. The cerebral cortical dendrites in rat pups from
mothers with a protein-deficient diet were significantly less well
developed than those of their counterparts, but, of greater importance,
the cortex from the protein-deficient animals did not significantly
increase with enrichment. On the other hand, when protein-deficient
pups were fed a protein-rich diet and maintained in an enriched
environment during their early postnatal life, cortical development
improved almost to the level seen in rat pups from mothers on a high-protein
diet during pregnancy followed by postnatal enrichment. These
data are very encouraging, because they suggest the possibility of making
up for lost brain growth during pregnancy by enriching both the diet and
the environmental conditions during the postnatal period. Another
dietary factor significant to optimal brain function is glucose. The brain
depends almost exclusively on glucose for its energy. Synapses use a
great deal of energy, and glucose supplies this energy. We know that
different parts of the brain use glucose at different rates. To learn
which of 30 discrete brain regions were most active in adult rats placed in
enriched living conditions from 57 to 87 days of age, we studied their
radioactive glucose uptake during this 30-day period and compared it
with that of rats raised in standard conditions (Diamond et al. 1988).
Again, the cerebral cortex showed the greatest differences between
enriched and non-enriched groups, but, surprisingly, of the two groups,
glucose uptake was lower in rats maintained in enriched conditions. We
concluded from this finding that glucose uptake is more efficient in the
brain of animals living in enriched environments. Out of the 30 areas of
the brain measured, including the cortex, only one area showed
significantly greater glucose uptake in the enriched animals: the corpus
callosum, the large mass of axons connecting the nerve cells
of the two cerebral hemispheres. Could the axons forming the
corpus callosum from the nerve cells in the cerebral cortex be more active
than the nerve cell bodies from which they arise? The right and left
cerebral cortices show comparable cortical thickness increases with
enrichment due to the effects on dendritic branching, yet the data
show that the rates of glucose utilization in both the frontal and parietal
cortices were 13% lower in the enriched rats than in the standard control
rats, a paradox to be untangled in the future.
Methodological issues associated with enrichment research in
humans
Of the vast number of animal studies that yield results of interest to human
research, those on the impact of an enriched environment on brain
development and behavior are of particular relevance. Despite
similarities in some key respects between the brain of the rat and that of other
mammals, replicating or extrapolating from anatomical and chemical studies
conducted in animals to humans is fraught with difficulty, for obvious reasons. Not only is
it not presently possible to control all of the experimental variables at work in
humans, but the diversity and complexity of human experience militate against
designing experiences comparable to those used with lower animals.
Nevertheless, these studies, and what few human studies have been done,
suggest that there are measurable benefits to enriching an individual's
environment, in whatever terms that individual perceives his or her immediate
environment as enriched. At the very least, this work indicates that there are
many opportunities for enhancing brain activity and behavior at all ages, and
that they can have pronounced effects throughout the life span.

References
Bennett E L, Diamond M C, Krech D, Rosenzweig M R 1964 "Chemical and Anatomical
Plasticity of the Brain." Science 146:610-619
Bennett E L, Rosenzweig M R, Diamond M C 1974 "Effects of successive environments on brain
measures." Physio. and Behavior 12:621-631

Bingham W E, Griffiths W J 1952 "The effect of different environments during infancy on adult
behavior in the rat." J. Comp. Physiol. Psychol. 45:307-312
Black, J E, Isaacs K R, Anderson B J, Alcantara A A, Greenough, W T 1990 "Learning causes
synaptogenesis, whereas motor activity causes angiogenesis, in cerebellar cortex of adult
rats." Proc. Nat. Acad. Sci. 87:5568-5572
Carughi A, Carpenter K J, Diamond M C 1990 "The Developing Cerebral Cortex: nutritional and
environmental influences." Malnutrition and the Infant Brain. Wiley-Liss p.127-139
Darwin C 1874 The Descent of Man, Rand McNally, Chicago ed 2
Diamond M C, Krech D, Rosenzweig M R 1964 "The effects of an Enriched Environment on the
Rat Cerebral Cortex." J. Comp. Neurol. 123:111-119
Diamond M C, Connor J R, Orenberg E K, Bissell M, Yost M, Krueger A 1980 "Environmental
Influences on Serotonin and Cyclic Nucleotides in Rat Cerebral Cortex." Science 210:652-654
Diamond M C, Greer E R, York A, Lewis D, Barton T, Lin J 1987 "Rat Cortical Morphology
Following Crowded-Enriched Living Conditions." Exp. Neurol. 96:241-247
Diamond M C, 1988 Enriching Heredity, The Free Press, New York
Forgays D G, Forgays J W 1952 "The nature of the effect of free environmental experience in the
rat." J. Comp. Physiol. Psychol. 45:322-328
Greenough, W T, Volkman R and Juraska J M 1973 "Effects of rearing complexity on dendritic
branching in fronto-lateral and temporal cortex of the rat." Exp. Neurol. 41:371-378
Holloway, R L 1966 "Dendritic branching: some preliminary results of training and complexity
in rat visual cortex." Brain Res. 2:393-396
Hubel D H and Wiesel T N 1965 "Binocular Interaction in Striate Cortex of Kittens Reared with
Artificial Squint," J. Neurophysiol. 28:1041-1059
Jacobs B, Schall M, Scheibel A B 1993 "A Quantitative Dendritic Analysis of Wernicke's Area in
Humans. II. Gender, Hemispheric, and Environmental Changes." J. Comp. Neurol. 327:97-111
Juraska J M, Fitch J M, Henderson C, Rivers N 1985 "Sex Differences in the Dendritic
Branching of Dentate Granule Cells Following Differential Experience." Brain Res. 333:73-80
Kempermann G, Kuhn H G, Gage F H 1997 "More Hippocampal Neurons in Adult Mice Living
in an Enriched Environment." Nature 386: 493-495
Krech D, Rosenzweig M R , Bennett E L 1960 "Effects of environmental complexity and training
on brain chemistry." J. Comp. Physiol. Psychol. 53:509-519
Mattsson B, Sorensen J C, Zimmer J, Johansson B B 1997 "Neural Grafting to Experimental
Neocortical Infarcts Improves Behavioral Outcome and Reduces Thalamic Atrophy in Rats
Housed in Enriched but not Standard Environments." Stroke 28(6):1225-1231
McKenzie A, Diamond M C, Greer E R, Woo L, Telles T 1990 "The Effects of Enriched
Environment on Neural Recovery Following Lesioning of the Forelimb Area of Rat Cortex." Am.
Physical Therapy Annual Conf. Anaheim, CA

Meaney M J, Aitkin D H, Bhatnagar S, Van Berkel C, Sapolsky R M 1988 "Postnatal Handling
Attenuates Neuroendocrine, Anatomical and Cognitive Impairments Related to the Aged
Hippocampus." Science 239:766-768
Mollgaard, K, Diamond, M C, Bennett E L, Rosenzweig M R, and Lindner B 1971 "Quantitative
synaptic changes with differential experience in rat brain." Int. J. Neurosci. 2:113-128
Olsson T, Mohammed A H, Donaldson L F, Henriksson B G, Seckl J R 1994 "Glucocorticoid
Receptor and NGFI-A gene Expression Are Induced in the Hippocampus After Environmental
Enrichment in Adult Rats." Mol. Brain Res. 23:349-353
Rampon C, Jiang C H, Dong H, Tang Y-P, Lockhart D J, Schultz P G, Tsien J Z, Hu, Y 2000
"Effects of environmental enrichment on gene expression in the brain." Proc. of the Nat. Acad.
of Sci. of USA 97(23):12880-12884
Rosenzweig M R, Bennett E L 1996 "Psychobiology of Plasticity: Effects of Training and
Experience on Brain and Behavior." Behav. Brain Res. 78:57-65
Spurzheim J C 1815 The Physiognomical System of Drs Gall and Spurzheim. Baldwin Cradock
and Joy, London ed 2: 554-555
York A D, Breedlove S M, Diamond M C 1989 "Housing Adult Male Rats in Enriched Conditions
Increases Neurogenesis in the Dentate Gyrus." Soc. for Neurosci. Abstracts #383.11 p. 962

About the Author:


Dr. Marian C. Diamond is professor of anatomy and one of the world's foremost
neuroanatomists. She is author of more than 100 scientific articles and three books,
including Enriching Heredity (Free Press/Simon and Schuster, 1988). You can reach Marian
Diamond, Ph.D., at the Department of Integrative Biology, 3060 Valley Life Sciences Building,
University of California, Berkeley 94720, USA, or by e-mail: diamond@socrates.berkeley.edu.

The Significance of Enrichment


The following article is taken from Enriching Heredity, by Marian Cleeves Diamond, copyright
(c) 1988 by Marian Cleeves Diamond. Reprinted by permission of The Free Press/Simon and
Schuster. Please do not repost or recirculate without obtaining express permission in writing
from the author and The Free Press/Simon and Schuster.

by Marian Diamond, Ph.D.

How much more do we know about how the brain works than we knew before
we conducted our enriched and impoverished studies? We have learned a great
deal about the interaction of the external and internal environment with the
structure of the brain. We have learned that different regions of the cortex
increase in size as the duration of exposure to the stimulating conditions is
extended. We have learned that every layer of cortical neurons in area 18, the
area responsible for visual integration, responds to our experimental
environment, with outer layers, II and III, showing the greatest changes. The
neurons in the cerebral cortex exhibit an impressive amount of plasticity. We
have learned that every part of the nerve cell from soma to synapse alters its
dimensions in response to the environment.
The enlarged nerve cells, with their more numerous glial support cells, apparently
enable the rat to solve maze problems more effectively than rats
without such modified cells. The mechanism by which the enlarged nerve cells
improve learning ability is not yet known, but these findings clearly
demonstrate brain enlargement as a result of brain use. One often wonders how
we can hold a train of thought for hours or record a memory for an extended
period of 90 years or more if the flexibility of the cortical structures is so great.
Obviously, some molecular configurations must remain stable at the same time
that others exhibit change.
Just as the cortical neurons become larger in a stimulating environment, they
decrease in size when there is less input from the millions of sensory receptors
reporting from the body surface and the internal organs. It is just as important
to stress the fact that decreased stimulation will diminish a nerve cell's
dendrites as it is to stress that increased stimulation will enlarge the dendrite
tree. We have seen how readily the cortical thickness diminishes with an
impoverished environment, and at times, the effects of impoverishment are
greater than those brought about by a comparable period of enrichment. These
cellular changes that we have measured in the brain provide us with a better
understanding of how the environment, interacting with a hereditary base,
possibly influenced the brains of higher organisms, including the human brain.
Those members of species which happened, by genetic happenstance, to have
free extremities, a tendency to explore, and/or bigger brains, were better able to
survive and pass on those genes. The upright human, with free upper
extremities, continuously sought new challenges, new enriched conditions, and
in turn could alter the dimensions of his brain. It is the interaction of the
environment with heredity which has changed the brain over millions of years.

Perhaps the single most valuable piece of information learned from all our
studies is that structural differences can be detected in the cerebral cortices of
animals exposed at any age to different levels of stimulation in the environment.
First, we found that young animals placed in enriched environments just after
weaning developed measurable changes in cortical morphology. Then, we
worked backward in age to the animals not yet weaned and found such changes,
and we even found measurable effects of prenatal enrichment. Later, we moved
forward in age to learn that the enriched young adult demonstrated an increase
in dendritic growth, not only above that found in his impoverished mates, but
even above the level of the standard colony animal. In the very old animal, with
the cortex following its normal decline with aging, we again found the enriched
cortex significantly thicker than the nonenriched. In fact, at every age studied,
we have shown anatomical effects due to enrichment or impoverishment. The
results from enriched animals provide a degree of optimism about the potential
of the brain in elderly human beings, just as the effects of impoverishment warn
us of the deleterious consequences of inactivity. Our ultimate goal in studying
the aging animal brain is to bring as much dignity as possible to the aging
human being, to indicate the potential of aging cerebral cortical cells, and to
challenge the myths regarding the aging brain by critically evaluating them.
For example, one of the most prevalent popular beliefs is that once we reach
adulthood our brain cells are dying by the hundreds each day and therefore our
mental capacities must be diminishing as well. The belief received support in
1958, when Benedict Burns calculated from Brody's data and Leboucq's data
that during every day of our adult life more than 100,000 neurons die. These
depressing data were derived in the following manner. Brody's estimation of
neuron loss in the human cortex between 20 and 80 years of age was 30%, and
Leboucq found a decrease in surface area of the brain between the ages of 20
and 76 years amounting to some 10%. Burns' estimated daily cell loss has been
frequently quoted. More recently, however, Brody noted the prominence of this
information in the lay literature and rejected it as scientifically inaccurate. The
original studies included too few samples, and inadequate information was
available about the living conditions of the brain donors prior to autopsy.
Furthermore, Brody has since reported that some areas of the brain do not lose
nerve cells at all with aging, a finding similar to our own. Apparently, the loss of
cells varies from region to region. For example, the locus ceruleus in the
hindbrain and the nucleus of Meynert in the forebrain do lose nerve cells with
aging; whereas, several of the cranial nerve nuclei and a nucleus in the
hindbrain called the inferior olivary nucleus do not lose nerve cells throughout
the lifetime of the individual.
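
As a purely illustrative aside, Burns's figure can be reconstructed with back-of-the-envelope arithmetic. Assuming a total on the order of 10^10 neurons in the human cerebral cortex (an order-of-magnitude assumption, not a number given in the text), Brody's 30% loss spread over the 60 years between ages 20 and 80 works out to

\[
\frac{0.30 \times 10^{10}}{60 \times 365} \approx \frac{3 \times 10^{9}}{2.19 \times 10^{4}} \approx 1.4 \times 10^{5} \ \text{neurons per day},
\]

which is on the order of the "more than 100,000 neurons" per day quoted above.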
There is some evidence that the decrease in brain weight and the degree of
cortical atrophy in healthy old subjects who have no brain pathology is relatively
slight. The brains of such individuals are within normal weight ranges for young
adults and have cerebral hemispheres exhibiting no apparent cortical atrophy.
Evidence does indicate that the number of spines on cerebral cortical nerve
cells is reduced in old individuals. But spines can still be present in active
old nerve cells; at least, they are clearly present in animals two-thirds of the
way through a lifespan. In studying the brains of old human beings it is
important to be aware of the lifestyle prior to death, something scientists have
been taking more seriously in recent studies. With such considerations, some

medical texts now state that in many respects the healthy old brain is similar to
the healthy young brain. The experiential environment is a major factor in
maintaining the healthy old brain. A few of the myths about the deterioration of
functioning during aging are slowly being replaced as scientific knowledge is
beginning to offer some contrary evidence.
Such information stimulates us to adopt new attitudes toward aging and
encourages us to plan for an active life in old age. Of course, many bright,
energetic individuals have always done this; the knowledge of the potential of
the brain was not a necessary inducement. Many people have looked to their
grandparents who lived a long full life and concluded that they too could follow
a similar path. There is no doubt that one's genetic background is important, but
our studies suggest that the use of our nerve cells is critical to their continued
health. Interviews with some active elderly supported this view. For example,
the 89-year-old California wine taster still had his acute taste buds as well as a
keen olfactory sense for sniffing good wines. The perspective developed in this
book suggests that his continuous attention to his senses of taste and smell
enabled them to remain acute during aging. The university chemist, active at 98,
was still publishing and reading without his glasses. His alert 92-year-old wife
continued to read out loud to him. We all know older people like these whose
lives illustrate what we have learned about the potential of the cortical nerve
cells to respond to the information coming in from the environment.
But what about the millions of human beings who are discouraged and do not
continue to stimulate their brains? Many people attend school for a dozen or so
years and then find a job only to provide an income until retirement. Their
living pattern usually moves toward slowing down until they finally fade away.
The generally accepted knowledge about the brain is that it starts "going
downhill" fairly early in life (which is true) and that after that, there is little one
can do about changing this pattern (which is not true). As mentioned in
Chapter 2, recent studies on the developing human brain have shown that the
size of the cerebral cortex is already decreasing after the age of ten. In fact, the
patterns of an increasing and subsequently decreasing curve were very similar
for rats and humans during the early postnatal period. If we take advantage of
our more recent knowledge regarding plasticity of a lower mammalian cortex at
any age, then we can offer encouragement to counteract the downhill slope in
human beings. A different outlook emerges toward lifestyle, as a whole, and
toward learning, in particular.
Opportunities for learning should be encouraged from shortly after conception
and continued until death. The data from a Japanese laboratory and from ours
showed the beneficial effects of stimulating environments during intrauterine
life: improved maze behavior and increases in cortical structure in the animals
after birth. Though the western world is only recently becoming aware of such a
practice, for centuries Asian people have encouraged the pregnant mother to
enrich her developing fetus by having pleasant thoughts and avoiding angry,
disturbing behavior. At the same time, one is made aware of other factors
that aid the development of the fetus, such as a good diet and plenty of
exercise, conditions under which the dendritic trees of cortical nerve cells
develop richly. On the other hand, mothers need to be alerted to the negative
effects on fetal development of such substances as alcohol. Alcohol administered to
pregnant rats (5% alcohol in a protein-rich diet throughout gestation from day
2) has been reported to cause a decrease in the cell body size of cortical pyramidal
cells and in their number of dendrites in the brains of the offspring. Other
results have shown that the nerve cells adjacent to the ventricles in the brain are
also defective in rats exposed to alcohol before birth. Thus, the prenatal brain
has been shown to be sensitive to negative influences like alcohol and
malnutrition as well as to the positive influence of enrichment.
We still do not know whether an enriched condition during pregnancy can
prevent some of the massive nerve cell loss, as much as 50% to 65% of the total
population of cells, which occurs during the development of the fetus. It is
apparent that overproduction of neurons occurs in the fetus because most
neurons do not reproduce themselves after being formed: an excess number is
needed as a safety factor. Therefore, those that are not involved in the early
neuronal processing are "weeded out." At the present time it is believed that the
limits of cell number are set by the same genetic constitution. As mentioned in
Chapter 1, investigators found the same number of cells in a single column of
cortical cells, in rats, cats, dogs, monkeys and human beings. The genetic
regulation of these cells appears to transcend species. Understanding the causes
of this constancy in number is a complex process, for even fluctuations in body
temperature can influence brain cell number. The body temperature of the
pregnant female has a marked influence on the number of neurons that survive
in the fetus. If the temperature is increased in the female guinea pig by 3 to 4
degrees C for 1 hour in the latter part of gestation, the fetal brain weight is
reduced by 10 percent. This reduction in brain weight is due to a loss of brain
cells. Hyperthermia has not yet been established as a cause of human fetal brain
damage and mental retardation, but we should be alerted to this possibility
whether studying animals or man.
Though enriched experimental environments have not been shown to alter the
number of nerve cells, our results have indicated that variation in the
experimental environment can readily alter the size of the preexisting nerve
cells in the cerebral cortex, whether in the cell body or in its rich membrane
extensions, the dendrites, or in synapses. The importance of stimulation for the
well-being of the nerve cells has been demonstrated in many species. But of
equally weighty significance is the possible detrimental effect of too much
stimulation. The eternal question arises: when is enough enough, or too much
too much? The renowned pediatrician T. Berry Brazelton points out that infants
exposed to too much stimulation respond either by crying, by extending their
periods of sleep, or by developing colic or withdrawing from any new
approaches. In providing increased stimulation for the young, the adult, or the
old, one always has to keep in mind the need for adequate time at each phase of
information processing: input, assimilation, and output. The integration of the
input is essential before we can anticipate a meaningful output. As adults, we
frequently say, "Let me think things over." It is essential to give the infant the
same opportunity.
We have learned from our results that the nervous system possesses not just a
"morning" of plasticity, but an "afternoon" and an "evening" as well. It is
essential not to force a continuous stream of information into the developing
brain but to allow for periods of consolidation and assimilation in between. I
often tell the overworked student to go out and just be on the lawn and watch
the clouds drift slowly by. We do not yet know the true capacity or potential of
the brain. Our data at present suggest that nerve cells benefit from "moderate"
sources of stimulation, allowing for new connections to be formed, and thus
providing the substrate for more options. We have yet to try too much
stimulation. Will the stimulated brain continue to increase or will its reticular
formation sift out the excess stimulation?
To date we do not know whether there is a "ceiling" effect on brain growth
beyond which no further expansion will occur. In our rats in the preweaning
stage, one area (area 39) differed as much as 16% between the brains from the
enriched rats and those from the nonenriched. A Swiss group, using
superenriched conditions, including additional space and toys, was also able to
produce 16% differences in rats past the age of weaning. Does this mean that
16% cortical thickness differences represent the maximum change we can
induce with environmental enrichment? I hesitate to accept such a premise at
this time. Undoubtedly, with more imaginative experimental designs, utilizing
additional creativity, we will find greater responses in the future. Of course,
quantity of brain tissue is not our only goal. Quality is the ultimate objective. So
far it has been shown that the thicker cortex is positively correlated with a better
maze performance. Only further studies will provide information on the actual
potential of the cerebral cortex to alter its structure with increased stimulation.
I recently uncovered a small book published in 1901 by the Macmillan Company
called The Education of the Nervous System, by Reuben Halleck. In essence, its
message was that the best education we can provide the developing nervous
system is one of stimulating all the five known senses. Halleck wrote that a
person who has only one or even two senses properly trained is at best a pitiful
fraction of a human being. He points out that recalled images of sensations we
receive from the world around us are powerful and necessary aids in further
modifying and developing the sensory cells; not images of sight alone, but of
every sense. What does lilac smell like? How do tastes of cinnamon and nutmeg
differ? It is possible for us to conscientiously train our senses, all of them, at any
time in our lives. If we fine-tune the primary sensory areas early, the association
cortices might then respond to more subtle differences in a greater variety of
ways. Creative ideas could arise from a broader experiential base. The finding of
more widespread changes in the brains of enriched rats than in those of rats
trained to learn a specific task supports the claims of numerous educators, from
Dewey on, that providing a wide variety of experiences to the growing child
enhances intellectual development.
While all sensory input facilitates learning, the visual association cortex was the
first to be responsive to enrichment in our experiments. This may be related to
the fact that cortical association areas are the last areas to develop
embryologically and the most recent phylogenetically. Thus, it is reasonable for
the visual association area to show morphological changes in response to
stimulation in a learning circuit. As far back as 1901, Flechsig proposed that
learning took place by impulses first entering a primary sensory cortical area,
then going to the secondary or association cortex, and then into the limbic
system. For visual input in Flechsig's model, the primary sensory cortex would
be area 17 and the secondary or association cortex would be area 18, with the
pathway then continuing to the limbic system. Within this pathway we might anticipate area
18 to be the region most likely to show change. And we find that it does respond
in the shortest period of time to our experimental conditions. With a longer
duration of exposure to the environmental conditions, area 17, the primary
visual cortex, also demonstrates cellular changes. On the other hand, one part of
the limbic system we have measured, the male hippocampus, has not
demonstrated the same degree of plasticity as has the occipital cortex. However,
some investigators have shown small amounts of hippocampal increases with
enriched environments using female rats. Our findings offer support to our
hypothesis that neural activity within the visual cortex is important for the
initial information processing that facilitates learning. Our results indicate that
it is the posterior part of the cortex rather than the frontal cortex which
possesses the most plasticity. Future studies on the biochemistry of learning and
memory in the mammalian cortex might therefore be most appropriately
focused on this posterior region.
Though we have demonstrated the plasticity of the cerebral cortex, we are very
much aware that the brain does not work by itself. Healthy support systems, i.e.,
the cardiovascular, respiratory, urinary, and digestive systems, are essential to
the maintenance of the healthy brain. The heart and its accompanying blood
vessels need to be maintained through balanced diet and exercise. With
exercise, the connective tissue surrounding the skeletal muscles and blood
vessels can remain strong and aid with efficient circulation of the blood. The
lungs should be free of disease, such as emphysema, which can be caused by
smoking or breathing air contaminated with pollutants. The body needs to take
in adequate fluids to keep the kidneys working efficiently; these, in turn, keep
the blood free of concentrated waste products. The digestive system needs to
benefit from strong teeth that can break down food for efficient digestion, and
from a fibrous diet to maintain the well-being of the large intestine. All of this is
nothing new. It was Plato who said, back in 400 B.C., that a healthy body
promotes a healthy brain and a healthy brain, a healthy body.
Not only is the brain dependent upon other systems, but each part of the brain
interacts with another. The cortex, with its more refined intellectual functions,
attempts to coordinate with the limbic system, with its more emotional
functions. One without the other is only half an experience. In Nathaniel
Hawthorne's story Ethan Brand, the main character is searching for the
unpardonable sin. He concentrates to such an extent on his intellectual pursuit
that he becomes emotionally starved. He eventually becomes dismayed and
throws himself into his fiery kiln. When others discover the remains, all that is
left is his charred rib cage enclosing a cold marble heart. He had discovered the
unpardonable sin by neglecting to integrate the warm, emotional heart, in a
metaphorical sense, with his intellectual pursuit.
Satisfying emotional needs is essential at any age. As we learned from our
studies on aging rats, by giving our old rats a little tender loving care, we were
able to increase their life span; those rats that received additional attention lived
longer than those that did not. These results imply that the two regions of the
brain, the limbic system and the cortex, need to work together efficiently for the
well-being of the whole individual. Thus it is important to stimulate the portion
of the brain that initiates emotional expression, which encompasses the
connections between the cerebral cortex and the limbic system, including the
hippocampus, amygdala and hypothalamus. In our studies it was the cortex
which responded more readily to the environmental conditions and not those
parts of the limbic system which we measured, such as the hippocampus and
amygdala. The fact that these structures are less adaptable to a varied
environment implies that they are more basic to the survival of the individual,
suggesting that emotional well-being may be more essential for survival than
intellectual. Other kinds of stimulation besides mental challenges, e.g.,
considerable personal attention and other forms of emotional involvement, may
be essential to create changes in limbic structures. If this is so, how much more
effort should we be making toward giving attention and care to each other? And
how important it is for the intellectual components of the brain to be taught
ways to guide the emotional ones. Several of our measurements have indicated
that even the deprived brain can adapt by changing in structure as a result of
enriched living conditions. Such changes were discovered in both prenatally and
postnatally deprived animals. If mother rats were protein-deprived during
pregnancy and lactation and their newborn pups were given both protein-rich
diets and enriched living conditions after birth, the pups' brains grew more than
those of standard colony rats that were only nutritionally rehabilitated. In the
experiment dealing with postnatal deprivation, the cerebral cortices were
lesioned or damaged during infancy, and then the animals were placed in
enriched living conditions. Upon measuring the length of these animals' cortices
after enrichment, the investigators found that the length had increased to
compensate for the previous damage. Thus, we must not give up on people who
begin life under unfavorable conditions. Environmental enrichment has the
potential to enhance their brain development too, depending upon the degree of
severity of the insult.
In this limitless field of possibilities for future study, a few specific research
avenues beckon most immediately. We wish to gain a better understanding
about what elements are responsible for the growth of the cerebral cortex. In
Chapter 1 the normal cortical growth pattern was presented, but we do not know
what causes the rapidly growing cortex to reverse its direction shortly after
birth. At present we are studying the role of opioid blockers, substances which
block the endogenous or natural opiates (opioids) in the brain, because some
investigators have demonstrated that various regions of the brain increase in
size and cellular content when opioid receptors are blocked. Our initial results
support these findings. In addition we wish to learn more about nerve growth
factor and its role in cortical development. For years, nerve growth factor was
thought to be confined to specific regions such as the sympathetic ganglia, but
more recently it has been found in the hippocampus and cerebral cortex. We are
particularly interested in its role and when it is active in the cerebral cortex. In
light of the large number of people who are taking drugs for therapeutic reasons
for long periods of time, it is important to learn more about how the cerebral
cortex responds under such medication. Thus, one specific question I would like
to pursue now is whether the enriched condition favorably alters the brains of
animals on antiepileptic drugs. For example, it is possible to obtain a type of
rodent, a gerbil, which has spontaneous seizures. If this animal is given a
seizure-reducing drug, will the enriched environment still increase the cortical
dimension?

In addition, we wish to pursue a study to determine what agents play a role in
creating the enlarged cortex in the offspring, the F1 and F2 generations, from
the enriched parents. The high level of progesterone during pregnancy was
suggested as one possible responsible factor. This suggestion has to be tested as
well as others.
The ultimate goal of all of our studies has been to gain a better understanding of
human behavior by examining its source, the brain. The simple enriched
environmental paradigm allowing rats to interact with toys in their cages
produced anatomical changes in the cerebral cortex. Now how do we apply this
knowledge for the benefit of people? Since no two human brains are exactly
alike, no enriched environment will completely satisfy any two individuals for
an extended period of time. The range of enriched environments for human
beings is endless. For some, interacting with objects is gratifying; for others,
obtaining information is rewarding; and for still others, working with creative
ideas is most enjoyable. But no matter what form enrichment takes, it is the
challenge to the nerve cells which is important. In one experiment where the
rats could watch other rats "play" with their toys but could not play themselves,
the brains of the observer rats did not show measurable changes. These data
indicate that passive observation is not enough; one must interact with the
environment. One way to be certain of continued enrichment is to maintain
curiosity throughout a lifetime. Always asking questions of yourself or others
and in turn seeking out the answers provides continual challenge to nerve cells.
Finally, now that we have begun to appreciate the plasticity of our cerebral
cortex, the seat of the intellectual functioning that distinguishes us as human
beings, we must learn to use this knowledge. It must stimulate and guide our
efforts to work toward enriching heredity through enriching the environment ...
for everyone ... at any age.

Marian Diamond
Dr. Diamond is professor of Anatomy at the University of California, Berkeley, and is a former
Director of the Lawrence Hall of Science. She has also taught at Harvard, Cornell, and at
universities in China, Australia, and Africa. She received the Outstanding Teaching Award and
Distinguished Teacher's Award from the University of California, and is a member of the
American Association of University Women Hall of Fame. In 1989-90, she received the CASE
Award, California Professor of the Year, and National Gold Medalist, and she was made a
member of the San Francisco Chronicle Hall of Fame.

What are the Determinants of Children's Academic Successes and Difficulties?
by Marian Cleeves Diamond, Ph.D

Environmental factors can influence both the pre- and postnatal brain. Most of
us are aware that in the Eastern world the concept of "intrauterine education"
goes back in Chinese literature over 2000 years.
I could find one reference published in the Ming Dynasty in 1237 when women
were advised to behave favorably during pregnancy so that their offspring would
be bright and survive well. Good behavior included: "Sit and walk dignified and
sedately; maintain a good temper and a mind at ease; do not look at evil
happenings and ugly pictures," etc. In the Western world, Omraam Aïvanhov
stated in 1938 that the State should concentrate all of its attention on its
pregnant mothers. In our laboratory we have been able to demonstrate the
effects of enriched environments on the prenatal rat brain. So when one
addresses determinants of academic successes and difficulties in today's world,
one really should take into consideration pre- as well as postnatal conditions for
forming healthy children and their precious brains.
How can parents and teachers provide conditions that will most effectively
promote growth and change in our children's nerve cells with their branching
dendrites? Dendrites are like the trees of the mind, growing like poplars in the
sun. How can parents help a child to develop his or her full potential and set a
pathway of lifelong learning? Parents and teachers should create a climate for
enchanted minds to obtain information, stimulate imagination, develop an
atmosphere to enhance motivation and creativity and discover the value of a
work ethic.
Innately, children are very adaptable and feel natural in any comfortable pattern
a parent may set. In spite of what has been said, parents can play a very active
role in developing a child's behavior, for children spend far more time outside of
school than in, reportedly 80% more.
Our recipe for an enriched environment to determine academic success:

1. Includes setting the stage for enriching the cortex by first providing a
steady source of positive emotional support - love, encouragement,
warmth and caring. Our old rats live longer with tender loving care.
2. Provides a nutritious diet with enough protein, vitamins, minerals and
calories. We have shown that with a low protein diet during development
the branches on the nerve cells in the cortex do not flourish sufficiently to
respond to an enriched condition.
3. Stimulates all the senses, but not necessarily all at once. Multisensory
enrichment develops all of the cortex, whereas input from a single
task stimulates the growth of only a precise area of the brain. For
example, growing up responsibly on a farm, surrounded by clean fresh air
with all of its multisensory input, supplies a wealth of varied stimuli to
develop the cortex.
4. Has an atmosphere free of undue pressure and stress but suffused with a
degree of pleasurable intensity
5. Presents a series of novel challenges that are neither too easy nor too
difficult for the child at his or her stage of development
6. Allows for social interaction for a significant percentage of activities;
there is no doubt peers are intrigued with and enjoy each other.
7. Promotes the development of a broad range of skills and interests that
are mental, physical, aesthetic, social and emotional
8. Gives the child an opportunity to choose many of his or her own
activities. Allow each unique brain to choose.
9. Gives the child a chance to assess the results of his or her efforts and to
modify them. As he builds sand castles on the beach, he admires his
construction before a wave destroys it, and then he needs to learn to start
over and resculpt.
10. Provides an enjoyable atmosphere that promotes exploration and the fun of
learning; rats living in enriched environments are more exploratory than
those living in impoverishment.
11. Above all, enriched environments allow the child to be an active
participant rather than a passive observer; a healthy body will have the
energy to become involved.
A nonenriched, impoverished environment, which can cause difficulties and a
lack of success, will tend to be opposite in most of these ways, including:

1. A vacillating or negative emotional climate
2. A diet low in protein, vitamins, and minerals, and too high or too low in
calories
3. Sensory deprivation
4. High levels of stress and pressure
5. Unchanging conditions lacking in novelty
6. Long periods of isolation from caregivers and/or peers
7. A heavy, dull atmosphere lacking in fun or in a sense of exploration and
the joy of learning
8. A passive, rather than active involvement in some or all activities
9. Little personal choice of activities
10. Little chance to evaluate results or effects and change to different
activities
11. Development in a narrow, not panoramic, range of interests
We should also not underestimate the importance of supporting creativity with
imaginative toys, fantasy friends, a rich language environment, music and art,
and a mentor who cares and listens and will remain in the child's mind when
support is later needed.

Let me take one example: when it comes to providing toys and activities for
young children, there is, in general, an inverse relationship between the
specificity and elaborateness of a toy and its ability to excite the imagination. A
cast-off cardboard box can become a doll house, a puppet stage, a school or an
alien planet. Tools for exploring, a magnifying glass, an old tape recorder, a
map, can open doors in the mind, as can the artifacts of make believe. The more
pressure the parent puts on a child to produce something specific, the harder it
is for the child to express creativity and imagination.
All of these have their role to play in brain development. BUT we have also
learned that too much stimulation is detrimental. The cerebral cortex does not
show significant growth with too much stimulation as it does with a moderate
amount. The brain needs time to transfer information into its association cortex.
Allow the child time to think about what is happening and what is coming next.
Allow for ample free time too. Creative efforts need time to utilize what has been
stored in the brain. DO NOT OVER STIMULATE.
In conclusion, I would like to cite an example from one of my own four children.
I recently had the courage to ask them the positives and negatives of having a
working mother. Ann, who majored in geology at Harvard and then, after
graduating, played soccer for half a year before entering medical school at
UCSF, provided the following positives: you gave us the freedom to choose what
we wanted to do; you served as a role model showing if we worked hard we
could succeed; and we are proud of you. What parent/educator could ask for
more?

About the Author


Marian C. Diamond, Ph.D. is a neuroanatomist in the Department of Integrative Biology at the
University of California, Berkeley. She is co-author of Magic Trees of the Mind. She is also a
member of New Horizons for Learning's International Advisory Board, advising us on the
development of the News from the Neurosciences area of the website.
What are the determinants of children's academic successes and difficulties? is excerpted from
a presentation by Dr. Diamond at an invitational conference Getting it Right About Children's
Development: The influences of nature and nurture, sponsored by the Harvard Children's
Initiative, Friday, February 5, 1999.

Male and Female Brains


Summary of Lecture for Women's Forum West Annual Meeting,
San Francisco, California 2003

by Marian Diamond

Sex differences and the brain. What does it matter, you say? I think it does.
Through such knowledge we will eventually be better able to understand the
basis for behaviors that many now perceive as entirely rooted in social custom
or familial history. From that understanding, we will gain the acceptance,
patience, and respect so vital to all human endeavor.
Interestingly, people who see a human brain for the first time often ask, "Is it
male or female?" Yet, for many millennia no one, not even scientists, thought about
sex-related differences and similarities in the human brain. A brain was just a
brain. Now hardly a year goes by that we don't read authoritative studies
showing these differences. I was taken aback just a few months ago when, at a
Ph.D. examination dealing with Magnetic Resonance Imaging of human brains,
the student reported having pooled the data from both sexes. Even if the intent
was not to explore male-female differences, one can hardly expect to make
accurate interpretations from such mixed data.
Obviously, no single factor accounts for the gender-related differences we are
finding. We are slowly, one by one, unraveling the various integrative factors
involved in this mystery. A basic question being asked is whether the differences
between male and female brains outweigh the similarities or vice versa. Some
researchers report finding more differences within the sexes than between the
sexes. Please understand that the objective of my talk is not to discuss whether
the brain of one sex is superior to the brain of the other but to explore the
significance of the differences we are discovering in the brains of males and
females. As you might imagine, to conduct these studies, we need brain samples
so that we can make our comparisons. So far, no live human beings, males or
females, have been willing to give us their brain tissue to use in our
experiments. But all is not lost: The rat, oddly enough, has the same basic
components and major structures in its little pecan-size brain that we humans
have in our large cantaloupe-size brain. In general terms, what we have learned
about the anatomy of the rat brain has later been replicated by studies in higher
mammals including humans. What is particularly important, of course, is that
using the laboratory rat allows us to control many variables--the sex, the age,
the living conditions, the diet, the water intake, the environment, and so forth,
thus assuring clear comparisons.
To appreciate the work we do, let me take a moment to give you some
fundamentals of the brain's anatomy. In the embryo our nervous system starts
as a simple tube, the head end forming the brain and the remainder forming the
spinal cord. The brain is divided into three parts: the hind brain, midbrain and
forebrain. Our interest is primarily in the forebrain, which expands
tremendously over the course of its development to form about 85% of our total
brain, called the cerebral hemispheres. These two large hemispheres are
familiar to anyone who has seen a picture of the brain. The outer layers of the
cerebral hemispheres are called the cerebral cortex. (Cortex means bark.) With
the use of a light microscope we can easily measure the thickness of this cortex
in the rat because it is smooth and does not have folds as do more highly
evolved brains.
Factors affecting cortical thickness are the main interest in our gender studies
because the cerebral cortex is the most highly evolved part of the brain and
deals with higher cognitive processing. The cerebral cortex, like other parts of
the brain, consists of nerve cells with branches and functional connections
called synapses; glial cells, the metabolic and structural support cells for the
nerve cells; and blood vessels. Cortical thickness is a key factor; it gives us an
overall indication of what is happening collectively to these structures within the
cortex.
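
The S and NS entries in Table 1 below ultimately rest on a simple comparison of
paired right- and left-hemisphere thickness measurements from the same animals.
As a rough, purely illustrative sketch (Diamond's exact statistical procedure is
not described in this summary), a paired t-test on invented thickness values
might look like this:

    # Minimal sketch: labeling one cortical area in one age group "S" or "NS".
    # The thickness values are invented placeholders, not data from this study,
    # and the paired t-test is only an assumed stand-in for the actual procedure.
    from scipy import stats

    # Right- and left-hemisphere cortical thickness (mm) for the same rats.
    right = [1.92, 1.88, 1.95, 1.90, 1.97, 1.93, 1.89, 1.94]
    left  = [1.85, 1.83, 1.90, 1.86, 1.91, 1.88, 1.84, 1.89]

    t_stat, p_value = stats.ttest_rel(right, left)   # paired: same animal, two hemispheres
    label = "S" if p_value < 0.05 else "NS"          # 0.05 threshold assumed here

    print(f"t = {t_stat:.2f}, p = {p_value:.4f} -> {label}",
          "(R>L)" if t_stat > 0 else "(L>R)")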
Table 1
Statistical significance of differences between right and left cerebral cortical
thickness in male and female rats (S = statistically significant; NS = not
statistically significant; S# = significant with left thicker than right)

                               Frontal  Parietal        Occipital
            Age (days)    N      10      4    3    2    18   17   18A
Males            6       13      S       S    S    NS   S    S    S
                14       17      S       S    S    NS   S    S    S
                20       15      S       S    S    NS   S    S    NS
                90       15      NS      S    S    NS   NS   S    NS
               185       15      S       NS   S    S    S    S    S
               400       15      S       NS   S    S    S    S    NS
               900        8      NS      NS   NS   NS   NS   NS   NS
Females          7       15      NS      NS   NS   NS   NS   NS   NS
                14       15      NS      NS   NS   NS   NS   NS   NS
                21       15      NS      NS   S#   NS   NS   NS   NS
                90       19      NS      NS   NS   NS   NS   NS   NS
               180       11      NS      NS   NS   NS   NS   NS   NS
               390       17      NS      NS   S#   NS   NS   NS   NS

All significant (S) differences indicate right thicker than left (R>L); S#
indicates left thicker than right (L>R).

The cortical areas sampled are seen in Figure 1. As the name implies, the frontal
area is in the front of the brain, the parietal is in the middle and the occipital is
at the back of the hemisphere. In very simple terms, the frontal area deals with
motor behavior and planning for action, the parietal area with general sensory
functions, and the occipital cortex with visual functions.
By measuring the thickness in the frontal, parietal and occipital cortex in our
experimental rats, we can begin to assemble important information and to ask
such questions as:
1. Are there sex-related differences in the growth of the cerebral cortex at birth?
In the female, some cortical areas are more highly developed than others at
birth, in contrast to the male. Her motor cortex (frontal) shows the highest
development, with her sensory cortex (parietal) next and visual cortex (occipital)
least developed. In the male brain, the motor, sensory and visual cortical areas all
show a similar degree of development at birth; differences in growth rates
appear soon after birth.
2. Is the thickness of the right and left cerebral cortex different between male
and female animals? The answer is decidedly "yes" as revealed by a glance at
Table 1. "N" shows the number of brains sampled in each age group from shortly
after birth to well into adulthood and for males into very old age. "S" means
there is a statistically significant difference between the cortical thickness in the
right and left hemispheres. "NS" means there is no statistically significant
difference between the thickness in the hemispheres.
In the female brain, we observe no statistically significant differences in cortical
thickness between the right and left hemispheres from birth well into
adulthood. We found nonsignificant differences in 41 of the 43 regions
measured; in other words, in 95% of the cases she displays a symmetrical cortex.
It has been commonly stated that the female cortex is symmetrical and the male
cortex is asymmetrical. Turning again to Table 1, this time to assess the
development of the male cortex, we find that the hemispheric thickness
differences from birth to old age are definitely not as consistent as in the female
brain. In fact, in the male cortex the right hemisphere is significantly thicker
than the left in 31 of the 49 regions measured. In other words, in 60% of the
cases, the cortex of the male rat brain is significantly asymmetric.
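
These percentages can be recovered by simply tallying Table 1's entries as
reported in the text (41 of 43 female comparisons nonsignificant, 31 of 49 male
comparisons significant). A trivial bookkeeping sketch:

    # Tallies taken from the counts reported in the text for Table 1.
    female_total, female_ns = 43, 41
    male_total, male_sig = 49, 31

    print(f"Female cortex symmetric in {female_ns / female_total:.0%} of comparisons")  # ~95%
    print(f"Male cortex asymmetric in {male_sig / male_total:.0%} of comparisons")      # ~63%, rounded to about 60% in the text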
With the data in Table 1, we now need to state more accurately that parts of the
male cortex are asymmetrical and parts are not. Two consistent findings in the
male rat brain were the following: (1) area 2 in the parietal cortex showed
nonsignificant findings or symmetry from birth to 90 days of age; differences in
cortical thickness were seen only after 90 days of age. (2) In the 900-day-old
male rats, all areas of the cortex showed nonsignificant differences between the
hemispheres. At this very old age, the male cortex appeared similar to that of the
female cortex in terms of its symmetry.
One obvious question to ask when we assess our findings in the female brain is:
What role do the sex steroid hormones play in establishing cortical thickness? As we
would expect, in those animals with ovaries, there is no significant difference in
the thickness of the hemispheres, but in those without ovaries, two areas of the
occipital cortex show significant differences in thickness between the right and
left hemispheres. It seems that the visual cortex in female animals without
ovarian hormones is more like that of the normal male cortex. (Though not
shown, in our 800-day-old females, we also found this pattern was similar in the
occipital cortex.)
In summary, the female animal, with or without ovaries, shows no significant
difference in the thickness of her right and left cerebral cortices except in part of
the visual cortex where those without ovaries develop right dominance. Other
researchers have reported that two major connecting fiber tracts between the
two hemispheres are larger in females than in males, a finding that supports the
notion that the female exhibits symmetrical cortical patterns. What might be the
advantage of such symmetry in cortical morphology?
For the female animal, the main functions in life are to bear, protect and raise
her offspring. These roles challenge her to go in many directions, both
geographical and conceptual, something that may be more accessible and
readily achieved with a symmetrical brain. We might conjecture that the trend
to right dominance in the older brain of the female without ovarian hormones
suggests a shift to the more visual focus demanded of the male.
Now we need to ask the same question we asked of the female brain: What role
do sex steroid hormones play in determining cortical thickness patterns in the
male? In rats without testes some cerebral cortical areas show significant
differences and some do not. Of interest to me is that areas 17 and 18a dealing
with visual processing in both males and females devoid of sex steroid
hormones showed statistically significant differences between the right and left
hemispheres. Area 17 in the male also showed statistically significant differences
in the cortical thickness of the right and left hemispheres among animals with
circulating sex hormones, but area 18a did not.
In summary, the male cerebral cortex displays both symmetrical and
nonsymmetrical right/left patterns in cortical thickness, with the
nonsymmetrical pattern being slightly more frequent anatomically (60%) and,
presumably, more frequent functionally as well. What might be the advantage of
having some cortical areas asymmetrical in the male? In general, male behavior
involves finding and defending his territory and finding his female, all rather
focused functions, possibly benefiting from an asymmetrical cortex.
Another consideration is the similarity between male and female; in these
studies, the question is in what areas do males and females have the same
right/left pattern, whether symmetrical or nonsymmetrical? In area 10 (motor
behavior and planning for action) at 90 days of age both are symmetrical; in
area 2 (general sensory functions) from birth to 90 days of age both are
symmetrical; in area 18A (visual functions) from 20-21 to 90 days of age
both are symmetrical. In area 3 (general sensory functions) at 20-21 days of
age both are asymmetrical, and at 400-390 days of age both are asymmetrical.
Needless to say, these data further emphasize the necessity of considering the
numerous variables that contribute to anatomical and in turn physiological
development generally and specifically to the growth of the cerebral cortex.
Furthermore, wresting meaning from the multiplicity of similarities and
differences between male and female brains presents a considerable challenge
in the decades ahead, but a challenge that those of us who dedicate our
professional lives to such research anticipate with relish.

About the author


Dr. Diamond is professor of Anatomy/Neuroanatomy at the University of California, Berkeley,
and is a former Director of the Lawrence Hall of Science. She did research at Harvard, and
taught at Cornell and the University of California at San Francisco and at Los Angeles, and at
universities in China, Australia, and Africa.
She received the Outstanding Teaching Award and Distinguished Teacher's Award from the
University of California, and is a member of the American Association of University Women
Hall of Fame. In 1989-90, she received the CASE Award, California Professor of the Year, and
National Gold Medallist, and she was made a member of the San Francisco Chronicle Hall of
Fame. She was the fourth woman to become the Alumnus of the Year at the University of
California at Berkeley. She is a Fellow of the American Association for the Advancement of
Science and a member of the California Academy of Science.
Marian C. Diamond, Ph.D.
Professor
3060 VLSB
University of California
Berkeley, CA 94720
Phone: 510-642-4547
FAX: 510-643-6264
e-mail: diamond@socrates.berkeley.edu

Manufacturing Knowledge
by Donalee Markus

"I am summoned to see the headmistress at morning break on Monday," said Miss Brodie. "I
have no doubt Miss Mackay wishes to question my methods of instruction. It has happened
before. It will happen again. Meanwhile, I follow my principles of education and give my best in
my prime. The word 'education' comes from the root e from ex, out, and duco, I lead. It means a
leading out. To me education is a leading out of what is already there in the pupil's soul. To Miss
Mackay it is a putting in of something that is not there, and that is not what I call education, I
call it intrusion, from the Latin root prefix in meaning in and the stem trudo, I thrust. Miss
Mackay's method is to thrust a lot of information into a pupil's head; mine is a leading out of
knowledge, and that is true education as is proved by the root meaning. Now Miss Mackay has
accused me of putting ideas into my girls' heads, but in fact that is her practice and mine is quite
the opposite. Never let it be said that I put ideas into your heads."

In the passage quoted above from The Prime of Miss Jean Brodie, novelist
Muriel Spark entertainingly and succinctly exemplifies the question educators,
philosophers, and scientists have struggled with since the Golden Age of Greece:
What does it mean to educate? In other words, how do we learn what we know?
This was once dangerous ground to tread on. In 399 BC when Socrates was
asked why he was called "the wisest of all men," he said it was because he knew
that he knew nothing. His persistence in questioning the citizens of Athens to
learn how they knew what they knew ultimately cost him his life. For eons each
generation has asked the question. Yet it has never been completely answered.
But thanks to modern technology, we may be getting closer.
So How Do We Know?
In the early part of the 20th Century, human brains were thought to be storage
facilities. Each brain cell would serve as a kind of bin where facts could be neatly
deposited. The size of the brain was equated with the amount of storage space
available for knowledge. It was known that injury, illness, or age could reduce
the number of cells a person had, but it was thought that nothing could increase
them. The assumption was that brain size had a direct correlation with one's
ability to learn. From that assumption came two arguments:
1. People are born with a given intelligence that sets limits on what they can
learn
2. Since men generally have larger brains than women, opportunities for
acquiring greater knowledge would be wasted on most women
The rationale managed to keep many young women out of college and out of the
professions for many decades.
When biologists started measuring non-human brains in an effort to categorize
levels of intelligence among birds and beasts, they discovered that some
creatures had brains that were noticeably larger than the largest human brain
on record. A new way of measuring had to be devised. One attempted to
establish a ratio between brain size and body size. On average, primates have
brains that are 2.3 times larger than other mammals of the same body weight.
Although humans and chimpanzees have similar weights, our brains are 3 times
larger than theirs. Among non-primates, porpoises and dolphins have brain-body ratios equal to or greater than ours. Animal rights activists sometimes use
these numbers to argue the superiority of dolphin intelligence.
The idea that smart people must have more brain cells (a.k.a. "gray matter")
persisted until Albert Einstein's brain was made available for study by a select
group of neuroscientists. More than 25 years after Einstein's death in 1955,
microscopic examination revealed that his brain wasn't composed of more gray
matter but more "white matter." White matter refers to myelin, a fatty
substance that forms around the thread-like projections that connect neurons to
each other. Myelin serves as an insulation that allows electrical impulses to flow
faster between brain cells. Einstein's neurons had more and stronger
connections with each other. There were also more glial cells, which support and
facilitate neural connections, in specific areas of his brain. This revelation led to
a new understanding of how brains function.
Advances in technology have allowed brains to be explored on two fronts:

Regional -- via PET scan, MRI, EEG, etc.
Cellular -- via the electron microscope

Scanning mechanisms enable on-line, real-time observation of brains in action --
in other words, responding to specific stimuli. Although the technology cannot
pinpoint individual neurons, it does demonstrate the complex interaction of
different areas within the brain -- such as the amygdala, hippocampus, parietal
lobes, temporal lobes, etc. Moreover, it shows that some of these areas can shift
and change over time and with experience. The electron microscope reveals
where these changes start.
One out of every ten brain cells is a neuron. Neurons are the only cells in our bodies that
are known to communicate directly with one another. This isn't because the
other cells are antisocial. They just lack the means to do so. Neurons are
structurally different. In addition to the cell body, they have special appendages
(nerve fibers) that literally reach out to touch other neurons -- sometimes close
neighbors, sometimes cells several feet away -- which is more impressive when
you learn that 70,000 brain cells can fit on the head of a pin. Nerve fibers come
in two varieties: axons and dendrites. Axons provide output -- i.e., they transmit.
Although they are usually in contact with dendrites, they can also connect with
cell bodies and other axons. Dendrites receive input, usually from axons, but
sometimes they "talk" to other dendrites.
Generally, neurons have one axon, but an axon can have multiple branches
enabling it to transmit to many cells at once. A neuron will usually have many
dendrites, and the dendrites have many spines (little knob-like projections) that
axons can connect to. All of this was unknown before the development of the
electron microscope which enables scientists to peer into the vast universe of
the very small.
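
As a purely illustrative restatement of the wiring just described (one axon that
can branch to many targets; many dendrites bearing spines that receive contacts),
here is a toy data-structure sketch; the class and field names are invented for
this example, not standard neuroscience software:

    # Toy model of the anatomy described above: one (possibly branched) output
    # axon per neuron, many input dendrites studded with spines.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class Dendrite:
        spines: int = 0                      # knob-like projections axons connect to

    @dataclass
    class Neuron:
        name: str
        axon_targets: List[str] = field(default_factory=list)    # one axon, many branches
        dendrites: List[Dendrite] = field(default_factory=list)  # many input fibers

        def receive_contact(self) -> None:
            # An incoming axon branch contacts a spine on one of this cell's dendrites.
            if not self.dendrites:
                self.dendrites.append(Dendrite())
            self.dendrites[0].spines += 1

        def connect_to(self, other: "Neuron") -> None:
            # Transmit side: this neuron's axon sends a branch to the other cell.
            self.axon_targets.append(other.name)
            other.receive_contact()

    a, b, c = Neuron("A"), Neuron("B"), Neuron("C")
    a.connect_to(b)   # a single axon can branch...
    a.connect_to(c)   # ...and so transmit to many cells at once
    print(a.axon_targets, b.dendrites)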

You Must Remember This


At any given moment billions of messages are being exchanged via electrochemical activity within our brains. Most are involved with maintenance or
survival -- such as regulating heartbeat, monitoring temperature, sensing
movement, etc. Unless there is a sudden or extreme change, we remain
blissfully unconscious of these goings-on. Just because we're unconscious of
them doesn't mean we have no memory of them.
There is increasing evidence to suggest that memories aren't stored in individual
neurons, but in the way the neurons are connected to each other. Some neurons
are part of vast networks or systems; others only have small, regional
connections. Some connections are well insulated and will last a lifetime; others
are more fragile than soap bubbles. They exist for a fraction of a second before
dissipating. Sometimes breaks occur within a system or between systems, due to
injury or illness. Disconnections result in the loss of functions -- such as a stroke
that renders an arm useless or a tumor that disrupts the ability to link faces with
names. Victims of such tragedies often suffer the frustration of knowing what to
do but not being able to remember how to do it.
In theory, memories are reconstructions of the past. In fact, memories are
thoroughfares by which we travel through time. The memory of our species is
embedded in our DNA and passed along via egg and sperm to the future.
Personal memories are woven throughout our brains, altered and adjusted with
each new experience, making our brains as individual as our fingerprints.
Different types of memories form in different parts of the brain. The amygdala,
an almond-shaped structure deep inside the temporal lobe, is fully formed and
functioning at birth. Fears take root here that may later germinate as phobias
and flashbacks, the unspeakable memories of things to avoid.
Nestled up against the amygdala, looking more like a paw than the seahorse for
which it was named, the hippocampus lays down new memories, establishes
personal history, and creates spatial awareness. The hippocampus is connected
to almost every part of the neocortex, the thin outer layer of the brain where
self-awareness, language, and abstract thought take form. An event does not
become a long-term memory until it's bounced back and forth between
hippocampus and cortex for two or three years. Much of the activity takes place
during sleep and in dreams. After that, the frontal and temporal cortices can
recall information without the aid of external stimuli.
Non-personal memories such as the capital of Vermont, the multiplication
tables, or the location of the spare car key also begin in the hippocampus and
end in the cortex. How quickly we can retrieve the information is dependent on
how often we need to remember it and its emotional value. This is why
cramming all night for a final can net a decent score on a test although the
hastily acquired knowledge is forgotten soon afterwards. It is also why
immersion into a culture is the quickest way to learn a second language. By
involving all the senses -- sight, sound, taste, smell, touch, and movement -- vast
networks of related information can be constructed in a short time. The process
of becoming an expert requires a similar immersion.

Development of expertise in any subject requires time and opportunity. But
exposure to information, no matter how intense, is not enough. For information
to be of use, it has to be linked to an objective and a procedure for achieving that
objective. Thoughts turn into actions when nerve impulses activate muscles in
an organized manner. Without synchronization, muscle movement is reduced to
tics and twitches. We would not be able to walk, talk, chew, write a check, kiss a
loved one, smile, scream, juggle, cook -- in short, do anything but vegetate.
Movement is initially organized in the cerebellum, the "little brain" in the back
of the skull. The evolution of the cerebellum is linked to the need to stabilize
vision while the body is in motion, adjusting the rate and degree of eye
movement to that of head movement. Otherwise, the world would look as though
it were filmed with a hand-held video camera.
Coordination takes practice. Watch a baby learning to roll over, learning to
crawl, learning to walk, learning to talk. It all takes a lot of practice. And then
suddenly, it happens and it seems as if one has always known how to do these
things. At this point control shifts from the cerebellum to the putamen, a part of
the basal ganglia that sits atop the brain stem. Activities learned through
repetition become automatic here. We don't have to think about the process
anymore. Muscle memory takes over. Actions flow smoothly, one after another.
Choices are made swiftly and executed masterfully. Some activities, like walking
in one's sleep, can even be performed unconsciously.
How Knowledge Is Made
Most of the memory discussed thus far can be categorized as "long-term." It is
part and parcel of who we think we are and what we habitually do. In short, we
owe our self-knowledge to long-term memory. Certainly some of it is genetic,
lending a distinctly human quality to our lives. Animals such as chimpanzees
and gorillas, with whom we share a great percentage of DNA, exhibit more
"human-like" behaviors than do dogs, parakeets, or houseflies despite the fact
that we have more experience with pets and pests than we do with apes. But we
are also molded by experience. Physically, we may very well be a reflection of
what we eat. Mentally, we become what we think.
What we think is a turbulent, ever-changing blend of past, present, and future
prepared in a cauldron called "working memory." More than just a short-term
or temporary storage facility, working memory is the means by which past
experiences are combined with present events to produce future effects. It is
where knowledge is processed through planning, problem solving, and decision-making. We employ our working memories when we compare prices at the
supermarket, carry on a conversation, solve crossword puzzles, balance check
books, find our way home from a party, compose music, write novels, design
nuclear reactors, etc.
All mammals have frontal lobes whose primary task is to control movement. In
humans the frontal lobes occupy one-third of the brain. From there, in the
prefrontal cortex, the working memory works its miracles. The prefrontal cortex
is a convergence zone where electro-chemical messages sent from systems
throughout the brain meet, greet, and are sent on their way back into the brain
to change it. Here emotions, sensations, conditioned responses, and motivations
blend. Some of these blends will be weak and erratic and quickly forgotten;
others will be strong enough or frequent enough to forge new connections
between neurons. In this way knowledge is built upon what we already know.
This is why it is easiest to learn something new when we can associate it with
something familiar.
Sometimes new experiences can challenge or disrupt old patterns of behavior.
Then we feel confused. Uncertainty can generate hyper-awareness and large
muscle paralysis. The discomfort is a sign that our working memory is seeking
more information from various systems and inhibiting impulsive action until an
appropriate response is determined. If we can tolerate the discomfort, we will be
rewarded with a new insight and an awareness that we've been forever changed.
In other words, we'll have been educated.
Education is the result of a leading out of what is already there (i.e. long-term
memories) and a thrusting in of new experiences, ideas, sensations, and/or
feelings. Education enriches our lives by enriching our brains with neural
connections. We can know and know that we know because of a wonderful
system called working memory. Like all brain systems, working memory
improves most with intentional use -- for instance, by playing games, solving
puzzles, and socializing. In the process we learn both how to absorb knowledge
and how to create it.

About the author

Donalee Markus earned her Ph.D. from Northwestern University in Administrative and
Management Sciences in 1982. She received her Bachelor's and Master's Degrees at National-Louis University. She did post-doctoral work from 1983 through 1993 with renowned Israeli
psychologist Reuven Feuerstein. Donalee's clinical and corporate experiences encouraged her to
create Designs for Strong Minds, a cognitive restructuring program designed for high-functioning adults inspired by the work of Professor Feuerstein. The theoretical framework for
DSM is based on how the brain learns, the neuromuscular system.
DSM has been recognized by NASA as "the only graphically-based problem-solving program
that uses context-free game-like exercises that create the disposition for critical thinking and
creativity." Her corporate clients have included Los Alamos National Laboratory, McDonalds,
Ameritech, Coopers & Lybrand, Britannica, Quaker Oats, United States Federal Court System
and NASA. She has most recently completed a pilot study and several critical thinking training
programs with top NASA scientists, with whom she continues to work. Her book, Retrain Your
Business Brain: Outsmart the Corporate Competition published by Dearborn Press is due on
bookshelves September 2003.
To learn more about Donalee Markus and Designs for Strong Minds and to play over 40
interactive sample games, visit her website: www.DesignsForStrongMinds.com.
See also Optimizing Memory in the Adult Brain for Effectiveness in a Multitasking Society by
Donalee Markus on this site.
You can contact Donalee by email via info@designsforstrongminds.com.

The Treasure at the Bottom of the Brain


by Henrietta C. Leiner and Alan L. Leiner

One of the most impressive parts of the human brain, named the cerebellum,
has been underestimated for centuries. Located at the lower back of the brain, it
is a fist-sized structure whose function is now being reappraised. Formerly this
structure was thought to have only a motor function, which it performed by
helping other motor regions of the brain to do their work effectively. But during
the past decade a broader view of its function has emerged as a result of new
research, and now the cerebellum is regarded as a structure that can help not
only motor but also nonmotor regions to do their work effectively. In fact, the
cerebellum has been compared to a powerful computer, capable of making
contributions both to the motor dexterity and to the mental dexterity of
humans, both of which are required for the emergence of fluent human
language.
This powerful mechanism at the bottom of the brain, which every person
inherits as a birthright, is immature at birth but develops through childhood
and adolescence, reaching its full structural growth by the 15th to 20th year of
life. Perhaps the reason why it has traditionally been underestimated is its low-level location in the brain, which contrasts with the high-level location of the
structures that are thought to subserve higher mental functions. Such locations
in the brain become irrelevant, however, when a structure is regarded as a
computer because a computer's processing power depends not on where it is but
on what it contains and to what it is connected.
Judged by what it contains and by its external connections, the human
cerebellum is an enormously impressive mechanism. First of all, it contains
more nerve cells (neurons) than all the rest of the brain combined. Second, it is
a more rapidly acting mechanism than any other part of the brain, and therefore
it can process quickly whatever information it receives from other parts of the
brain. Third, it receives an enormous amount of information from the highest
level of the human brain (the cerebral cortex), which is connected to the human
cerebellum by approximately 40 million nerve fibers. To appreciate what a
torrent of information these 40 million fibers can send down from the cerebral
cortex to the cerebellum, a comparison can be made with the optic fibers in the
human brain. The optic tract contains approximately one million nerve fibers,
which transmit to the brain the visual information that a human receives via the
eyes. Forty times that much information can be sent from the cerebral cortex
down to the cerebellum, including information from sensory areas of the
cerebral cortex, from motor areas, from cognitive areas, from language areas,
and even from areas involved in emotional functions.
As this torrent of information continues to pour into the cerebellum from many
other parts of the brain, and as the cerebellum continues to process this
information within its neural mechanism, a flow of output information is
produced by it which it can send to various other regions of the brain, telling
them what to do and when to do it. How can the cerebellum convey such
messages? A clue is provided by its internal structure and its output
connections, which bear a remarkable resemblance to the design that is
employed in organizing modern computing machines.
Resemblance to Computing Machines
In computing machines the processing of information is accomplished by both
the hardware in the system (its circuitry) and by the software (the messages
transmitted between the various parts of its circuitry). Together the hardware
and software can produce a versatile information-processing system, capable of
performing a wide variety of functions, including motor, sensory, cognitive, and
linguistic ones. Such versatility of function is achieved by organizing the
computer hardware in the following way: The basic components are assembled
into modular packages that contain similar circuitry, and large numbers of such
similar modules are organized into parallel processing networks. This structural
organization is exemplified also in the cerebellum: It consists of longitudinal
modules containing similar neural circuits, which are arrayed in parallel zones
throughout the entire extent of the structure.
On the basis of this known cerebellar "hardware", it is possible to investigate the
"software" capabilities of such an organization of modules. In investigating the
part of the cerebellum that is greatly enlarged in the human brain, investigators
found that each module in this part of the cerebellum (the lateral part) is able to
communicate with the cerebral cortex by sending out signals over a segregated
bundle of nerve fibers. This is a particularly powerful way of communicating
complex information. It is exemplified also in computing machines, where the
"fibers" (i.e., the wires connecting the modules) also are organized into
segregated bundles. The benefits of such bundling of fibers are linguistic; such
organization enables the cerebellum to communicate with the cerebral cortex at
a high level of discourse, by using internal languages that are capable of
conveying complex information about what to do and when to do it.
Functions of the Cerebellum
Given that the cerebellum seems well organized to convey complex information
to many other regions of the brain, where does it actually send this information?
Each module of the cerebellum seems to be uniquely connected, both through
its input and output connections, with different regions of the brain. Modules in
the middle of the cerebellum (in the medial part) receive different input and
send information to different output targets than do the modules in the lateral
part of the cerebellum. Despite such differences in input and output, however,
the circuitry within each module seems to be similar to that in every other
module. For this reason, the basic processing that every module can perform on
the incoming information would seem to be similar, no matter whether this
incoming information represents motor, sensory, cognitive, linguistic, or any
other kind of information.
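
To make the computing analogy concrete, here is a deliberately simplified toy
sketch (my own illustration, not anything from the Leiners' work) of the
organization described above: modules that all run the same basic processing,
each wired to different input sources and output targets over its own segregated
channel:

    # Toy model: identical module circuitry, different input/output wiring.
    def module_circuit(signal: str) -> str:
        # Every cerebellar module applies the same basic processing to its input.
        return f"prepared({signal})"

    # A module is distinguished only by what it listens to and where it reports.
    modules = [
        {"input": "motor areas of the cortex",    "output": "motor areas"},
        {"input": "language areas of the cortex", "output": "prefrontal cortex"},
        {"input": "sensory areas of the cortex",  "output": "parietal cortex"},
    ]

    for m in modules:
        outgoing = module_circuit(f"signal from {m['input']}")  # same computation everywhere
        print(f"{m['input']} -> cerebellar module -> {m['output']}: {outgoing}")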
What does this basic processing do? More specifically, what computations are
performed by the similar cerebellar circuits in each module, and to what use are
these computations put when the results are sent to the different target
structures in the other regions of the brain? Many theories about such cerebellar
functions are under investigation, but definitive answers are not yet available.
They await further research.
Although many of these theories are considered controversial at present, it
seems possible that each of them may be at least partially correct and that the
present controversies can therefore be reconciled in the future. The present
proposals encompass not only the traditional view that the cerebellum is
involved in skilled motor performance but also the broader view that it is
involved in skilled mental performance, and is also involved in various sensory
functions including sensory acquisition, discrimination, tracking and prediction.
A recent theory that is broad enough to encompass all of these motor, mental,
and sensory functions has proposed that the cerebellum does the following basic
processing: It makes predictions (based on prior experience or learning) about
the internal conditions that are needed to perform a sequence of tasks in other
regions of the brain, and it sets up such internal conditions in those regions
automatically, thus preparing those regions for the optimal performance of the
tasks. By doing this, the powerful and versatile computing capabilities of the
cerebellum would be used for providing automatic help to various other regions
of the brain, helping them to do their work better.
The Advantages of Automation
Experimental evidence has shown that the cerebellum is involved in the process
by which novel motor tasks can, after some practice, be performed
automatically. Through such automation, the performance can be improved:
Sequences of movements can be made with greater speed, greater accuracy, and
less effort. The cerebellum also is known to be involved in the mental rehearsal
of motor tasks, which also can improve performance and make it more skilled.
Because the cerebellum is connected to regions of the brain that perform not
only motor but also mental and sensory tasks, it can automatize not only motor
but also mental and sensory skills in the human brain. As with motor skills,
several advantages accrue from learning to perform the other skills
automatically, without conscious attention to detail.
The skills involved in human communication, for example, require both motor
and mental activity: the motor activity of speech or gesture, and the mental
activity that formulates what is to be said. In the course of learning these skills,
an individual's performance can be improved incrementally through practice so
that the skills eventually can be performed without conscious attention to detail.
For example, in recalling words stored in the memory, the activity can be
performed without conscious attention to the details of how the words are
selected by the brain during the retrieval process.
To the extent that an individual can perform some mental activities without
conscious attention to detail, the conscious part of the brain is freed to attend to
other mental activities, thus enlarging its cognitive scope. Such enlargement of
human capabilities is attributable in no small part to the enlarged human
cerebellum and its contribution to the automation of mental activities, which
appears to have been a prerequisite for the emergence of human language.

Because such language confers a unique and inestimable advantage on humans,
the cerebellum can be regarded as an underestimated treasure submerged at the
bottom of the brain.
September, 1997
Further Reading on Current Concepts
Cerebellar Communications with the Prefrontal Cortex: Their Effect on Human
Cognitive Skills. Allan C. Leiner, Henrietta C. Leiner, Charles R. Noback.
Channing House, Palo Alto, CA 94301; Dept. of Anatomy and Cell Biology,
Columbia University College of Physicians and Surgeons, NY 10032.
"Prediction and Preparation, Fundamental Functions of the Cerebellum" by Eric
Courchesne and Greg Allen. Published in the journal Learning and Memory,
(Volume 4, No. 1, pages 1-35), Cold Spring Harbor Laboratory Press, Cold
Spring Harbor, NY, 1997.
"How Fibers Subserve Computing Capabilities: The Similarities between Brains
and Machines", by Henrietta C. Leiner and Alan L. Leiner. Published in the
book The Cerebellum and Cognition, edited by Jeremy D. Schmahmann,
(Volume 41 of International Review of Neurobiology), Academic Press, San
Diego, CA, 1997.
About the Authors
The Leiners, who started their professional careers as mathematicians, have spent the major
part of their careers on research and development of electronic computers. Henrietta C. Leiner
worked in the computer field at the National Bureau of Standards, Washington, DC until mid-career, when her interest in neural systems caused her to switch to the field of neuroanatomy,
which she studied at Columbia University in New York City. Alan L. Leiner designed electronic
computer systems at the National Bureau of Standards, Washington, DC from 1945 to 1960,
when he joined the IBM T. J. Watson Research Center in Yorktown Heights, NY. He worked for
two decades at IBM in New York, Philadelphia, and Palo Alto, CA, after which he retired. In
retirement, for more than a decade, the Leiners have been engaged in an interdisciplinary effort
to analyze cerebro-cerebellar networks in the human brain, with particular emphasis on
cerebellar contributions to cognition.
To contact the Leiners, write to them at Channing House, 850 Webster Street 635, Palo Alto, CA
94301, Telephone and Fax: 650-326-7424.

The Downshifting Dilemma:


A Commentary and Proposal
by Robert Sylwester

Metaphors are useful, since they connect complex concepts to understandable
objects and events. All metaphors (and also maps, models, and explanations)
contain a certain level of distortion. For example, the star constellations don't
represent actual stellar relationships, but rather provide a simple
location/direction model for non-astronomers. Metaphoric distortion is
acceptable as long as the metaphor adequately communicates the essence of the
concept, and users understand the nature of the distortion. Recent cognitive
neuroscience developments are altering our understanding of a variety of brain
systems and processes, and so it should come as no surprise that some of these
new understandings suggest that we'll probably have to redesign or even
abandon some of our long-established, much-loved metaphors.
I believe that downshifting is an example of a widely-used metaphor that has
outlived its usefulness, because it doesn't adequately communicate current
understandings of how our response systems function. What follows, therefore,
is a critique of the metaphor, and a suggested alternative.
The Downshifting Metaphor
Downshifting is a mixed metaphor that emerged out of Paul MacLean's Triune
Brain model and a car's gear mechanism. It implies a three-gear automobile
drive system. When I ask educators who use the metaphor to explain their
understanding of it, they tend to respond somewhat as follows: Low gear (the
reptilian complex) drives primitive/reflexive responses. Middle gear (the limbic
system) drives emotional responses. High gear (the cortex) drives
rational/reflective responses. Thus, someone who is currently functioning
rationally may be confronted by a difficult problem, and downshift to an
emotional or primitive response level (much as we downshift a car when we're
confronted by hills/mud/etc.). When questioned further, most view downshifting
as a negative action.
The key problem with this scenario is that our emotions are not a centralized
response system (as second gear implies), but are rather part of an extended
alerting or arousal system that establishes the emotional tone and bias of our
response to clear and ambiguous dangers and opportunities. Emotion is quite
transitory, mood might last for days, and temperament provides a lifelong
emotional bias.
Our several specific emotional subsystems alert us to dangers and opportunities
that shift our attention from its current focus to that of the emerging problem --
and these actions then activate our various response systems (or as I've
previously written: emotion drives attention, which drives learning, memory,
problem solving and just about everything else). Since our emotions thus don't
respond, but rather simply establish and help to maintain the focus and
intensity of our attentional and solution systems, it's neurologically incorrect to
suggest that we downshift from a rational to an emotional response (second
gear).
A second problem with the downshifting metaphor is that only one car gear can
function at a time, and our brain is a marvelous parallel processor. So to use the
current car metaphor, I could be driving with friends and simultaneously carry
out all of the following: (1) automatically operate the car's navigation
mechanisms (low gear), (2) monitor a beautiful orchestral piece on the car
radio, and note dangers and opportunities in traffic patterns (middle gear), and
(3) carry on a thoughtful conversation with my friends (high gear). That's some
gear box! Downshifting implies to me that our brain functions in only one
response mode at a time, and it doesn't.
I indicated above that many folks view downshifting in negative terms, and that
creates a third problem. Each of our several response systems evolved to carry
out an important function, so primitive responses aren't necessarily negative.
We don't have to use good manners when our life is on the line. Most of our
responses to challenges involve simultaneous behaviors at several levels, and so
we ad hoc our way through life with regrets and apologies for acting too quickly, or for delaying too long. Would that the cognitive line between rational and
irrational responses were so neatly drawn.
Consider the film, "Saving Private Ryan." It's apparent that the concept
of downshifting isn't up to the task of explaining the complex
emotional/attentional and primal-to-intellectual dynamic of the high-level
military/political decision to risk the lives of several soldiers in an attempt to
save the life of one. Was the decision good/bad, moral/immoral,
heroic/cowardly, rational/irrational?
Further, who were the good guys and who were the bad guys in the film
(considering that the German military tended to send Polish and Estonian youth
to defend the dangerous positions that are the focus of the film)? We see what is
called downshifting throughout the film, and yet it seems an inadequate
metaphor for the complexity of the behaviors depicted.
Finally, the limbic system (which is central to the Triune Brain Theory that
sparked the downshifting metaphor) has recently come under increasing critical
assault (LeDoux, 1996; Brothers, 1997; Pert, 1997; Pinker, 1997). We now know that our emotional system is neurologically widespread, although many of its important functions involve structures (such as the amygdala) historically associated with the limbic system. I've gotten a lot of mileage out of my index finger/bagel/construction paper model of the Triune Brain, and I now think that I finally better eat the bagel.
I'm not arguing that we can never consciously move from a rational to a
primitive response mode (as implied by downshifting). Continued frustration
with a persistent problem could certainly lead to a deliberately directed primitive outburst, but primitive responses aren't generally deliberate. They're stress-driven, and are typically precipitated by serious new information. This means that the previous situation itself has now changed to a new situation -- and this then-now separation finally suggests how we might begin to think of a new explanation for what we formerly called downshifting.
A Proposal
Cognitive problems can arise out of external sensory information or internal
mental processes. Most incoming sensory information is initially processed
through the thalamus into two separate response systems:
1. We have a relatively slow, analytic, reflective (primarily cortical) system to
explore the more objective factual elements of a situation, compare them with
related memories, and then rationally respond. It's best suited to non-threatening situations that don't require an instant response: life's little challenges.
2. We have a fast, conceptual, reflexive (primarily subcortical) system that
identifies the dangerous and opportunistic elements in a situation, and then
quickly activates powerful innate or learned automatic response programs if
survival seems problematic. This fast stress-driven system developed to respond
to imminent predatory danger and to fleeting feeding and mating opportunities.
Our emotional/attentional systems thus are primed to quickly focus on (for
example) any loud, looming, contrasting, moving, obnoxious, or attractive
elements that signal potential danger, food, or mates, and to rapidly signal the
information to our solution systems.
The fast system thus enhances survival, and so it's the default or go-to system,
and not the one we downshift to. If anything, our response would typically begin
with this immediate reflexive response system and then upshift to a more
reflective response if it's apparent that the situation doesn't require an
immediate response (just as in a car, which almost always begins in low gear,
and then shifts up).
Unfortunately, the rapid superficial analysis of the fast system often leads us to
respond fearfully, impulsively, and inappropriately to situations that don't
require an immediate response. Stereotyping, prejudice, regrets, and apologies
are but four of the prices we humans continually pay for this powerful survival
system. Worse, the neurotransmitter and hormonal discharges associated with
fear can strengthen the emotional and weaken the factual memories of an event
if the stressful situation is serious and/or chronic. We become fearful of
something, but we're not sure why, so we've learned little from the experience
that's consciously useful (because a reflexive response functions unconsciously).
Further, chronic activation of our fear pathways can result in physical
deterioration within our memory systems. I suppose that it is these elements
that have led to the negative reputation that primitive responses seem to have in
the downshifting metaphor. But a primitive stress-driven reflexive response is
truly advisable in a situation that requires an immediate forceful response such as sudden acceleration or rapid braking in response to traffic conditions. Our stress system evolved to be used as a temporary rather than continuous response system. It's like salt: a little bit is biologically useful; a whole lot is generally harmful.
I don't believe that we must have a metaphor to describe our dual response
system. Why not just use the terms reflexive and reflective?
The explanation might go something like this: When our emotional/attentional
systems report a serious problem, our first line of either defense or attack tends
to be reflexive. Powerful reflexive response repertoires are unconsciously
activated. Our slower reflective problem-solving system is simultaneously
alerted, and it can soften or even override our reflexive system's response if it
can quickly come up with a better solution, or negotiate a delayed response
(such as counting to ten when anger flares). Conversely, we tend to activate our
slower reflective system to solve challenging problems that don't carry the sense
of immediacy that activates the reflexive responses that impede reflective
thought.
We thus have two excellent solution systems that can independently and
cooperatively respond to most of the challenges we face. I'm currently
typing reflexively (unconsciously, automatically) but I certainly hope
that what I type comes out of reflective (conscious, deliberate) thought
processes. Both processes are essential to the production of this commentary.
They are neither positive nor negative. They exist to carry out different (but often
simultaneous) functions. When I was learning to type, I was very reflective and
deliberate about striking the keys. I consciously knew where all the letters were
on the keyboard, but I was slow and inefficient, because I couldn't
simultaneously (1) reflect on what I would write, and (2) type it. Today I don't
consciously know where any of the letters are, and I'm a fast efficient writer and
typist, because I can use my conscious reflective system to focus entirely on
thinking about what I'll write, and my unconscious reflexive system to
simultaneously carry out the appropriate typing actions.
The two systems also function equally efficiently in response to social
challenges. For example, a teacher could be reflectively working with a class and
suddenly be confronted by the inappropriate behavior of a student. She could
immediately reflexively respond to that student while continuing to function
reflectively with the rest of the class. There's no shifting, just a simultaneous
response to two different stimuli, something our parallel-processing brain does
with ease.
It's possible that a person biased towards reflexive responses may reflexively
respond to many situations that are better solved through reflection, and a
principally reflective person may likewise reflectively delay responses to
imminent dangers and opportunities. It's not a foolproof system.
Children must learn how to intelligently solve problems beset with obstacles,
and so they must come to understand, respect, and effectively use both of the
systems. The school environment and curriculum should enhance this learning

process by reducing the specter of threat when it doesn't enhance the reflective
learning process, which should be conscious and deliberate.
Fear is an important element of some reflexive learning, however, and so we
appropriately insert it into practicing automatic safety responses, such as fire
drills. However, fear doesn't enhance and isn't necessary in the mastery of most
school-related automatic skills (such as reading, computation, typing). Place
students in a positive challenging classroom, and even the youngest can
understand the value of practice activities that lead to the automatic mastery of
important things. They did practice endlessly and joyfully when they learned to
walk and talk.
I would therefore suggest that we simply use the terms reflexive and reflective to describe our response patterns. Further, we
shouldn't think in terms of shifting back and forth between the two (as with a
gearshift), but rather realize that we tend to use the reflexive or reflective system
that's initially best suited to the current situation, and that we probably also use
both systems in a variety of currently ill-understood combinations to respond to
many of the problems that we face.
Reflexive and reflective are easily understood terms that are commonly used by
cognitive neuroscientists, and they don't contain the problems
that downshifting has. They work for me, and so perhaps also for you. The alternative is for someone to come up with a new, good, scientifically appropriate metaphor.

References
Brothers, Leslie (1997) Friday's Footprint: How Society Shapes the Human Mind. New York: Oxford University Press.
LeDoux, Joseph (1996) The Emotional Brain: The Mysterious Underpinnings of Emotional
Life. New York: Simon and Schuster.
Pert, Candace (1997) The Molecules of Emotion: Why You Feel the Way You Feel. New York:
Scribner's.
Pinker, Steven (1997) How the Mind Works. New York: Norton.
Sylwester, Robert (1998) Student Brains, School Issues: A Collection of Articles. Arlington Heights, IL: Skylight Training and Publishing.

About the Author


Robert Sylwester is Emeritus Professor of Education, University of Oregon. You can reach Dr.
Sylwester at:
University of Oregon
College of Education
Eugene, OR 97403-5267
Phone 541-345-1452
FAX 541-346-5174
Email bobsyl@oregon.uoregon.edu.

Unconscious Emotions, Conscious Feelings,


and Curricular Challenges
by Robert Sylwester

Although it may irritate the teacher, one of the most intelligent questions a
student can ask is "Why do we have to do this?" Students (and the rest of us, for
that matter) are loath to expend cognitive energy unnecessarily, so assessing the
importance of a task is a key initial step in cognition.
We live in a complex space/time world replete with dangers and opportunities
related to our survival and reproduction. Two internal systems recognize and
respond to such challenges:

1. our skull-centered brain, composed of hundreds of billions of highly interconnected neurons and glial support cells, integrates and responds to the information from our sensory system, and
2. our diffused immune system, composed of a vast number of (often
free-floating) cells spread throughout our body, responds to the several
pounds of invisible microbes and pollutants that inhabit our body.
So our very interconnected brain responds to the larger visible external
challenges, and our very diffused immune system responds to the tiny invisible
internal challenges. Scientists now realize that the two systems are highly
interconnected. A successful response to many of life's challenges requires the
two systems to collaborate, and illness can occur if they don't.
This article will focus on our brain's activation systems, on its unconscious and
conscious ability to recognize important dangers and opportunities. Such
recognition is admittedly a small part of a very complex cognitive system, but
it's critical to successful teaching and learning -- as teachers soon discover if
they don't implicitly or explicitly answer the question the student above raised.
Further, recent developments, such as the publication of Antonio Damasio's
acclaimed book, The Feeling of What Happens: Body and Emotion in the
Making of Consciousness (1999), are providing educators with an expanded
view of our recognition system, and this poses important educational challenges.
UNCONSCIOUS EMOTIONS
Since our immediate environment is rich in dangers and opportunities that
range widely in importance, our brain needs something akin to a thermostat to
determine when a specific challenge reaches the threshold of being sufficiently

important to activate the several systems that focus attention and develop
appropriate responses. Emotion, centered principally in a small set of subcortical brain systems, is our biological thermostat, and so it's central to
cognition (and educational practice). Although emotion is embedded in our
language as a somewhat vague concept, recent scientific developments are
clarifying the terminology, and changing some previously held beliefs about its
biology and function.

1. Emotion is an innate, powerful, and principally unconscious process. It has to alert us when we're asleep or attending to other things, but not
bother us with problems and processes that don't require conscious
attention. For example, emotion alerts us to an opportunity for food, but
it doesn't continually report on the digestive process that follows eating
unless the food turns out to be indigestible. Further, we don't
consciously choose to be emotionally aroused, and such arousal often
interferes with what we're currently doing. In effect, our emotions tell us
to stop doing what we're currently doing, and to attend to this more
important challenge.
That dominance is possible because far more neural fibers project from our
brain's emotional centers up into the logical/rational areas than the reverse. A
sudden emotional stimulus can thus easily and immediately stop classroom
activity -- and it's then neurologically difficult to get students to rationally shut
off their emotional arousal and resume what they were doing. Effective
teachers, realizing that the disruptive emotional arousal will continue until the
problem is resolved, simply take the time to resolve it before resuming what
they were previously doing.

2. A click of the thermostat announces a furnace's imminent response to a sudden drop in room temperature. Our mostly unconscious body
language similarly reports our current emotional state. It's obviously
useful for a social species to have a means of heralding the thrust of an
imminent emotion-driven behavior -- and so it makes a lot of sense for
teachers to become adept at reading and adapting to their students' body
language.

3. Emotion tends to respond most vigorously to high-contrast information, and to merely monitor or ignore steady states and subtle changes. This is
generally biologically sensible. Stay the course rather than expend
cognitive energy on things that aren't currently problematic and
fluctuating. Emotion can thus trick us into not recognizing the
subtle body language of a gradually encroaching problem until it
suddenly becomes menacing.

The recent spate of school killings came as a surprise to many educators and
classmates who had worked daily with the perpetrators, and hadn't suspected a
thing. We're similarly surprised when an unnoticed former student turns out to
be very successful. Recall our immune system -- tuned to tiny dangers and
opportunities that are invisible to our brain (which focuses on the visible
things). Do schools need a similar dual monitoring system that can recognize

both the manifest and masked dangers and opportunities that crowd the
corridors? Schools had such a system -- counselors and others whose principal
assignment was to move about the school (like immune cells prowling our body)
on the lookout for potential problems. In a reckless search for economy at any
cost, many schools dismissed them, or burdened them with assignments that
precluded their ability to do what they were trained to do. A tiny unnoticed
virus can soon immobilize an entire body. That's why we have both a brain and
an immune system. Institutions that ignore the little (but emerging) problems do so at their peril.

4. Emotional arousal doesn't define or solve the challenge, but rather it activates attentional and problem-solving processes that develop the response (our immune system similarly separates the tasks of
recognizing and responding to our body's microscopic invaders). We
thus don't emotionally respond to a challenge, but rather, our emotions
alert us to its existence -- a subtle but important distinction. So although
our emotions play a necessary initial role in the shaping and eventual
solution of a problem (and may continue to arouse us throughout our
response, because it's generally important to maintain interest in the
challenge), they're not a problem-solving system.
Consider an emotionally arousing classroom game (such as a spelling relay) that
has no relationship to the skill being taught. The game artificially hypes student
emotion in an activity that probably wouldn't of itself arouse enough interest to
enhance learning. So even artificial emotion can activate attention, which
activates the cognitive systems that memorize the spelling of words. Ideally,
though, a teacher's emotional trigger to an activity should be related to the
nature of the activity, since this will provide a more easily remembered
emotional context for the future use of the learned material.

5. Although emotions don't solve our problems, they can bias the direction
of the response. Temperament is a seemingly innate element of our
emotional system that unconsciously predisposes emotional arousal
towards danger or opportunity. A person's temperament typically
centers somewhere along a continuum between bold/uninhibited and
anxious/inhibited (Kagan, 1994), with boldness being processed
principally in the left hemisphere, and anxiousness in the right
hemisphere (Siegel, 1999). When emotionally aroused, the bold thus
tend to be initially curious about a potential opportunity, and the anxious
wary about a potential danger. Temperament is thus a useful trait, since
it enhances a quick and confident move towards a response. Because we
frequently follow our temperamental bias, we tend to become quite
competent with it over time. Think of handedness, which similarly
develops exceptional competence with the favored hand.
Either temperamental bias can be useful, so students should be encouraged to
develop whatever nature has given them. But just as it's advisable to divide an
investment portfolio between conservative (danger) and risky (opportunity)
investments, so students should be given opportunities to experience situations
that would move them away from their temperamental predisposition, and

allow them to practice their back-up system. Projects that effectively team
bold/optimistic students with anxious/pessimistic students can create a
cooperative forum in which the best elements of both approaches can be
synthesized into an effective solution. Indeed, people with different
temperaments often marry and create a successful family team if mutual respect
defines the relationship.

6. Emotions (like temperament) are neither positive nor negative in themselves. They all evolved to alert us to specific kinds of problems, so all are developmentally important. Just as theorists have proposed
several different ways of categorizing intelligence, so scientists differ
somewhat in their classification of emotions. Damasio (1999) lists
surprise, happiness, fear, anger, disgust, and sadness, as our primary
emotions; and embarrassment, jealousy, and guilt among our secondary
(or social) emotions. Note that most alert us to a negative situation.
We've tended to consider classroom emotional responses in negative
misbehavior terms -- but since all brain systems (including emotion)
must be developed, educators should now design appropriate
developmental environments for emotions tuned to negative situations.
We'll return to this issue later.

CONSCIOUS FEELINGS
Damasio (1999) suggests that feelings emerge in our brain when we become
conscious of our unconscious emotional arousal to a potential
danger/opportunity. As indicated above, emotions can often be publicly
observed, but our feelings remain a private mental experience of the emotion.
Feelings, which lead us to conscious thought and exploration of the current
challenge, are thus useful, since they allow us to go beyond innate programmed
behaviors, to rationally design solutions to a variety of contemporary challenges
that evolutionary development didn't cover.
Feelings allow us to step into the arcane world of consciousness, the mysterious
mental process that abandons us when we go to sleep, and magically reappears
when we awaken. Consciousness identifies the first-person-singular self that
philosophers, psychologists, and theologians have long tried to define. Not only
do I know something, but I know that I know it. So who is the "I" who is doing
all this knowing (and feeling)? Damasio draws on decades of neuroscience
research and the recent advances in brain imaging technology to suggest how
conscious processes could have emerged in our brain out of the unconscious
systems that regulate emotion. Since school activities focus principally on
conscious learning and behavior, an understanding of the biology of
consciousness will be essential to the development of credible theories of
teaching and learning.
Protoself. In Damasio's theory, the biology of consciousness begins with a
neuronal arrangement that maps every part of an organism's body into one of
various interconnected brain areas. This mapping is necessary in all animals
because brain and body must constantly communicate in order to maintain a
continuously revised sense of what's happening throughout the organism.

A collection of automated brain systems that Damasio calls the protoself uses this
continuous flow of information to manage various life processes, such as
circulation and respiration. The protoself maintains the stability it needs across
its lifetime by operating body systems within genetically established relatively
narrow regulatory ranges.
Core Consciousness: The Present. But we're conscious of more than our
own self. Our protoself is imprisoned within the geography of its body, but
sensory/motor and related brain systems also allow a conscious organism to
explore the world. A stable body thus confronts a constantly shifting and
expanding external environment.
So not only does a brain contain a map of its body, but a conscious brain must
also have a mechanism for mapping and connecting to the external world.
Damasio believes that consciousness emerges when the mapped relationship
between organism and an external object (which may be another organism) has
risen to the level of a feeling of what's currently happening.
Core consciousness (which we share with many animals) is thus the
consciousness of the here-and-now -- a nonverbal imaged running account of
the objects an organism confronts in a series of successive instants as it moves
through and interacts with its immediate environment. Think of being both
actor and spectator in a movie within our brain (a film being a sequence of still
pictures that give the illusion of movement as they quickly pulsate through our
brain).
Many catch-phrases in our culture speak to the importance of recognizing and respecting the here-and-now in the quickly moving stream of consciousness that defines much of life. Stop the world, I want to get off. Slow down and smell the daisies. Seize the moment. Core consciousness is primal in that it continuously focuses the organism on the immediate present.
Extended Consciousness: The Past and Future. We may live in the
present, but we have lived in the past and we will probably live in the
future. Damasio suggests that organisms must have a large cortex in order to
consciously move beyond the here-and-now -- to profit from past experiences
and to avoid potential problems. The cortex must be sufficiently large to
contain a vast and powerful autobiographical memory that can quickly identify
the largest possible range of information relevant to a novel challenge.
Humans, and the great apes to a lesser extent, have such a cortex.
Intelligence emerges out of this ability to embellish and temporally extend core
consciousness. It allows our brain to manipulate recalled information in the
mental design and analysis of potential responses. The practical applications of
conscious intelligence include imagination, creativity and conscience -- which
led to language, art, science, technology, and a variety of cultural and political
systems (such as the shared government of a democratic society).
In laying out many of the specifics of the neurobiology of consciousness,
Damasio has established a credible framework that other theorists and
researchers will further explore. Indeed, as I write this, Gerald Edelman, who

won the Nobel Prize for his immune system discoveries, and neuroscientist
Giulio Tononi have just published their Darwinian exploration of consciousness
(2000). Expect other books by renowned scientists to follow, consciousness
being the Holy Grail of the neurosciences. This short functional synthesis of Damasio's book couldn't hope to communicate the richness of his theory, and the
depth of his supporting scientific evidence. Read the book and be amazed.
We thus begin a new century with optimism -- knowing that a major
educationally significant biological mystery is moving decidedly towards
solution. Einstein's theories established the 1900's as the century of physics.
Biology should dominate at least the early part of this century -- and educators
are basically biologists. We can revolutionize educational policy and practice,
but only if we inform ourselves of the explosive educationally important
cognitive neuroscience developments that are occurring now, on our watch.
CURRICULAR CHALLENGES
This discussion has focused principally on our emerging understanding of our
critically important recognition and arousal systems -- from unconscious
emotion to conscious feelings. What do such biological developments have to
say to those who seek practical educational applications from such theory and
research? Three related issues seem promising -- and especially for imaginative
young educators in search of a career research agenda that's focused on the
search for appropriate educational applications in this area:
The Arts. I have argued that the arts play an important role in the
development and maintenance of our motor system, which processes the
concluding stage of most cognitive sequences (Sylwester, 1998). What role
might the arts play in the initiatory stages of cognition (recognition,
arousal)? Unconscious emotions and conscious feelings alert us
to biologically important dangers and opportunities. We're a social species, so
it would be advantageous for a society to have similar alerting systems
for culturally important dangers and opportunities that many people might
otherwise not recognize. An important segment of the arts, mass media, and
cultural organizations play such a social arousal role. This suggests that the not-necessarily-nice-and-pretty arts are an integral part of the social equivalent of
our very important emotion/attention system. If so, it's folly to reduce school
programs that help students to understand the often critical role of social
arousal systems, such as the arts. It also suggests that if the arts are important
to the development and maintenance of the systems that initiate and conclude
cognitive activity, they're probably also important to the robustness of the
systems that process the several intervening stages. The arts, like
consciousness, have been an enigmatic element of human life -- and like
consciousness, they now appear to be researchable in ways not possible during
the past century.
Play. Extended consciousness requires a large cortex, and this creates a birth
canal problem (that all mothers understand). Humans are consequently born
with a very immature brain (1/3 its adult size), which develops over a long
sheltered childhood. Most animals are born with a substantially developed
brain, and so they're on their own shortly after birth. Their survival is thus

dependent on a large number of innate (rather than learned) brain systems that
respond automatically to the dangers/opportunities their species typically
confronts.
Sheltered from the need to protect and support themselves, juvenile humans are
free to use play to consciously explore the dynamics of and alternate solutions
to pretend problems that they devise -- typically childhood versions of various
adult problems they will later confront. But how does a brain unconsciously
generate the requisite emotional arousal without the presence of real danger or
opportunity? Good games do that at a pretend level that extended
consciousness permits, and so much informal childhood motor, language, and
social learning develops easily and without much adult instruction through
play/games/contests that can spark emotional arousal (which then activates our
attention, problem solving, and behavioral response systems).
Further, we continue to use play and games throughout life to maintain the
robustness of our emotional arousal system. Cognitive systems that aren't
continuously used weaken through a use-it-or-lose-it principle, and the reality is
that our primary emotions (surprise, happiness, fear, anger, disgust, and
sadness) typically aren't frequently activated in a consciously controlled real
life. But a game, such as basketball, will frequently and unpredictably (and perhaps artificially) activate all the emotions of players and spectators over its
course, and that accounts for much of the appeal of the game.
Play and games are thus joyfully important emotion/attention machines that
can enhance the quality of a sheltered child's extended learning. It's only in
school that we refer to learning as work. In an era obsessed with assessment
and standards, educators must rediscover the power that play has to activate
and enhance learning. Teachers have always used learning games, but emerging
electronic technologies are creating amazing new instructional possibilities that
are replete with both danger and opportunity. For example, the Fast ForWord
program successfully uses variable speed video game technology to eliminate
specific auditory attentional deficits that negatively affect language development
in young children. Play poses truly exciting challenges for imaginative
educational theorists and researchers.
Classroom Management. The ability to recognize inappropriate social
behavior is a key developmental skill. Misbehavior is to a classroom what pain
is to a body -- a useful status report that something isn't working as it should.
Damasio suggests that pain isn't an emotion, but rather a local tissue
dysfunction that may or may not activate an emotion. Teachers and students
similarly respond to or ignore a variety of behaviors during a school day.
We've tended to view classroom management as an institutionally directed
means to a compliant efficient teaching/learning environment. The assumption
is that the students are the problem, and yet there's often as much institutional
as student misbehavior in a classroom. So why do the educators get to make all
the misbehavior and management decisions? Is it possible to think of classroom
management as a key collaborative element of the social curriculum? It would
certainly be a revolutionary challenge, but I believe that our emerging understanding of cognition and consciousness suggests that imaginative teachers can and should attempt it (Sylwester, 2000).
Most curricular information comes from beyond the classroom. Conversely,
classroom management must recognize and respond to internal social dynamics
and tensions, and so it provides students with marvelous opportunities to
consciously confront and solve real social problems in an institutional
environment that differs from the informality of their home and neighborhood.
We're a conscious, social species, and we're 200+ years into the creation of a
democratic society that depends on the mastery of social skills and collaborative
behavior. And yet, by defining classroom management as something teachers
do to students, we've ignored the best available laboratory for helping students
to consciously and collaboratively develop social and democratic skills. Many
adult social problems have classroom parallels. When the perspective shifts
from behavior management to curricular laboratory, misbehavior shifts from
being only a negative danger to also being a positive opportunity for teacher and
students to recognize and collaboratively solve real current classroom
problems. How much current misbehavior emerges from student anger at
having no voice in what occurs in a classroom?
One obvious problem is that the social behavior of immature students isn't
exemplary. But neither is anything else when they begin the learning process.
We realize that crawling leads to toddling leads to walking leads to running, and
so allow the process to develop naturally (and not by posting rules and giving
lectures on how to do it). Conversely, schools unrealistically expect exemplary
institutional behavior from day one. Outside of school, children learn social
skills through the emotionally stimulating pretend problems that play engenders,
and they gradually and informally resolve the problems that emerge. The
classroom-as-laboratory would enhance that informal process by explicitly and
experientially developing the requisite social skills.
A social skills curriculum grounded in collaborative classroom management
would develop behavior recognition, data-gathering, analysis, and negotiation
skills. Conceptually, the social management of a classroom involves the same
elements as those involved in the biological management of one's body:
decisions on energy expenditure, space, time, movement, the biologically
possible and culturally appropriate range of behavior. It's nothing really complicated, and nothing that students cannot explore and master. But it's
a scary paradigm shift in an era obsessed with school efficiency and assessment
-- and possessed of an irrational belief that if we tightly control the behavior of
K-12 students for 12,000 hours, the result will be the sudden mastery at 18 of
the social skills a democratic society requires of its adult citizens.
John Dewey began the 20th century with a philosophic plea that schools get
serious about becoming laboratories for the development of the democratic
skills our society requires (1916). He was ignored. Another century has now
passed, and we know much more about development, cognition, and
consciousness than Dewey could have imagined. What we now know suggests
strongly that Dewey was also biologically correct. So will we wait another 100
years?

References
Damasio, A. (1999) The Feeling of What Happens: Body and Emotion in the Making of
Consciousness. New York: Harcourt Brace.
Dewey, J. (1916) Democracy and Education. New York: Macmillan.
Edelman, G. and G. Tononi (2000) A Universe of Consciousness: How Matter Becomes
Imagination. New York: Basic Books.
Kagan, J. (1994) Galen's Prophesy: Temperament in Human Nature. New York: Basic Books.
Siegel, D. (1999) The Developing Mind: Towards a Neurobiology of Interpersonal Experience.
New York: Guilford.
Sylwester, R. (2000) A Biological Brain in a Cultural Classroom: Applying Biological
Research to Classroom Management. Thousand Oaks, CA: Corwin.
Sylwester, R. (1998, November) Art for Brain's Sake. Educational Leadership, 56 (3), 31-35.

About the Author:


Robert Sylwester is an Emeritus Professor of Education at the University of Oregon in Eugene,
Oregon. You can reach him at bobsyl@oregon.uoregon.edu.

Language Learning Impairment:


Integrating Research and Remediation
by Paula Tallal, Ph.D.
Two independent research studies report that language learning impaired (LLI)
children improved by approximately two years after only four weeks of intensive
exposure to speech and language listening exercises presented with an
acoustically modified speech signal, together with a new form of adaptive
computer training (Merzenich et al., 1996; Tallal et al., 1996).
These papers report the development of novel remediation techniques that grew
out of a collaboration between Drs. Paula Tallal and Michael Merzenich.
Research by Tallal and colleagues has demonstrated that LLI children require
hundreds of milliseconds (msec) between acoustic events to discriminate
between them, while children of the same age and intelligence level only need
tens of msec. We have further shown that:

This basic sensory integration deficit interferes with the ability of LLI
children to discriminate the critical brief acoustic cues within syllables
and words that distinguish one phoneme from another, and
That temporal integration thresholds are highly correlated with the
degree of receptive language impairment in younger LLI children, and
later with these children's difficulty learning phonological decoding skills
for reading.

The work of Dr. Merzenich is based on neurophysiological studies with monkeys. Dr. Merzenich and colleagues' landmark studies show that even in
adult animals, the neurons in the brain that map sensory events are highly
"plastic." That is, the brain can reshape its neural circuitry in response to
environmental experiences. Specifically, they showed that through intensive
training, monkeys could gradually improve their identification of faster and
faster sounds. When they analyzed the brains of these monkeys, they found that
specific auditory regions had reorganized and significantly expanded their
neural circuits.
Merzenich and Tallal hypothesized that the training exercises used for neural
plasticity studies with monkeys might be adapted to alleviate some of the
sensory processing constraints that have been reported in LLI children. In our
earlier research, we had also observed that extending the duration of the brief,
rapidly changing transitional elements within the acoustic wave form of speech
syllables resulted in significantly improved speech discrimination of those syllables for LLI children. We hypothesized that if we could create a computer algorithm to acoustically disambiguate the rapidly changing acoustic cues
within ongoing fluent speech, this might help LLI children process, and thus
consistently represent, phonological cues in syllable, word and sentence context.
Once represented correctly at a high level of accuracy with modified speech, the
goal would be to drive processing through successive adaptive training, closer
and closer to normal levels. To test these hypotheses, we developed two basic
methodologies:

Computer "games" designed to adaptively speed up the rates of


processing of acoustic cues within both non-speech and speech stimuli,
and
Listening exercises designed to explicitly train on-line phonological
discrimination and language comprehension using acoustically modified
speech.

Computer games were developed using both nonverbal and verbal stimuli. The
games were adaptive. By adaptive, we mean that the stimulus sets and series of trials were controlled by each subject's trial-by-trial performance. The adaptive
computer games were developed with the aim of first establishing the precise
acoustic parameters within stimulus sets required for each subject to maintain
80% correct performance on that stimulus set. Once that threshold point was
determined for each subject, the subject's own performance determined the
acoustic parameters of each subsequent trial. The goal of the training was to
first determine the thresholds for specific acoustic variables and then, for
subjects with elevated thresholds, attempt to drive them to process closer and
closer to a more normal processing rate. The "games" were designed to be fun
for the subject and to maintain ongoing attention.
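To make the adaptive logic concrete, here is a minimal Python sketch of the kind of trial-by-trial staircase described above. The present_trial routine, the use of inter-stimulus interval as the single adapted parameter, and the 3-down/1-up rule (which converges near the roughly 80% correct level mentioned above) are illustrative assumptions, not the actual Fast ForWord algorithm.

import random

def present_trial(isi_ms):
    # Hypothetical stand-in for one training trial: present two brief tones
    # separated by isi_ms milliseconds and score the subject's response.
    # Here we simply simulate a subject whose performance falls off below
    # an assumed 120 msec threshold.
    p_correct = 0.95 if isi_ms >= 120 else 0.55
    return random.random() < p_correct

def three_down_one_up(start_isi=400.0, floor=10.0, step=0.9, n_trials=300):
    # Illustrative 3-down/1-up staircase: the interval is shortened only after
    # three consecutive correct responses and lengthened after any error, so
    # the track converges near ~79% correct -- close to the 80% level the
    # training described above was designed to maintain.
    isi, streak, history = start_isi, 0, []
    for _ in range(n_trials):
        if present_trial(isi):
            streak += 1
            if streak == 3:            # three in a row: make the task harder
                isi = max(floor, isi * step)
                streak = 0
        else:                          # an error: make the task easier again
            isi = isi / step
            streak = 0
        history.append(isi)
    tail = history[-(n_trials // 4):]  # estimate the threshold from late trials
    return sum(tail) / len(tail)

if __name__ == "__main__":
    print("Estimated ISI threshold: %.0f msec" % three_down_one_up())

In this sketch the subject's own responses drive the next trial's difficulty, which is the essential feature of the adaptive training described above.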
The papers in Science report the results of the first two studies we completed
using these new techniques. In each study, subjects were trained with the
acoustically modified speech and computer "games" three hours per day, five
days a week for six weeks. The results of Study One were dramatic. They
demonstrated that improvement in processing rate, coupled with training with
acoustically modified speech, resulted in significant improvement in speech and
language. From a clinical perspective, these exciting results clearly hold promise
for a new and highly effective intervention for children with central auditory processing, phonological analysis, and/or language comprehension problems.
Study Two replicated the results of Study One, and in addition included a highly
controlled additional treatment group, who received the same treatment, but
with normal rather than acoustically modified speech and without adaptive
training aimed at speeding up the rate of acoustic processing. As predicted,
improvement made by the modified speech training group was significantly
greater than that made by the control group that received essentially the same
training, but with natural, unmodified speech.
How do we think this new therapeutic program works? First, the adaptive
computer "game" training was shown to be highly effective in significantly
speeding up the rate of acoustic processing. Indeed, as a group, thresholds went
from 400 msec pre-training to 100 msec post-training. For some LLI children,

thresholds post-training approached the normal tens-of-milliseconds range. We anticipate that with longer training times and more trials, these new training
exercises may continue to drive these children's processing rates into the
normal range. The data from our first two studies demonstrate that faster
processing rates and improved speech discrimination abilities translate into
significantly improved ability to process the individual sounds within words - a
fundamental goal of both speech and language therapy for language impaired
children, as well as phonological awareness training for reading impaired
(dyslexic) children. These studies show that LLI children clearly are able to
better distinguish one speech sound from another. After training, these children
appear to have been able to set up distinct (not "fuzzy") phonological
representations for each phoneme in their language. Further, the results of our
follow-up studies demonstrate that the improvements in phonological
processing and language comprehension are enduring. That means that the
child is now able to approach speech, language and reading with a neural
processing system more attuned to adequate on-line phonological processing,
that is, more like that of children who are learning normally.
From a clinical perspective, data to date, from over 4,000 children completing
this new training program, show that dramatic, replicable, ongoing
improvements in central auditory processing, speech and language skills can be
achieved in a relatively short but intensive program using these new
computerized training techniques.

References:

Paula Tallal, Steve L. Miller, Gail Bedi, Gary Byma, Xiaoqin Wang, Srikantan S. Nagarajan,
Christoph Schreiner, William M. Jenkins, and Michael M. Merzenich. "Language
Comprehension in Language-Learning Impaired Children Improved with Acoustically Modified
Speech." Science, 5 January, 1996, Vol.271, pp. 81-84.
LLI children received extensive daily training over a four-week period, with listening exercises in which all speech was processed through a synthetic speech processing
algorithm. They also received extensive daily training with computer "games" designed
to adaptively drive improvements in temporal processing thresholds. Significant
improvements in speech discrimination and language comprehension abilities were
demonstrated in two independent groups of LLI children.
P. Tallal, G. Saunders, S. Miller, W.M. Jenkins, A. Protopapas and M.M. Merzenich. "Rapid
training-driven improvement in language ability in autistic and other PDD children." Society for
Neuroscience, Vol. 23, p. 490
A report of major success in applying adaptive training procedures disguised as
computer games to 5- to 12-year-old specifically language impaired children. Seven
hierarchical exercises were designed to improve aural phonetic reception in these
children, and to generalize their improved aural reception skills to all aspects of
language. With training, speech reception was markedly clarified and language
comprehension thereby improved.
In an extension, training was applied to a population of 28 pervasively developmentally disabled (PDD) children (10 autistic; 18 NOS). Children worked at the same seven computer-guided adaptive training exercises for 100 minutes per day for 20-60 days. Most PDD children made major gains in acoustic and phonological reception and in language comprehension, as measured by highly significant progressions in training exercise performance. Mean Z-score
improvements in standard pre- vs post-training of these abilities (e.g., Token Test, GFW) were
about 1.75. In parallel, CELF and TOLD language battery quotients improved by >1 SD in about
80% of trained PDD children. Interestingly, large improvements in both receptive and
expressive battery quotients were recorded; Z-score changes averaged 1.3 for receptive LQ's and
1.1 for expressive LQ's, respectively.
These studies show that major gains in language abilities can be very rapidly achieved in at least
most of these severely impaired children by computer-guided training targeting fundamental
acoustic and speech reception abilities. Research supported by Scientific Learning Corporation.
S. Miller, M.M. Merzenich, G. Saunders, W.M. Jenkins and P. Tallal. "Improvements in
language abilities with training of children with both attentional and language
impairments." Society for Neuroscience, Vol. 23, p.490.
A large proportion of language impaired (LI) children have attentional deficit disorders
(ADD) and hyperactivity (ADHD). This study examined whether there are differences in the abilities of ADD-LIs, ADHD-LIs, and LIs to achieve improved aural speech reception with training that has been successfully applied to LIs (Science 271:78-84, 1996).
The study was conducted with 106 children who were identified as ADD and LI,
compared with nearly 400 LI children who had no ADD. Major and equal gains in
speech and language reception and usage were recorded in ADD-LI and control LLI
children, documented by progressions in performance recorded in 7 adaptive,
computer-based training exercises ("Fast ForWord" program), and by standard pre- vs post-training tests of speech and language reception, comprehension, and usage. Z-scores improved by a mean of 1.6 with training for ADD-LI children on the GFW Test
of Auditory Discrimination; and by a mean of 1.6 on the Token Test. TOLD and CELF
language battery quotients improved in parallel: positive Z-score changes for receptive
quotients were >1.2 and 0.9 for the TOLD and CELF; expressive language quotients had average Z-score gains of >0.9 and 1.0. Improvements on all CELF and TOLD standard scores and quotients were significant for both ADD-LIs and LIs at p < 0.001.
Training-induced gains in speech and language for LI vs ADD-LI children did not differ.
Compliance at 100-minute-long daily training exercises over a 20-60 day long training
period was equivalent. Gains of ADHDs did not differ from those of ADDs without hyperactivity, on all measures.
Paula Tallal, Ph.D. and Michael Merzenich, Ph.D., et al. "Fast ForWord Training for Children
with Language-learning Problems: Results from a national field study by 35 independent
facilities." Paper presented at the 1997 annual meeting of the American Speech-LanguageHearing Association, Boston, MA., November 21, 1997.
Paula Tallal, Steve Miller, Bill Jenkins and Mike Merzenich. "The Role of Temporal Processing
in Developmental Language-Based Learning Disorders: Research and Clinical Implications." To
appear in Foundations of Reading Acquisition, Benita Blachman, Ed. Lawrence Erlbaum Assoc.,
Inc.

About the Author


Dr. Tallal is Professor and Co-Director of the Center for Molecular and Behavioral Neuroscience
at Rutgers University, Newark, New Jersey. Dr. Tallal can be reached at 197 University Ave.,
Newark, NJ 07102. (201) 648-1080. Via e-mail at tallal@axon.rutgers.edu.

Active Research Leads to Active Classrooms


by Kathie F. Nunley
from NASSP's "Principal Leadership," March 2002, pp. 53-61.

Doing your own brain-research


There is a cautious whisper circulating through the educational community that we
educators shouldn't be too quick to jump on the brain-based education bandwagon.
What we need to do is wait. Wait for neuroscientists to tell us how all this new brain
research applies to the classroom.

What educators don't realize is that neuroscientists don't know where to start. They are
not teachers. Neuroscientists are not in the classroom. They do not know the questions
we want answered. We, as educators, need to tackle our most cherished classroom
questions head-on. The technology is here. The need to know is now.

We are doing just that here in Salt Lake City. As teachers, we have teamed up with the
neuroscientists who pioneered magnetoencephalography (MEG). MEG works by
measuring the tiny magnetic fields outside the head created by the electrical activity
occurring inside the working brain. MEG allows scientists to see brain activity in both
time and space. This means that not only can we see the area of activity, but we can also see the sequence of activity. For the first time ever, we can watch the actual processing
of brain activity almost neuron by neuron.

We are seeking answers to three of education's most pressing questions - Are students'
learning style preferences visible in the way their brains process information? What are
the effects of classroom stress on learning? How do extrinsic rewards affect the learning
process?

Our work began when the two pioneers in MEG technique, William Orrison and Jeff
Lewine, brought their work to the University of Utah's research park in 1998. They
established a center for MEG clinical work called the Center for Advanced Medical
Technology (CAMT). Upon hearing about this new technology, my teaching colleague,
Gene Van Tassell, and I saw a potential opportunity to research the learning preferences
of our students. As educators we were excited about the possibility of using the MEG

technique to "see" how our students process what is presented to them. Could we watch
them actually think and learn?

To begin our project we recruited subjects from our school district through newsletters
and PTA publications. Through newspaper articles and a website, we widened our
subject search throughout the state. The adolescents, aged 13-19, are first screened for
their learning style preference. Students are administered the Dunn, Dunn, & Price
Learning Style Inventory (LSI). Although this LSI uses several categories of style, we
categorize students based on auditory and visual preferences only. After the paper and
pencil test for learning style, the subjects are sent to the CAMT at Research Park for the
MEG imaging test. During the MEG test, students are asked to perform various learning
tasks. Some tasks require them to listen to information (process auditory stimuli), other tasks ask them to look at pictures (process visual stimuli), and some tasks ask them to process both stimuli simultaneously.

After the MEG testing is done, the neuroscientists at the CAMT give us a picture of our
students' brain activity. Squiggle lines indicate electrical activity in 122 areas of the
cortex as detected by the sensors in the MEG during the testing. The activity can be
stopped at any fraction of a second in time by taking a "picture" of current activity. The
larger and more erratic the squiggles, the more activity in that particular area.
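As a rough illustration of how such recordings can be handled, the Python sketch below represents one subject's data as a channels-by-samples array and pulls out a single instant and a short window around it. The 500 Hz sampling rate, the two-second epoch, and the variance-based activity measure are assumptions made only for the example; the article does not describe the CAMT data format.

import numpy as np

# Illustrative only: one subject's recording as a (channels x samples) array --
# 122 cortical sensor channels sampled at an assumed 500 Hz for two seconds.
rng = np.random.default_rng(0)
sampling_rate_hz = 500
meg = rng.normal(size=(122, 2 * sampling_rate_hz))    # placeholder data

# "Stopping the activity" at a fraction of a second is a single column slice:
t = 0.350                                             # 350 msec into the epoch
snapshot = meg[:, int(t * sampling_rate_hz)]          # one value per channel

# A rough per-channel activity measure around that instant (larger, more
# erratic squiggles correspond to larger variance), here over a 100 msec window:
center = int(t * sampling_rate_hz)
half_window = int(0.05 * sampling_rate_hz)
activity = meg[:, center - half_window:center + half_window].var(axis=1)
print("Most active channel at 350 msec:", int(activity.argmax()))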

Preliminary results have indicated several important discoveries. Although we screened hundreds of subjects, we had to eliminate subjects with any head trauma or emotional
disturbance (such as depression). As we learned, any trauma, such as a three-year-old
falling out of a wagon and hitting his head, could mean significant plasticity in the
brain, thus distorting normal processing. So, we discovered that "normal" brains are
hard to find.

During the MEG scan, if a subject was able to process both auditory and visual stimuli
simultaneously (as shown by having electrical activity on the MEG scan in both
regions), we determined them to have no sensory preference. However, in some subjects, when presented both stimuli simultaneously, their brains only processed one stimulus. The MEG showed activity in only one sensory region, while the other region's activity was flat. These subjects were considered to have a sensory preference. For example, in one test, a 16-year-old male student was given information visually on a screen and aurally through headphones at the same time. The MEG picture showed lots of brain activity in the occipital region in the back of his cortex (visual area) but no activity whatsoever in the temporal region (auditory area). Apparently, at the instant that his brain was
receiving information from both eyes and ears, this student's brain did not process the
auditory information at all, only the visual. So the brain showed a preference for visual
information.
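A simple way to picture the decision rule just described is the short Python sketch below, which labels a subject from hypothetical per-region activity values (for example, mean signal power in the occipital and temporal sensor groups during simultaneous presentation). The threshold, field names, and numbers are illustrative assumptions, not the actual CAMT analysis.

def classify_preference(occipital_activity, temporal_activity, flat_threshold=1.0):
    # Label a subject's sensory preference from activity recorded while visual
    # and auditory stimuli were presented simultaneously. A region whose
    # activity stays below flat_threshold (arbitrary units) is treated as flat.
    visual_active = occipital_activity >= flat_threshold
    auditory_active = temporal_activity >= flat_threshold
    if visual_active and auditory_active:
        return "no preference"          # both regions processed their stimulus
    if visual_active:
        return "visual preference"      # the auditory region stayed flat
    if auditory_active:
        return "auditory preference"    # the visual region stayed flat
    return "uninterpretable"            # neither region showed clear activity

# The 16-year-old described above would come out as a visual preference:
print(classify_preference(occipital_activity=8.2, temporal_activity=0.3))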

Of the 25 subjects whose brain images have been interpreted so far, we have the
following MEG results:
10 subjects with a visual preference
1 subject with an auditory preference
14 subjects with no preference.

This suggests there may be a pre-wired sensory "preference" in some students' brains. In
some people, the brain may prefer auditory information so that it takes priority over
visual information. These may be the same type of people who have a hard time reading
while background noise is present. In this type of learner, the brain gives priority to
auditory information so it is hard to filter that out in order to concentrate on the visual
information in reading. Other students' brains show a preference for visual information.
These students may be the ones who can easily read with background noise present.
Their brains have no problem giving preference to visual information. However, these
students may have a hard time blocking out visual information in order to listen to a
lecture.

Thus far in our research, nearly half of the students' brains show a sensory preference.
Some have a stronger preference than others. Obviously, diversity exists in how fast
students are able to shift back and forth between processing visual and auditory
information to make sense out of any situation. Most of us have experience with
students who have a very difficult time processing visual or auditory information
quickly.

Another result from our project is that although these preferences vary from student to student, they do not necessarily match with their paper-and-pencil learning style test. When comparing the LSI results with the MEG results, we have the following preference counts:

LSI results: Visual (6), Auditory (6), No preference (13)
MEG results: Visual (10), Auditory (1), No preference (14)

This breakdown can be interpreted as follows: of the 10 adolescents whom the MEG showed
to have a visual preference, the LSI found 1 to have a visual learning style, 3 to have an
auditory learning style, and 6 to have no preferred learning style.

We have found no correlation between MEG sensory preference results and learning
style results as measured by the LSI. A student's LSI may show that they are auditory
learners but the MEG may indicate a visual preference or vice versa. It may be that the
brain's sensory preference is not the same thing as learning style. Learning style
generally includes social and emotional aspects of learning rather than the biology of
the brain. Paper LSI tests usually rely on students' self-report of their learning
preference. The MEG looks only at the physical brain response without regard for social
and emotional environmental preferences. So the MEG results suggest that students may
not necessarily know their brain's preference for processing sensory stimuli.
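
For readers who want to quantify that lack of correspondence, the sketch below shows one standard way to do it: compute chance-corrected agreement (Cohen's kappa) between paired MEG and LSI labels. The per-subject pairings used here are hypothetical (only the totals and the MEG-visual row were reported above), so the printed value is illustrative rather than a result of the study.

    from collections import Counter

    def cohens_kappa(labels_a, labels_b):
        """Chance-corrected agreement between two sets of categorical labels."""
        assert len(labels_a) == len(labels_b)
        n = len(labels_a)
        observed = sum(a == b for a, b in zip(labels_a, labels_b)) / n
        freq_a = Counter(labels_a)
        freq_b = Counter(labels_b)
        expected = sum((freq_a[c] / n) * (freq_b[c] / n)
                       for c in set(freq_a) | set(freq_b))
        return (observed - expected) / (1 - expected)

    # Hypothetical paired labels, consistent with the reported marginal totals.
    meg = ["visual"] * 10 + ["auditory"] + ["none"] * 14
    lsi = (["visual"] * 1 + ["auditory"] * 3 + ["none"] * 6     # reported MEG-visual row
           + ["none"]                                           # assumed for the MEG-auditory subject
           + ["visual"] * 5 + ["auditory"] * 3 + ["none"] * 6)  # assumed split for MEG no-preference
    print(round(cohens_kappa(meg, lsi), 2))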

Before this kind of research and these brain-imaging techniques became available,
educators had to rely on anecdotal information about how students learn. This no longer
needs to be the case. We now have physical evidence of diversity in how students learn.
Neuroscientists are looking for more areas in which to apply their techniques, and
education is an excellent area for application. However, progress requires that educators
take an active role in the research process. Following any research in brain imaging,
educators must take the findings back to the classroom for practical application.

Applying the research to your own classroom


How have we applied these latest MEG sensory preference findings? First, by thinking
about how classrooms present information in both visual and auditory forms. Unless
students have their eyes shut during a lecture, they are receiving sensory information
through both senses. In students whose brains "prefer" visual stimuli, the information
coming from their eyes may mask the information coming from their ears. So a lecture
may be weakened by extraneous visual stimuli around the room or strengthened through
visual displays pertaining to the lecture.

Because self-reports may not be valid and school systems do not have MEG machines
available for teacher use, assessing each student's learning preference does not appear to
be practical. Therefore, teachers must make sure that instructional materials are available
for every type of learner who may be in the room. We have realized that the "my way or no
way" type of teaching will not work in a general, mixed-ability classroom. Traditionally,
many teachers thought the problem was that students just needed to try harder. It appears
that "trying harder" is the answer not for students, but for teachers.

We need to try harder to accommodate the diversity of our students' learning
preferences. Most of us have known for years that there are no regular students in
regular education. Therefore, the movement toward whole-class curriculum
modification appears to be an answer for teaching in a heterogeneous classroom.

Based on what education has extracted from brain research, and supported by our
current project, I developed one such whole-class curriculum method, which I call Layered
Curriculum. I call it layered because it divides each unit of study into three layers: A, B,
and C. In my classroom, students choose from a variety of assignments, a variety of
textbooks, and a variety of hands-on materials.

The bottom layer, called the "C" layer, allows students to collect information on a topic
from a variety of student-chosen materials. They pick and choose from approximately 20
assignment choices, all worth varying points. Assignments include videos, bookwork
from a variety of texts, magazine articles, posters, models, flashcards, and computer
work. Now if José learns best from hands-on models, and Sara learns best through
reading, both students can learn in their preferred way.

All grading or assessment at this layer is done through oral defense. Every assignment,
whether bookwork, flashcards, videos, posters, models, or computer work, has an oral
quiz, one-on-one between teacher and student. I can move quickly around my room
during every class period and spend a few minutes with each student to check for
comprehension, correct errors in their thinking, and help direct their individual learning.
I get personal face time with every student every day. The students get individualized
help on student-chosen assignments. (For more information on oral defense, see my
article "In Defense of the Oral Defense" in the February 2000 issue of ASCD's Classroom
Leadership.)

The middle layer, called the "B" layer in Layered Curriculum, asks students to apply
what they have learned in the "C" layer. Here again, students are given choices in how to
apply, create, or discover more information, but this time the work is of their own design.
In my biology classroom this is done by providing questions that students must answer
through a lab of their own design. I give several questions to choose from, and they must
find the answer to one.

The top or "A" layer requires a critical analysis of a topic in the unit. Students must
research one of several topics, summarize their research, and form an opinion on the
issue. I list several controversial topics from which they choose one.

Grades are based on how successfully students complete the C, B, and A layers. Grading
criteria for each type of assignment are posted on the walls of the room so that students
are clear on expectations ahead of time. It is a completely student-centered environment:
students are in control of their own learning and responsible for their grade.
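
As a rough illustration of the bookkeeping involved (not the grading scale actually used in my classroom), the short sketch below shows how a layer-based grade might be computed: the C layer is earned through enough orally defended point work, the B layer through the student-designed application, and the A layer through the critical analysis. The point cutoff and the function name are assumptions.

    # Illustrative only: the 65-point cutoff and the structure are assumptions.
    def unit_grade(c_layer_points, b_layer_done, a_layer_done, c_cutoff=65):
        """Return a letter grade for one unit of a layered curriculum."""
        if c_layer_points < c_cutoff:
            return "F"        # not enough orally defended C-layer work
        if not b_layer_done:
            return "C"        # C-layer work only
        if not a_layer_done:
            return "B"        # C layer plus the student-designed application
        return "A"            # all three layers completed

    # Example: enough C-layer points and a completed B-layer lab, no A-layer analysis.
    print(unit_grade(c_layer_points=78, b_layer_done=True, a_layer_done=False))  # -> B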

I have used Layered Curriculum in my classroom for several years now, and it has
proved to be a very effective way to personalize instruction. Two years ago I taught three
periods of general biology using Layered Curriculum and three periods with my old
teacher-centered method based on the textbook. The Layered Curriculum periods had
less than half as many student failures as the other periods. Aside from reducing
the number of failures in my general biology classroom, Layered Curriculum
dramatically increased the number of students on task in all my general-level classes.
Several teachers in my school, and now several schools around the county, have
implemented Layered Curriculum in a variety of subjects and have reported similar
results. Teachers continue to use it for three main reasons: it reduces the number of
student failures, it increases student involvement (time on task), and it reduces classroom
management problems.

Educational leadership today means educational research


Our research continues today, both in the classroom and at the MEG facility. As we
finish our current focus on learning preferences, we want to examine our other questions
regarding stress and extrinsic rewards. Being part of a unique team of educators and
neuroscientists has energized my passion for individualized education in the classroom.

One thing all the research seems clear on: students are all different. Not just on the
outside, but on the inside as well, including how their brains process the information we
present to them. The more we learn, the more we realize that classroom instruction must
be individualized. Information needs to be presented in a variety of ways in order to
ensure that every student has an equal opportunity for success.

The research is still not clear on many issues. What is the ideal classroom environment
for learning? What effect do the popular punishment-based classroom management
programs have on the learning climate and student violence? What can we do to further
facilitate learning in all students?

We cannot wait for neuroscientists to tell us the answers. We must join with them to
create teams using the latest technologies to improve the lives of students through
active, practical research that can be applied back to our classrooms.

More information on the MEG/education research project can be found at: http://Brains.org
More information on Layered Curriculum can be found at: http://Help4Teachers.com

Kathie F. Nunley is an educational psychologist, author, researcher, and speaker living in
southern New Hampshire. Developer of the Layered Curriculum method of instruction,
Dr. Nunley has authored several books and articles on teaching in mixed-ability
classrooms and other problems facing today's teachers. Full references and additional
teaching and parenting tips are available at http://Help4Teachers.com. Email her:
Kathie (*at*) brains.org

Rolling Waves and Arctic Icecaps in the Sleeping Brain: Oscillation States Switch
by Chengyu Li, Ph.D.
Can you imagine crashing waves freezing instantly into Arctic icecaps within your
sleeping brain? Well, maybe not, but from the question you might be able to infer that
the sleeping brain is far from tranquil. Its neural activity switches back and forth
between states of crashing waves and frozen icecaps. This analogy may be confusing, so
allow me to walk you through the brain's ocean waves first.
When researchers first discovered how to measure electrical signals from the scalp, using
electroencephalogram (EEG) recording, people immediately noticed that the sleeping
brain is not resting at all, at least in terms of brain activity. When a human or other
mammal falls asleep, electrodes recording from the skull's surface begin to show
what is termed "slow-wave" activity. Large in amplitude and low in frequency (hence the
name "slow wave"), this type of activity can be seen over a large part of the skull. As
one falls deeper into sleep, slow-wave activity appears more frequently, up to about one
cycle per second (1 Hz). We call this part of sleep "slow-wave sleep." Because the slow
wave is characterized by highly correlated neural activity, it is also referred to as
the "synchronized state."
However, around 90 minutes into the sleep cycle, dramatic events occur: the brain
freezes, at least in terms of large-scale oscillations, and the big, slow waves over the
entire brain are nowhere to be found. Instead, high-frequency, small-amplitude activity
prevails, with greatly reduced correlation. This phase is appropriately referred to as the
"desynchronized state." In this phase, our eyes move rapidly back and forth in what
scientists call "rapid eye movement" (REM) sleep. Most of our dreams happen during
REM sleep and, in adults, this activity occupies about 20-25% of the total duration of
sleep (or about 90-120 minutes per night). REM sleep occurs over approximately four or
five cycles throughout the night, with each cycle being slightly longer than the previous one.

Slow wave oscillation and memory consolidation
Why do we care whether the brain is in a slow-wave or REM state? One reason is that the
control of sleep stages is central to narcolepsy, a strange sleep disorder that affects
about one out of every 1,000 people in the United States. Why is it important for the
other 999 people as well? The answer to this question is actually one of the most active
research areas in neuroscience: slow-wave oscillation appears to contribute to memory
consolidation. That is, without slow-wave sleep, memories don't get "burned" into your
brain.
We typically feel best after a good night's sleep, so you may not be surprised to know
that sleep can improve results on memory tests. Indeed, there is a long list of studies
demonstrating the importance of sleep, especially slow-wave sleep, to memory
consolidation. You may be surprised to learn, however, that applying current to your
forehead at the frequency of the slow-wave oscillation can selectively improve performance
on so-called declarative memory tasks, where you are required to answer "fact"-type
questions.
An amazing study demonstrating this phenomenon was done by Marshall, Helgadottir,
Molle, and Born at Germany's University of Lubeck in 2006. The researchers asked
human subjects (13 lucky medical students) to remember the English translations of
some German words right before going to sleep. During the first phase of slow-wave
sleep, the experimenters induced slow-wave activity by applying oscillating currents to
the forehead at the right frequency (0.75 Hz). The next morning, after the subjects
woke up, they were asked to recall the words they had learned the previous day. The
students who had the currents applied to their foreheads were able to remember more.
The authors went on to show that applying current during the REM sleep phase did not
improve memory performance. Also, applying the wrong frequency (for example, 5 Hz,
which is normally seen during REM sleep) didn't have the same effect.
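
To see what those two frequencies look like as signals, here is a small sketch that simply generates a 0.75 Hz and a 5 Hz oscillation. The amplitude, duration, and sampling rate are arbitrary illustrations; this shows the waveform shapes only, not the Marshall et al. stimulation protocol.

    import numpy as np

    fs = 250                                  # samples per second (arbitrary)
    t = np.arange(0, 10, 1 / fs)              # a 10-second window
    slow = np.sin(2 * np.pi * 0.75 * t)       # 0.75 Hz: the slow-wave frequency that worked
    rem_like = np.sin(2 * np.pi * 5.0 * t)    # 5 Hz: the REM-like frequency that did not

    # In 10 seconds the 0.75 Hz waveform completes 7.5 cycles; the 5 Hz one completes 50.
    print(0.75 * 10, 5.0 * 10)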
Clearly this finding raises many questions about both theory and practice. Theoretically,
it is one of the best examples showing that distinct brain oscillations have significant
effects on specific brain functions. Equally important, it provides new approaches to
answering the question of how our brains consolidate memories. Practically, I can almost
imagine that some people will start to apply currents to their heads right before a final
exam or other important memory test! Maybe we will soon see some "MemoImprov"
devices selling on eBay. I guess you need to make sure that you don't crank up the
potential too much and fry your brain!
OK, but why am I interested in this topic? One of my own research projects looks at
synaptic learning rules, which are important for memory, in different oscillation states.

How are slow-wave and REM brain oscillatory states generated?
Hopefully I have persuaded you that slow-wave vs. REM brain states do make a
difference. If so, the logical next step is to learn how the different brain oscillation
states are generated, maintained, and switched between. Only after we understand these
processes can we start to answer the theoretical and practical questions mentioned above.
Oscillation is everywhere: clock ticks, the light we see, the sound we hear, and even our
daily life experiences are all oscillations. To me, the ocean analogy can still help to
illustrate brain oscillation. Water and ice are both made of H2O but exist in two totally
different states. The difference between liquid water and solid ice is determined by
certain parameters, like temperature and pressure. When H2O is in the liquid state, as on
a beach in Hawaii, crashing waves can form. However, on an ice cap in the Arctic, the
ocean is covered by solid-state H2O, which only allows small vibrations of the ice.
Now, the same brain consisting of the same cells can exist in different states, which are
as distinct as water and ice (both made of H2O). The basic building block of the brain is
the neuron, which uses electrical and chemical signals to carry information. In each
wave of the slow-wave oscillation, neurons elevate their activity in a plateau form, termed
the "UP" state. Just as water molecules influence nearby molecules through physical
forces, neurons can change the activation level of nearby neurons through specific
connections called synapses. The waves can propagate across large regions of the brain
through the collective action of many neurons acting via synapses. Closely after the UP
state, however, comes a silent phase, called the "DOWN" state. Each neuron swings only
about 20 millivolts (a millivolt is one thousandth of a volt) between the UP and DOWN
states. That may sound small, but because nearly all of the millions of neurons in the
brain take part in the oscillation between UP and DOWN states, the summation of the
UP/DOWN states can be measured through the skull as EEG signals (with each state
being very distinct from the other).
However, in REM sleep, all the neurons essentially stay in the same activation state,
namely the UP state; one cannot find any large-scale propagation of activity or cycling
between UP/DOWN states. There is still high frequency activity, usually in a local area,
but this is drastically different from the global, synchronized slow-wave.
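
The sketch below, under purely illustrative assumptions (a 1 Hz rhythm, a 20 mV UP/DOWN swing, 1,000 idealized neurons), shows why this matters for what the electrodes see: when neurons cycle through UP and DOWN states nearly in phase, their summed signal swings widely like a slow wave, whereas when they all sit in the UP state with small independent fluctuations, the sum is almost flat.

    import numpy as np

    rng = np.random.default_rng(0)
    fs, seconds, n_neurons = 200, 4, 1000
    t = np.arange(0, seconds, 1 / fs)

    def slow_wave_neuron():
        # Synchronized cycling between a 20 mV UP plateau and a 0 mV DOWN state,
        # with only a little phase jitter from neuron to neuron.
        jitter = rng.uniform(0, 0.05)
        return 20.0 * (np.sin(2 * np.pi * 1.0 * (t + jitter)) > 0)

    def rem_like_neuron():
        # Stays in the UP state, with small, uncorrelated high-frequency wiggles.
        return 20.0 + rng.normal(0, 2.0, t.size)

    slow_sum = sum(slow_wave_neuron() for _ in range(n_neurons)) / n_neurons
    rem_sum = sum(rem_like_neuron() for _ in range(n_neurons)) / n_neurons

    # The synchronized sum swings widely; the desynchronized sum is nearly flat.
    print(round(slow_sum.std(), 2), round(rem_sum.std(), 2))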
Which state your brain is in largely depends on chemicals called neurotransmitters that
are secreted by neurons. These neurotransmitters change the activity of neurons directly
and subsequently cause neurons to influence one another differently. A large amount of
research has been published on this topic, so it would be difficult to provide a detailed
background here. Instead, I will describe one informative study about the control of brain
state, as I did for slow-wave oscillation and memory consolidation above.
The study was carried out by Lu, Sherman, Devor, and Saper at Harvard Medical School
and the Hebrew University in Israel. In 2006 they found a flip-flop switch in a brain
region known as the brainstem, which lies near the back of the brain, just above the neck.
The researchers found two tightly coupled groups of neurons controlling the oscillations
between slow-wave and REM sleep, which they call the "REM-on" and "REM-off" areas.
These two regions exert mostly inhibitory influence on one another. The REM-on area
also contains excitatory neurons that influence other brain regions regulating the EEG of
REM sleep, as well as brain regions controlling muscle activity during REM sleep. The
mutually inhibitory interactions of the REM-on and REM-off areas may act like a
flip-flop switch that sharpens state transitions.
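
A toy model can show why mutual inhibition behaves like a flip-flop. The sketch below is not Lu and colleagues' model; the equations, weights, and time constants are illustrative assumptions. Two populations, "REM-on" and "REM-off," each suppress the other, so a small, slowly drifting imbalance in their drive produces abrupt all-or-none switches rather than a gradual blend.

    import numpy as np

    def simulate(steps=4000, dt=0.01, w_inhib=6.0, gain=4.0):
        rem_on, rem_off = 0.1, 0.9                   # activity levels in [0, 1]
        history = []
        for i in range(steps):
            drive = np.sin(2 * np.pi * i / steps)    # slow drift in external drive
            # Each population is pushed up by its own drive and down by the other's activity.
            target_on = 1 / (1 + np.exp(-gain * (drive - w_inhib * rem_off)))
            target_off = 1 / (1 + np.exp(-gain * (-drive - w_inhib * rem_on)))
            rem_on += dt * (target_on - rem_on)
            rem_off += dt * (target_off - rem_off)
            history.append(rem_on)
        return np.array(history)

    activity = simulate()
    # REM-on activity spends most of the run near 0 or near 1, switching abruptly.
    print(round(activity.min(), 2), round(activity.max(), 2))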
Such tight mutual control of the REM-on and REM-off areas is why the finding is so
important. It means that even a tiny imbalance between those two areas could switch the
entire brain state from one to the other.

Why slow-wave and REM? Why sleep at all?
Although we now know a lot about brain oscillations in sleep, pressing questions remain
unanswered. In slow-wave sleep, most neurons participate in the slow-wave oscillations.
In REM sleep, most neurons stay in the UP state. But why? We don't need to respond to
the outside world during sleep, so why waste precious energy on an UP state?
Furthermore, we all spend about one third of life asleep. Is that as big a waste of time as
it seems? Even worse, sleep seems dangerous, since we are basically helpless to respond
to threats; when we lived in the wild, this could be fatal. Yet despite all these hard
questions, sleep is evolutionarily conserved throughout the animal kingdom, so much so
that even a tiny roundworm (C. elegans) has been found to sleep from time to time. To
me, sleep and its oscillation states must serve important functions in the success of
animals; we just need to find them. Memory consolidation is a good one, but we will need
to know more in order to sleep without the guilt of wasting another night. Until then,
let's continue to enjoy the comfort of a good night's sleep.

Chengyu Li has a Ph.D. in neuroscience from the Institute of Neuroscience, Chinese
Academy of Sciences, Shanghai. Li has done postdoctoral work at the University of
California, Berkeley, and the Howard Hughes Medical Institute. Li's research is partly
funded by the Temporal Dynamics of Learning Center (TDLC). Li's main research
interests are synaptic plasticity, learning, memory, and brain states.
