Review
Two Steps Forward, LLC, 3199 Bookham Dr., Sun Prairie, WI 53590, United States
Department of Social and Behavioral Sciences, Florida Gulf Coast University, 10501 FGCU Blvd South, Fort Myers, FL 33965-6565, United States
Department of Veterinary and Comparative Anatomy, College of Veterinary Medicine, Washington State University, Pullman, WA 99164-6520, United States
Article info
Article history:
Received 4 September 2010
Received in revised form 23 April 2011
Accepted 3 May 2011
Keywords:
Affective sounds
Emotion
EEG
Alpha blocking
Theta
Gender differences
Auditory perception
Event-related cortical desynchronizations and synchronizations
Abstract
At present there is no direct brain measure of basic emotional dynamics from the human brain. EEG provides non-invasive approaches for monitoring brain electrical activity in response to emotional stimuli. Event-related desynchronization/synchronization (ERD/ERS) analysis, based on power shifts in specific frequency bands, has some potential as a method for differentiating responses to basic emotions as measured during brief presentations of affective stimuli. Although there appears to be fairly consistent theta ERS in frontal regions of the brain during the earliest phases of processing affective auditory stimuli, the patterns do not readily distinguish between specific emotions. To date it has not been possible to consistently differentiate brain responses to emotion-specific affective states or stimuli, and some evidence suggests that the theta ERS more likely measures general arousal processes rather than yielding veridical indices of specific emotional states. Perhaps cortical EEG patterns will never be able to be used to distinguish discrete emotional states from the surface of the brain. The implications and limitations of such approaches for understanding human emotions are discussed.
© 2011 Elsevier Ltd. All rights reserved.
Contents

1. Introduction ............ 1960
2. Electrophysiological correlates of brain affective processes ............ 1960
3. Classifications of emotions ............ 1960
4. Affective communication versus induction of emotion ............ 1961
5. Measuring affective communication in the brain ............ 1961
   5.1. Spectral power change ............ 1961
   5.2. Event related potentials ............ 1962
   5.3. The ERD/ERS algorithm ............ 1962
6. Summary of EEG studies and emotions ............ 1963
7. ERD methodology and specific emotion evocation methods: from the production, selection and description of stimuli and experimental procedures to the results ............ 1965
   7.1. Production of initial set of stimuli ............ 1965
   7.2. Selection of stimuli ............ 1965
   7.3. Description of stimuli ............ 1965
   7.4. Participants ............ 1965
   7.5. Procedures ............ 1965
   7.6. Data analysis ............ 1965
   7.7. Results ............ 1966
        7.7.1. Changes in emotional states ............ 1966
        7.7.2. ERD/ERS analysis: power spectral analysis ............ 1966
        7.7.3. ERD/ERS analysis—theta (3.5–7.5 Hz) ............ 1966
        7.7.4. ERD/ERS analysis—alpha (8–12 Hz) ............ 1967
        7.7.5. ERD/ERS analysis—beta (13–30 Hz) ............ 1967
   7.8. Discussion ............ 1967
8. Theoretical implications for future work ............ 1967
9. Implications for the eventual understanding of affective consciousness ............ 1968
   Acknowledgements ............ 1969
   References ............ 1969

⁎ Corresponding author.
E-mail address: 2steps@charter.net (M.Y.V. Bekkedal).
0149-7634/$ – see front matter © 2011 Elsevier Ltd. All rights reserved.
doi:10.1016/j.neubiorev.2011.05.001
1. Introduction
An understanding of the underlying brain dynamics that generate emotional states, especially those that mediate cortical affective information- and state-processing, remains a chaotic area of neuroscience. If one were able to identify emotional feelings and the related cognitive processing of associated stimuli from direct cortical neural recordings, it would have broad implications, including the potential to improve our approaches for diagnosing and treating neuropsychiatric disorders whose symptoms include dysfunctional processing of emotional information, such as autism, depression, mania, and schizophrenia. The aspiration to perfect such neurometric measures has a long history (e.g., John et al., 1988), albeit without any explicit consideration of the underlying emotional issues.
If we could obtain knowledge about the specificity of EEG response patterns in the brain related to primary-process emotions, such information might serve as a standard for more accurately identifying atypical brain activities. The emergence of the field of affective neuroscience has opened opportunities for applying scientific rigor and objectivity to the study of the neural substrates of specific human emotions (Panksepp, 1998), but so far only at a subcortical level, which is difficult to extend to human studies except at homologous neurochemical levels (Panksepp, 1986, 1998, 2004). The present paper summarizes the state of the field from an affective neuroscience point of view, rather than an information-processing cognitive perspective, and summarizes some of our human EEG work that sought to illuminate such issues.
2. Electrophysiological correlates of brain affective
processes
Basic emotional feelings reflect evolutionarily ancient neurodynamics that have yet to be reliably measured from either human or animal brains. In human research, emotional states are estimated largely from subjective self-reports, and in animals from ethologically distinct emotional behavior patterns that arise from neural circuits with distinct affective qualities (Panksepp, 1998). Part of the problem is that emotional feelings probably reflect large-scale brain network dynamics or emotional action systems (Panksepp and Biven, 2011), for which we simply do not yet have good tools that can directly monitor the relevant subcortical circuit activities (Panksepp, 2000). We do have various impressive indirect measures, especially fMRI (functional magnetic resonance imaging) and PET (positron emission tomography), that can estimate energy-consumption changes in the brain, albeit those statistically extracted indirect measures only reflect a very small fraction of ongoing brain activity, with most of it remaining as if it were background "dark energy" that does not serve immediate information-processing capacities, but rather the intrinsic needs of the brain (Raichle, 2010a,b). It is thought that major problems with brain functions, as evidenced by various neurological and psychiatric disorders, may arise from imbalances of this dark energy of the brain (Zhang and Raichle, 2010). In any event, such measures that can identify brain loci that participate in various psycho-
System (BAS) (Bijttebier et al., 2009). Focus on these systems provides a dichotomy for defining emotions based on whether they are aversive and lead to avoidance behaviors, or are appetitive and result in approach behaviors. At first this may appear to be simply a more refined classification of positive versus negative affect. However, this perspective considers more than just the emotional experience; it also considers the corresponding behaviors emitted during those experiences, as well as overall personality style (Corr, 2004).
While there is some utility in this classification scheme, recent evidence challenges the clear association of the BIS with negative emotions and the BAS with positive emotions. The issue centers on the emotions of fear and anger, both of which have traditionally been classified as negative. While fear seemingly fits the categorization of negative emotions associated with activation of the BIS and resulting defensive behaviors, anger can also involve appetitive arousal and approach behaviors associated with BAS activation (Balconi and Mazza, 2009; Carver and Harmon-Jones, 2009). Indeed, EEG response patterns support the latter characterization. Balconi and Pozzoli (2009) report changes in activation of the frontal lobes during pictures of anger that more closely match what one might expect from activation of the BAS. In the same report, responses for fear were consistent with activation of the BIS. Thus, it would seem feasible that although the two emotions resemble each other on both arousal and valence dimensions, their activation of underlying behavioral response circuits could result in differential patterns of activity at the cortical level.
Although there are some studies (e.g., Balconi et al., 2009a,b) where the research design is conducive to the differentiation of individual emotions, far more research utilizes the broader positive/negative categorizations. As such, most studies examine only three stimulus types: one of negative affect, one of positive affect, and an emotionally neutral stimulus. Given the paucity of research that makes direct comparisons within a single affective valence category, the question remains whether individual emotions, especially those revealed by localized subcortical stimulations (Panksepp, 1998), can be delineated from one another based on neurophysiological response patterns.
these types of studies have been either visual (Aftanas et al., 2001, 2002; Balconi and Lucchiari, 2006; Balconi and Mazza, 2009; Balconi and Pozzoli, 2009; Balconi et al., 2009a,b; Knyazev et al., 2008, 2009, 2010) or auditory (Paulmann and Kotz, 2008; Spreckelmeyer et al., 2009). We suspect that the use of visual stimuli adds an element of irrelevant stimulus complexity that might be minimized with the use of auditory stimuli, such as short human, primate-like vocalizations of various distinct emotions.
While it is recognized that detection and interpretation of affect in the human voice is a complex issue (Banse and Scherer, 1996; Zinken et al., 2008), the use of stimuli of short duration and free of linguistic content can minimize the non-affective components of stimuli (i.e., noise) and provide a purer representation of emotion. Nonlinguistic components of vocal information play a significant role in communicating affective information (Scherer, 1994). Research shows humans can accurately identify emotional meaning from prosodic features, and this differentiation holds for specific emotions, not simply positive versus negative valences, and is not based solely on levels of arousal (Banse and Scherer, 1996). Thus, such stimuli might be optimal for use in attempts to assess differential brain response patterns during the perception of emotion.
5. Measuring affective communication in the brain
If one aims to monitor brain activity during emotional states and make comparisons across different states, it is reasonable to employ measurement tools with relatively low temporal resolution, since emotions are sustained responses of the brain. In other words, it is not imperative that immediate responses be measured, or that assessments involve small (fractions of a second) increments of time. To assess the brain's immediate responses to affective stimuli, however, a measurement tool that actually measures neural activity in real time is desirable, as compared to indirect measures such as fMRI and PET. For routine human research, such instruments are limited to the electroencephalogram (EEG) and the magnetoencephalogram (MEG).
The disadvantage of the EEG is its poor spatial resolution at the cortical level, and the difficulty of extrapolating from cortical activity to what is happening among the subcortical processes that more directly generate affective states. MEG addresses these issues to some extent; however, it has been used in only a small number of studies related to the perception of emotion. In two studies using repetitive transcranial magnetic stimulation to differentiate brain responses related to prosody, brain responses to two negative emotions (fear and sadness) did not significantly differ from one another (Hoekert et al., 2008, 2010). Similar results have been reported from a third study (van Rijn et al., 2005); however, response patterns for sadness and fear differed from those for an emotion of the same valence (anger) as well as from responses to happiness.
In contrast to the small number of studies using MEG, the EEG has been used much more extensively. Electroencephalography is a relatively non-invasive approach for measuring brain responses to discrete stimuli. Although a comparatively old technique for studying brain activity, advances in electronics and computer technology have allowed increasingly innovative approaches to analyzing patterns of electrical activity measured in an EEG. Approaches for evaluating changes in such electrical brain activity fall into three main categories: (i) overall spectral power changes, (ii) event-related potentials (ERP), and (iii) event-related desynchronizations/synchronizations (ERD/ERS).
5.1. Spectral power change
One traditional approach has been to evaluate the overall power
within a given frequency band under different affect conditions.
tion occurs in the implicit testing condition, and a later theta ERS is present during the explicit identification of emotions in pictures of facial expressions. It was proposed that these reflect unconscious versus conscious processing of the emotion, with the implicit/passive condition being the one of interest for our effort to identify instinctual affective responses.
Unfortunately, as with other similar efforts using the EEG, this line of research has failed to clearly demonstrate distinct response patterns for individual emotions. A commonality among all of these studies of affect-related ERD/ERS is their universal use of visual stimuli to communicate emotion. More specifically, they used pictures from the standardized International Affective Picture System (Center for the Study of Emotion and Attention, 2001), or pictures of facial expressions from the Ekman and Friesen (1976) set. As already noted, all visual stimuli tend to contain information that is irrelevant to the emotionality of the pictures used. Thus, we propose that specific emotion-related brain activity changes may be better detected when the stimuli are presented through the auditory modality. Short, nonlinguistic vocal sounds of emotion may be less complex than visual portrayals, which require visual scanning and interpretation of the stimuli, and thus may reduce confounding variables introduced by the more complex cognitive processing necessary to interpret emotional significance. Furthermore, evidence suggests that a person's accuracy in classifying an emotion depends on whether the emotion is communicated in a facial expression versus a nonlinguistic vocalization (Hawk et al., 2009). Accordingly, after summarizing additional EEG approaches to emotion studies, we will proceed to present results from a project using human vocalizations that provided an opportunity to evaluate consistencies in cortical neural response patterns related to affect, independent of the confounding sources of variability that are routinely present when visual sensory input to the brain is used.
Fig. 1. Cortical regional differences in EEG alpha power shifts (within the full 8–13 Hz range) monitored from the cranial surface of one subject (Jaak Panksepp) during the first 4-s periods after successive emotional inductions using Clynes's Sentic Cycles procedure (see Clynes (1977) as well as Panksepp and Clynes (1998)). Each emotional dynamic was repeated 20 times, using the timing recommendations for the Sentic Cycles exercise. The sequence of emotions was, left to right: (i) neutral expression, (ii) anger, (iii) sadness, (iv) love/lust, (v) joy and (vi) reverence. Four successive 1-s epochs (top to bottom) are depicted following initiation of the experiential emotional dynamic, assisted by bilateral mild dynamic pressure between index finger and thumb, in a seated meditative position. A successive quarter-second analysis (as a pseudo-video) of each 4-s period, broken down into 2-Hz bands, is available on the web site provided below. It was remarkable how dramatically power shifts in these fractional alpha-frequency ranges differed from each other, highlighting the complexity that has to be faced when doing such analyses (see http://www.vetmed.wsu.edu/research vcapp/Panksepp-endowed.aspx).
systems (Panksepp, 2000). Yet the EEG has relatively poor spatial resolution at the cortical level for picking up discrete network activities, and at best one can only infer localized subcortical activity changes, even though that is where most of the affective action may be occurring (Panksepp, 1998).
Obviously, electrical brain activity measurement at the subcortical level is invasive, and thus not practical for routine human research. It is possible to attempt parallel types of investigations using simultaneous cortical and subcortical recordings in animal models; however, the ability to clearly delineate discrete emotional stimuli, affective states and corresponding emotional responses in these models, in time-locked ways, remains a challenge. We do suspect that the use of emotional vocalizations, as in the experiment to be described, is the optimal way to proceed, since the same methods could be used in both human and animal research. In anticipation of such possibilities, in the following experiment we attempted to first apply this strategy to the human brain, to determine if distinct emotion-specific signatures of brain changes could be obtained that could then be translated to animal models for finer analyses.
Of course, other brain imaging techniques, such as fMRI and PET, have much greater spatial resolution and allow for the visualization of simultaneous activity throughout the entire brain. However, their relatively poor temporal resolution makes them less ideal as tools for measuring the rapid-onset, short-duration responses to affective communications. Additionally, their routine use in cross-species affective neuroscience research is not yet readily available.
Before conceding to the possibility that the EEG cannot be used to delineate various discrete primary-process emotional responses, we proceeded to design the simplest possible experiment we could
The subjects were fitted with a standard recording cap (Electro-Cap International) with 19 electrodes systematically placed according to the 10/20 system (Jasper, 1958), with linked earlobes as the reference. Each electrode was filled with electrically conductive gel (Electro-gel, Electro-Cap International), and the scalp was abraded until impedances at each electrode were <10 kΩ. Subjects were fitted with a pair of monaural headphones placed over the cap, and were seated in a comfortable chair located inside a dimly lit, sound-attenuated audiology testing booth. They were instructed to remain seated with eyes closed and relax in order to minimize the effects of muscle movement.
Each session began with 3 min of silence for a baseline recording, followed by the presentation of five different pure tones, each repeated 4–5 times throughout the next 3 min with an interstimulus interval of 10 s. Then one group of emotion sounds was presented such that each of the five different vocalizations for that emotion set was presented 4–5 times during the next 3 min with an interstimulus interval of 10 s. This was followed by another 2-min presentation of the pure tones. Following the second pure-tone set, data recording was paused while the subject was asked to verbally label the emotion represented in the majority of the most recent presentation of vocalizations, and to provide an average intensity rating for those sounds. This pattern of presentation was repeated for the remaining three emotion subsets. The order of emotion subsets was arbitrarily chosen for each subject such that a set of positive vocalizations was always followed by a subset of negative vocalizations, or vice versa. Following the recording session, subjects were asked to describe any changes in their emotions that may have occurred during testing by identifying the emotion experienced and rating its intensity. These questions were meant to determine if the sounds evoked a lasting change in the participant's emotional state.
7.6. Data analysis
Muscle artifacts in the EEG records were manually identified and removed from further analysis. Data were further reduced with a 1–50 Hz filter to eliminate all frequencies outside of that range. The data were Fast Fourier transformed, and power spectra were calculated for each of the 3-min presentations of the different emotion-related stimuli. Values were obtained for theta (3.5–7.5 Hz), alpha (8–12 Hz) and beta (13–30 Hz). Second, ERD/ERS analysis was employed to determine stimulus-related changes in the power of the same three frequency ranges. The percent change in each frequency band was calculated as ((R − A)/R) × 100, where R and A are the EEG power of the specified frequency in the reference and test intervals, respectively (Pfurtscheller and Aranibar, 1977). In this study, the reference interval was identified as the 1 s prior to stimulus onset, and the test interval as 0–1 s from stimulus onset. The averaging sweep was 0.5 s, such that the test interval was divided into two equal time increments, 0–0.5 s and 0.5–1 s. As nearly all responses had returned to baseline by the 0.5–1 s interval, only the first epoch (0–0.5 s) was included for further analysis.
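As a concrete illustration, the band-power and percent-change computations described above can be sketched in Python. The function names and the plain-FFT power estimate are illustrative assumptions for this sketch, not details reported in the study:

```python
import numpy as np

# Frequency bands used in the study (Hz).
THETA, ALPHA, BETA = (3.5, 7.5), (8.0, 12.0), (13.0, 30.0)

def band_power(epoch, fs, lo, hi):
    """Mean FFT power of a 1-D EEG epoch within a frequency band (Hz)."""
    freqs = np.fft.rfftfreq(epoch.size, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(epoch)) ** 2
    mask = (freqs >= lo) & (freqs <= hi)
    return psd[mask].mean()

def erd_ers_percent(reference_power, test_power):
    """Percent power change relative to the pre-stimulus reference,
    ((R - A) / R) * 100 (Pfurtscheller and Aranibar, 1977).
    Positive values indicate desynchronization (ERD, a power drop);
    negative values indicate synchronization (ERS, a power rise)."""
    r = np.asarray(reference_power, dtype=float)
    a = np.asarray(test_power, dtype=float)
    return (r - a) / r * 100.0
```

Note that the figures in this paper plot the complementary quantity, percent of reference power (A/R × 100), so an ERS that raises theta power by 12% over the reference interval appears there as roughly 112%.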
For the purpose of a global analysis, data from individual electrodes were averaged together such that the scalp was divided into quadrants and three electrodes from each quadrant were averaged. Specifically, scores for the right anterior quadrant included the Fp2, F4 and F8 electrode sites. Sites Fp1, F3 and F7 were used for the left anterior. Electrodes P4, T6 and O2 made up the right posterior and
Fig. 2. Changes in theta power based on percent of baseline power in response to different stimuli. Results are averaged across electrodes in each of four quadrants of the scalp (Anterior Left, Anterior Right, Posterior Left, Posterior Right; emotion types Neutral, Joy, Pleasure, Anger, Sad). The figure demonstrates the greatest theta synchronization in the anterior hemisphere and in response to sounds of anger and pleasure. Filled bars are for males and open bars are for females.
7.7. Results
showed less of a theta power increase than males. This effect was statistically reliable in the right anterior region only.
Some differences were found in response to emotional sounds within genders (Fig. 3). In males, moderate ERDs occurred during joy and anger sounds. This change was significantly different from the ERS observed during pleasure sounds (ps < .05). Responses to neutral sounds were modest ERSs that did not differ from responses to any of the other sounds. In females, all responses were theta ERSs. The ERSs for joy and pleasure were reliably smaller than those observed during anger sounds (ps < .05). As in males, female responses to neutral and sad sounds did not reliably differ from the brain changes during the other emotional sounds.
A two-way interaction was found for brain regions × time (ps < .0001), where theta power increased in the 0.0–0.5 s epoch and decreased in the 0.5–1.0 s epoch. This pattern was the same in all brain regions; however, the magnitude of the changes was greater in the anterior regions in comparison to the posterior ones (p < .005). These differences were not observed during the 0.5–1.0 s epoch. The statistical main effect for time (p < .0001) was due to an increase in theta during the 0–0.5 s interval (112.3 ± 0.6%) and a subsequent decrease during the 0.5–1.0 s interval (95.2 ± 0.8%).
[Fig. 3: changes in theta power (% of reference theta power) in response to each emotion type (Neutral, Joy, Pleasure, Anger, Sad), shown in separate panels for males and females.]
7.7.4. ERD/ERS analysis—alpha (8–12 Hz)
Changes in the alpha band did not differentiate the emotional stimuli. This is important, for alpha-blocking (i.e., ERD) types of experiments have typically been successful using cognitive, non-emotional stimuli (see especially the cited work by Wolfgang Klimesch's group). Still, some small effects were observed here. Modest differences were observed between the anterior and posterior regions of the brain, where overall there were greater reductions in alpha power in the posterior regions than in the anterior ones [brain region × time (p < .05)]. Differences were found during the 0–0.5 s epoch, where alpha desynchronization was less in the left anterior region than in either the left or right posterior regions (ps < .05). During this epoch, the decrease in alpha was less in the right anterior region than in the left (p < .05), but was not different from the right posterior region (p < .10). There were no regional differences during the second half of the recording interval (0.5–1.0 s).
7.7.5. ERD/ERS analysis—beta (13–30 Hz)
Similar to changes in the alpha band, response patterns in the beta band did not distinguish between the emotional stimuli. No significant left-right hemispheric differences were found for either the anterior or posterior regions. A main effect for brain regions was observed, reflected by slight decreases in beta power in the left and right anterior regions (99.7 ± 0.6% and 99.5 ± 0.5%, respectively) accompanied by modest increases of beta power in the left and right posterior regions (101.8 ± 0.5% and 100.9 ± 0.4%, respectively). The differences were statistically reliable only between the left and right anterior regions and the left posterior region (ps < .01).
7.8. Discussion
The present experiment evaluated the use of ERD/ERS analysis in
differentiating brain responses to nonlinguistic vocalizations of
joy, pleasure, anger and sadness. This expanded previous research
that has traditionally used visual stimuli, including images of emotional situations (Aftanas et al., 2001, 2002) and pictures of affective
facial expressions (Balconi and Lucchiari, 2006). The results provided limited support for our hope that theta ERS patterns would
differentiate responses to distinct emotion-related sounds. The
results were consistent with previous research where the most
salient effects were observed in theta ERSs in anterior regions of the
brain (Aftanas et al., 2001, 2002; Balconi and Lucchiari, 2006). However, in prior studies these changes were not dependent on emotion
type but rather were more generically related to emotion versus
non-emotion-related stimuli. Even so, an unanticipated and novel
finding of gender differences was observed that was based on the
differentiation of responses related to specific emotions. It may be
important for future research to carefully evaluate gender-specific responses. It may also be valuable to categorize
stimuli based on specific emotions as opposed to broader categorizations of neutral versus emotion-related (Balconi and Lucchiari,
2006), positive versus negative valence (Aftanas et al., 2001) or
differentiation based solely on the rated arousal levels of the emotional stimulus (Aftanas et al., 2002).
In females, the anger stimuli provoked the largest EEG
response. This is consistent with a previous report in which women
viewing film clips showed the greatest theta increases during segments with strong aggression-related content as compared to an
emotionally neutral clip or a segment portraying sadness (Krause
et al., 2000). Additionally, functional magnetic resonance imaging
changes show that women's brain activation patterns in response
to facial expressions of anger versus fear are more discriminating than responses in men. This gender difference is not apparent
in adolescence, suggesting a culturally learned effect whereby adult
women have learned to attend to indications of anger as signals of a
potentially threatening situation, and such emotional
stimuli evoke a characteristic defensive response (McClure et al.,
2004).
Although theta ERS does correlate with the appraisal of the
emotion-related sounds, the psychological correlate of theta activity remains unclear. Previous findings relate theta ERS
to working memory (see reviews by Klimesch,
1996; Klimesch et al., 1996) and episodic memory (Doppelmayr
et al., 1998; Klimesch et al., 1994, 1997a,b). Finally, overall theta
power in the frontal regions of the brain is directly related to
the amplitude of evoked potential response in the same areas
(Basar et al., 1998). Greater evoked potentials are traditionally
interpreted as reflecting elevated levels of attention directed towards a
specific stimulus. Such evidence suggests that the theta differences
during emotional sounds, as found here between women and men,
may reflect differences in the allocation of attention. However, it is
not possible from the present data to determine conclusively that
women paid more attention to the anger sounds than to others or
gave them more attention than males did.
While it is possible women were more psychologically alerted
by the sounds of anger than the men were, it seems unlikely the
differences occurred because they were more distressed or experienced some other more lasting change in their emotional state.
Theta ERS was consistently found during the earliest part of the
stimulus presentation (first half second), but had returned virtually to baseline by the next half second. Presumably a more global
change related to variations in emotional states would have lasted
longer than this fraction of a second. Also, there did not appear
to be any residual effects from the sounds since there were no
general changes in the overall power spectra across the stimulus
types for any of the frequency bands evaluated. Thus, results from
the spectral power analyses suggest a relatively constant baseline
state of brain activity, and likely a constant affective state in the
test subjects.
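The claim of a relatively constant baseline state rests on comparing overall power spectra across stimulus types. The section does not describe the spectral pipeline in detail, so the following is a minimal sketch assuming Welch periodograms averaged over trials and integrated within the classic EEG bands; the band edges, sampling rate, and synthetic data are illustrative assumptions.

```python
import numpy as np
from scipy.signal import welch

BANDS = {"theta": (4.0, 8.0), "alpha": (8.0, 13.0), "beta": (13.0, 30.0)}

def band_powers(trials, fs):
    """Welch PSD averaged over trials, integrated within each EEG band."""
    freqs, psd = welch(trials, fs=fs, nperseg=fs)  # psd: (n_trials, n_freqs)
    mean_psd = psd.mean(axis=0)
    out = {}
    for name, (lo, hi) in BANDS.items():
        mask = (freqs >= lo) & (freqs < hi)
        out[name] = np.trapz(mean_psd[mask], freqs[mask])  # integrate PSD
    return out

# Two hypothetical stimulus types drawn from the same noise distribution:
# near-identical band powers across conditions would indicate the kind of
# constant baseline brain state described in the text.
fs = 256
rng = np.random.default_rng(1)
per_type = {s: 0.5 * rng.standard_normal((40, 2 * fs)) for s in ("joy", "anger")}
print({s: band_powers(x, fs) for s, x in per_type.items()})
```

Roughly equal band power across conditions, as in this noise-only example, is the pattern the authors interpret as a stable affective state in the subjects.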
Although the results show consistent theta ERSs, there are only
modest differences between responses to the different discrete emotional sounds used, and in some cases changes related
to neutral tones were equal to or greater in magnitude than
changes during emotional vocalizations. It is possible that such
changes reflect the commonly observed see-saw dynamics of cortical and subcortical activities seen in human brain imaging (Liotti
and Panksepp, 2004). As such, it does not appear that theta ERS
changes are sufficiently sensitive to delineate distinct brain responses to stimuli representing different emotions.
However, these data, combined with those from research evaluating visual depictions of emotion, suggest theta ERS may be a more
sensitive marker for the arousing qualities of emotional stimuli and
less differentiating for emotional valence. In conclusion, although
theta ERS patterns consistently appear in response to emotion-related stimuli, they are insufficient for clearly distinguishing typical from
atypical response patterns.
emotional states in animals both on the input side, as with emotional vocalizations, and on the output side using appropriate
subcortical stimulations. This type of strategy may be able to reveal
the causal/constitutive primary-process infrastructure of human
and other mammalian minds, which may progressively lead to
similar probing of the more diverse, higher-order evolutionarily
emergent levels of cognitive mental processes that take hold mostly
through experience-dependent cortical maturation, and perhaps
partly through higher neural evolutionary functions that currently
remain largely unknown. If the dynamic signatures of emotionally felt states of mind could be empirically measured, we expect
that it would lead to a better understanding of our deep affective
nature, which may be fundamental for understanding psychiatric
disorders.
Acknowledgements
Funding for this research was provided to the Department of
Psychology, Bowling Green State University, Bowling Green, OH
43403, USA from the Office of Naval Research (F3360197MT040) to
J.R. During the preparation of this manuscript, J.P. was supported
by funding from the Hope for Depression Research Foundation.
References
Aftanas, L.I., Varlamov, A.A., Pavlov, S.V., Makhnev, V.P., Reva, N.V., 2001. Affective picture processing: event-related synchronization within individually defined human theta band is modulated by valence dimension. Neurosci. Lett. 303, 115–118.
Aftanas, L.I., Varlamov, A.A., Pavlov, S.V., Makhnev, V.P., Reva, N.V., 2002. Time-dependent cortical asymmetries induced by emotional arousal: EEG analysis of event-related synchronization and desynchronization in individually defined frequency bands. Int. J. Psychophysiol. 44, 67–82.
Aftanas, L.I., Varlamov, A.A., Reva, N.V., Pavlov, S.V., 2003. Disruption of early event-related theta synchronization of human EEG in alexithymics viewing affective pictures. Neurosci. Lett. 340 (1), 57–60.
Andersen, S.B., Moore, R.A., Venables, L., Corr, P.J., 2009. Electrophysiological correlates of anxious rumination. Int. J. Psychophysiol. 71, 156–169.
Balconi, M., Lucchiari, C., 2006. EEG correlates (event-related desynchronization) of emotional face elaboration: a temporal analysis. Neurosci. Lett. 392, 118–123.
Balconi, M., Brambilla, E., Falbo, L., 2009a. Appetitive vs. defensive responses to emotional cues. Autonomic measures and brain oscillation modulation. Brain Res. 1296, 72–84.
Balconi, M., Brambilla, E., Falbo, L., 2009b. BIS/BAS, cortical oscillations and coherence in response to emotional cues. Brain Res. Bull. 80, 151–157.
Balconi, M., Mazza, G., 2009. Brain oscillations and BIS/BAS (behavioral inhibition/activation system) effects on processing masked emotional cues. ERS/ERD and coherence measures of alpha band. Int. J. Psychophysiol. 74, 158–165.
Balconi, M., Pozzoli, U., 2009. Arousal effect on emotional face comprehension: frequency band changes in different time intervals. Physiol. Behav. 97, 455–462.
Banse, R., Scherer, K.R., 1996. Acoustic profiles in vocal emotion expression. J. Pers. Soc. Psychol. 70, 614–636.
Basar, E., Rahn, E., Demiralp, T., Schürmann, M., 1998. Spontaneous EEG theta activity controls frontal visual evoked potential amplitudes. Electroencephalogr. Clin. Neurophysiol. 108 (2), 101–109.
Bijttebier, P., Beck, I., Claes, L., Vandereycken, W., 2009. Gray's Reinforcement Sensitivity Theory as a framework for research on personality–psychopathology associations. Clin. Psychol. Rev. 29, 421–430.
Bradley, M.M., Lang, P.J., 2007. The International Affective Picture System (IAPS) in the study of emotion and attention. In: Coan, J.A., Allen, J.J.B. (Eds.), Handbook of Emotion Elicitation and Assessment. Cambridge University Press, New York, pp. 29–46.
Carver, C.S., Harmon-Jones, E., 2009. Anger is an approach-related affect: evidence and implications. Psychol. Bull. 135, 183–204.
Center for the Study of Emotion and Attention, 2001. The International Affective Picture System: Digitized Photographs. The Center for Research in Psychophysiology, University of Florida, Gainesville, FL.
Cohen, M.X., Elger, C.E., Ranganath, C., 2007. Reward expectation modulates feedback-related negativity and EEG spectra. Neuroimage 35, 968–978.
Corr, P.J., 2004. Reinforcement sensitivity theory and personality. Neurosci. Biobehav. Rev. 28, 317–332.
Damasio, A.R., Grabowski, T.J., Bechara, A., Damasio, H., Ponto, L.L.B., Parvizi, J., Hichwa, R.D., 2000. Subcortical and cortical brain activity during the feeling of self-generated emotions. Nat. Neurosci. 3, 1049–1056.
Davidson, R.J., 1992. Anterior cerebral asymmetry and the nature of emotion. Brain Cogn. 20 (1), 125–151.
Davis, K.L., Panksepp, J., Normansell, L., 2003. The affective neuroscience personality scales: normative data and implications. Neuro-Psychoanalysis 5, 21–29.
Doppelmayr, M., Klimesch, W., Schwaiger, J., Auinger, P., Winkler, T., 1998. Theta synchronization in the human EEG and episodic retrieval. Neurosci. Lett. 257 (1), 41–44.
Ekman, P., Friesen, W.V., 1976. Pictures of Facial Affect. Consulting Psychologists Press, Palo Alto.
Gordon, J.A., 2011. Oscillations and hippocampal–prefrontal synchrony. Curr. Opin. Neurobiol. April 4 [Epub ahead of print].
Harmon-Jones, E., 2007. Asymmetrical frontal cortical activity, affective valence, and motivational direction. In: Harmon-Jones, E., Winkielman, P. (Eds.), Social Neuroscience. Guilford Press, New York, pp. 137–156.
Hawk, S.T., van Kleef, G.A., Fischer, A.H., van der Schalk, J., 2009. Worth a thousand words: absolute and relative decoding of nonlinguistic affective vocalizations. Emotion 9, 293–305.
Hoekert, M., Bais, L., Kahn, R.S., Aleman, A., 2008. Time course of the involvement of the right anterior superior temporal gyrus and the right fronto-parietal operculum in emotional prosody perception. PLoS ONE 3 (5), 1–7.
Hoekert, M., Vingerhoets, G., Aleman, A., 2010. Results of a pilot study on the involvement of bilateral inferior frontal gyri in emotional prosody perception: an rTMS study. BMC Neurosci. 11, 93.
Jasper, H.H., 1958. The ten-twenty electrode system of the international federation. Electroencephalogr. Clin. Neurophysiol. 10, 371–375.
Jaušovec, N., Jaušovec, K., Gerlič, I., 2001. Differences in event-related and induced EEG patterns in the theta and alpha frequency bands related to human emotional intelligence. Neurosci. Lett. 311, 93–96.
John, E.R., Prichep, L.S., Fridman, J., Easton, P., 1988. Neurometrics: computer-assisted differential diagnosis of brain dysfunctions. Science 239, 162–169.
Kamarajan, C., Rangaswamy, M., Chorlian, D.B., Manz, N., Tang, T., Pandey, A.K., Roopesh, B.N., Stimus, A.T., Porjesz, B., 2008. Theta oscillations during the processing of monetary loss and gain: a perspective on gender and impulsivity. Brain Res. 1235, 45–62.
Klimesch, W., 1996. Memory processes, brain oscillations and EEG synchronization. Int. J. Psychophysiol. 24, 61–100.
Klimesch, W., Doppelmayr, M., Russegger, H., Pachinger, T., 1996. Theta band power in the human scalp EEG and the encoding of new information. Neuroreport 7, 1235–1240.
Klimesch, W., Doppelmayr, M., Pachinger, T., Ripper, B., 1997a. Brain oscillations and human memory: EEG correlates in the upper alpha and theta band. Neurosci. Lett. 238 (1–2), 9–12.
Klimesch, W., Doppelmayr, M., Schimke, H., Ripper, B., 1997b. Theta synchronization and alpha desynchronization in a memory task. Psychophysiology 34, 169–176.
Klimesch, W., Schimke, H., Schwaiger, J., 1994. Episodic and semantic memory: an analysis in the EEG theta and alpha band. Electroencephalogr. Clin. Neurophysiol. 91 (6), 428–441.
Knyazev, G.G., 2007. Motivation, emotion, and their inhibitory control mirrored in brain oscillations. Neurosci. Biobehav. Rev. 31, 377–395.
Knyazev, G.G., Bocharov, A.V., Levin, E.A., Savostyanov, A.N., Slobodskoj-Plusnin, J.Y., 2008. Anxiety and oscillatory responses to emotional facial expressions. Brain Res. 1227, 174–188.
Knyazev, G.G., Slobodskoj-Plusnin, J.Y., Bocharov, A.V., 2009. Event-related delta and theta synchronization during explicit and implicit emotion processing. Neuroscience 164, 1588–1600.
Knyazev, G.G., Slobodskoj-Plusnin, J.Y., Bocharov, A.V., 2010. Gender differences in implicit and explicit processing of emotional facial expressions as revealed by event-related theta synchronization. Emotion 10 (5), 678–687.
Krause, C.M., Viemerö, V., Rosenqvist, A., Sillanmäki, L., Åström, T., 2000. Relative electroencephalographic desynchronization and synchronization in humans to emotional film content: an analysis of the 4–6, 6–8, 8–10 and 10–12 Hz frequency bands. Neurosci. Lett. 286 (1), 9–12.
Liotti, M., Panksepp, J., 2004. On the neural nature of human emotions and implications for biological psychiatry. In: Panksepp, J. (Ed.), Textbook of Biological Psychiatry. Wiley, New York, pp. 33–74.
McClure, E.B., Monk, C.S., Nelson, E.E., Zarahn, E., Leibenluft, E., Bilder, R.M., Charney, D.S., Ernst, M., Pine, D.S., 2004. A developmental examination of gender differences in brain engagement during evaluation of threat. Biol. Psychiatry 55 (11), 1047–1055.
Miller, R., 1991. Cortico-Hippocampal Interplay and the Representation of Contexts in the Brain. Springer, Berlin-Heidelberg, New York.
Mitchell, D.J., McNaughton, N., Flanagan, D., Kirk, I.J., 2008. Frontal-midline theta from the perspective of hippocampal theta. Prog. Neurobiol. 86, 156–185.