
Human Social Interaction

Research proposal

Dr. Roger Newport
Room B47
Drop-in times: Tuesdays 12-2
www.psychology.nottingham.ac.uk/staff/rwn

Understanding Emotion: visual recognition


Lecture Overview

Introduction to facial emotions
The neuroscience of Fear and Disgust (the simple story)
Other emotions (the complicated story)
Current research
Understanding Emotions

What are facial expressions of emotion and what are they for?
Are there specific centres in the brain dedicated to emotion perception?
Are different emotions processed in different ways?

Why are we interested in emotion perception?

Evolutionary survival; social survival.

Facial expressions of emotion - what are the emotions?

Motivational: thirst, hunger, pain, mood. These do not feature prominently in social communication.
Basic: happiness, fear, anger, surprise, disgust, sadness. These feature prominently in social communication.
Self-conscious/social: shame, embarrassment, pride, guilt. These regulate social behaviour.
Faces are special

Face perception may be the most developed visual perceptual skill in humans.
Infants prefer to look at faces from shortly after birth (Morton and Johnson, 1991).
Most people spend more time looking at faces than at any other type of object.
We seem to have the capacity to perceive the unique identity of a virtually unlimited number of different faces.
Understanding Emotion: from facial expressions
Facial expressions as a communicative tool

We laugh more when in a group, and show distress more when in a group.
Babies (10 months) smile almost only in the presence of a caregiver.
Babies look to the caregiver and behave according to the caregiver's response when encountering a novel object, e.g. a barking dog or a snake. This is known as social referencing and is also seen in chimpanzee societies.
A similar process, observational fear, is seen in other monkeys. Infant monkeys showed a fearful unconditioned response to their mother's expression of fear when the mother could see a snake but the infants could not. That is, the infants showed a fear response to the mother's fear response.
Facial expressions as communication

Erickson and Schulkin, 2003: percentage of facial responses to an unpleasant odour classified as unpleasant, neutral, or pleasant in a spontaneous condition, a posed-to-real-person condition, and a posed-to-imaginary-audience condition.
Facial expressions as communication

Facial expressions allow for rapid communication.
They are produced when there is an emotional stimulus and an audience present.
Our interpretation of another's emotion modulates our behaviour, and vice versa.

The ability to recognise emotion expressions appears very early:
First few days (neonates): can distinguish between expressions of happiness, sadness, and surprise.
Four to six months: show preferences for facial expressions of happiness over neutral and angry expressions.
Seven months: can distinguish among expressions of fear, anger, surprise, happiness, and sadness.
Recognition as an automatic process - fear and threat

Angry faces are detected much more rapidly than faces depicting non-threatening expressions: attention is driven by fear.

Ohman et al., 2001


Automatic processes = dedicated network

Fear and the amygdala

Evidence from animal, neuropsychological and imaging studies suggests that the amygdala is of primary importance in the recognition of fear.
Fear and the amygdala - evidence from animal studies

Bilateral amygdala removal:
reduces levels of aggression and fear in rats and monkeys
facial expressions and vocalisations become less expressive
impairs fear conditioning


Fear and the amygdala - evidence from human neuropsychology

Bilateral amygdala damage:
reduces recognition of fear-inducing stimuli
reduces recognition of fear in others
reduces the ability to express fear

It does NOT affect the ability to recognise faces or to know what fear is.

See patients SM, DR and SE (Adolphs et al. and Calder et al.).

Alzheimer's disease impairs fear conditioning.
Fear and the amygdala - evidence from human imaging

Results from several studies of fear face recognition and fear conditioning:

Increased amygdala activity for facial expressions of fear vs. happiness, disgust, anger and neutral.
Subliminal activation of the amygdala to fear expressions (non-conscious processing of fear).
Neuromodulatory role of the left amygdala: less fear = less activity.


A typical study

Amygdala Response to Facial Expressions in Children and Adults
Thomas et al., 2001

Blocks of fixation and of fear/neutral faces; no task, just watch.

Left amygdala activation for fear vs. fixation in male children and adults.
Overall, adults showed greater amygdala activation for fear vs. neutral whereas children did not (neutral faces may be ambiguous).
Methods and Materials
Subjects
Six male adults (mean = 24 years, SD = 6.6 years) and 12
children (mean = 11 years, SD = 2.4 years) recruited in the
Pittsburgh area were scanned in a 1.5-T scanner during passive
viewing of fearful and neutral faces. The children, six female and
six male, ranged in pubertal development from Tanner stages I/I
to V/IV. Male and female subjects did not differ in mean age or
Tanner stage. Data from an additional three adults (three female)
and four children (two female) were not included due to excessive
motion artifact (>0.5 voxels; n = 5) or claustrophobia (n = 1)
or because the subject fell asleep during the task (n = 1).
Subjects were screened for any personal or family history of
psychiatric or medical illness, and for any contraindications for
an MRI. Written child assent and parental consent were acquired
before the study.
Behavioral Paradigm
The task consisted of the rapid and successive presentation of faces in blocks
of neutral and emotional expressions. The face stimuli consisted of digitized
fearful and neutral faces taken from the Ekman and Friesen (1976) study
(Figure 1). A total of eight different actors (four male and four female)
demonstrating both fearful and neutral expressions were used. The hair was
stripped from the images to remove any nonfacial features, and both fear and
exaggerated fear poses were used for each actor (Calder et al 1997), resulting
in a total of 16 fear stimuli and eight neutral stimuli. Stimuli were presented
for 200 msec with an interstimulus interval of 800 msec (flashing fixation
point). Each block of trials consisted of the presentation of a flashing fixation
point for 45 sec followed by alternating 42-sec blocks of either neutral or
fearful expressions and a final 45-sec epoch of fixation (Figure 1). This
procedure was repeated in three runs of trials with the presentation order
counterbalanced across runs and across subjects (i.e., F-N-F-N-F or N-F-N-F-
N). Following Breiter and colleagues’ (Breiter et al 1996) design, no overt
response was required. Instead, subjects were instructed to fixate centrally to
try to get an overall sense of the faces.
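A minimal sketch of this block structure in Python, using the durations quoted above (45 s fixation epochs, alternating 42 s face blocks, TR = 3 s); the function and variable names are my own illustration, not from the paper:

```python
TR = 3.0  # seconds per functional volume, as quoted in the acquisition details

def build_run(face_order):
    """Return a list of (condition, duration_s) epochs for one run.

    face_order is the counterbalanced block order, e.g.
    ['F', 'N', 'F', 'N', 'F'] for fearful/neutral blocks.
    """
    epochs = [('fixation', 45.0)]                 # initial flashing fixation
    epochs += [(cond, 42.0) for cond in face_order]  # alternating face blocks
    epochs.append(('fixation', 45.0))             # final fixation epoch
    return epochs

run = build_run(['F', 'N', 'F', 'N', 'F'])
total_s = sum(d for _, d in run)
volumes = int(total_s / TR)
print(total_s, volumes)  # 300.0 100
```

The arithmetic checks out against the Methods: 45 + 5 × 42 + 45 = 300 s per run, which at TR = 3 s gives exactly the "three runs of 100 images" reported.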
Image Acquisition, Processing, and Analysis
Scans were acquired on a 1.5-T GE Signa scanner (General Electric Systems, Milwaukee)
modified for echo planar imaging (Advanced NMR, Wilmington, MA) using a quadrature head
coil. A T1-weighted sagittal localizer image was used to prescribe the functional slice locations.
T1-weighted structural images were acquired in 4-mm contiguous coronal slices through the
whole brain (echo time [TE] min, repetition time [TR] 500, matrix 256 × 256, field of view
[FOV] 20) for purposes of localizing the functional activity and aligning images in Talairach
space (Talairach and Tournoux 1988). Functional images (T2*) were acquired at 12 of these
slice locations spanning the entire amygdala (~A20 to P24 in Talairach coordinates) using an
EPI BOLD sequence (TE 40, TR 3000, flip angle 90°, matrix 128 × 64, FOV 20, 4-mm skip 0,
voxel size 3.125 × 3.125 × 4.0 mm). There were three runs of 100 images totaling 300 images
per slice. Images were motion corrected and normalized. All 18 subjects had less than 0.5
voxels of in-plane motion. All images were registered to a representative reference brain using
Automated Image Registration software (Woods et al 1992), and voxelwise analyses of variance
(ANOVAs) were conducted on these pooled data using normalized signal intensity as the
dependent variable (Braver et al 1997; Casey et al 2000). Separate analyses were conducted
comparing male adults and male children and comparing male and female children to examine
interactions of stimulus type (fearful faces, neutral faces, fixation) with age or gender,
respectively. Significant activations were defined by at least three contiguous voxels and alpha
= .05 (Forman et al 1995). Amygdala activation was defined on the reference brain using
Talairach coordinates and consensus among three raters (BJC, KMT, PJW). Significant regions
that extended outside of the brain or had large SDs were excluded.
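The "at least three contiguous voxels" criterion can be illustrated with a small pure-Python connected-components sketch. This assumes 6-connectivity (face-adjacent voxels) and only illustrates the contiguity rule; it is not the authors' actual analysis software:

```python
from collections import deque

def clusters_at_least(mask, k=3):
    """Return clusters of True voxels with at least k members.

    mask is a nested list indexed [x][y][z] of booleans marking
    suprathreshold voxels; clusters use 6-connectivity (shared faces).
    """
    X, Y, Z = len(mask), len(mask[0]), len(mask[0][0])
    seen, out = set(), []
    for x in range(X):
        for y in range(Y):
            for z in range(Z):
                if not mask[x][y][z] or (x, y, z) in seen:
                    continue
                # breadth-first flood fill over face-adjacent neighbours
                comp, q = [], deque([(x, y, z)])
                seen.add((x, y, z))
                while q:
                    cx, cy, cz = q.popleft()
                    comp.append((cx, cy, cz))
                    for dx, dy, dz in ((1,0,0), (-1,0,0), (0,1,0),
                                       (0,-1,0), (0,0,1), (0,0,-1)):
                        n = (cx + dx, cy + dy, cz + dz)
                        if (0 <= n[0] < X and 0 <= n[1] < Y and 0 <= n[2] < Z
                                and mask[n[0]][n[1]][n[2]] and n not in seen):
                            seen.add(n)
                            q.append(n)
                if len(comp) >= k:
                    out.append(comp)
    return out

# Toy example: a 3-voxel line survives the threshold, an isolated voxel does not.
mask = [[[False] * 4 for _ in range(4)] for _ in range(4)]
for z in range(3):
    mask[0][0][z] = True   # three contiguous voxels
mask[3][3][3] = True       # one isolated voxel
print(len(clusters_at_least(mask, 3)))  # 1
```

In practice this kind of clustering is done by the neuroimaging package itself (e.g. connected-component labeling over the thresholded statistical map); the point here is just that single stray voxels are discarded while small contiguous clusters survive.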
Results
Adults and Children
A 2 × 2 (Group × Condition) ANOVA comparing male adults (n = 6) and male children (n
= 6) revealed significant activity in the left amygdala and substantia innominata for fearful
faces relative to fixation (Figure 2) and a decrease in signal with repeated presentations of the
fearful faces (Table 1). Neutral faces showed a similar pattern of activation relative to fixation
trials (F = 23.71, p < .001). A significant interaction was observed in the left amygdala
between stimulus type and age for the comparison of fearful and neutral expressions (Table 1)
(Group × Condition, Fear vs. Neutral). Post hoc t tests indicate that adults demonstrated
significantly greater activity for fearful faces relative to neutral faces (p < .001). However, the
children demonstrated greater amygdala activity for neutral faces than for fearful expressions
(p < .0001) (Figure 3). Neither age nor Tanner stage predicted the magnitude of the percent
change in signal in this sample.
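The percent signal change measure used in these comparisons can be sketched as a simple normalisation of a condition mean against the fixation baseline; this is the common convention (100 × (condition − baseline) / baseline), and the numbers below are hypothetical:

```python
def percent_signal_change(cond_mean, baseline_mean):
    """Percent change of a condition's mean BOLD signal relative to
    a fixation baseline. A common convention; the study's exact
    normalisation pipeline may differ."""
    return 100.0 * (cond_mean - baseline_mean) / baseline_mean

# Hypothetical raw signal means (arbitrary scanner units):
print(percent_signal_change(1010.0, 1000.0))  # 1.0  (condition above baseline)
print(percent_signal_change(995.0, 1000.0))   # -0.5 (condition below baseline)
```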
Warning about other brain regions

A variety of brain regions are involved in the processing of facial expressions of emotion.
They are active at different times, and some structures are active at more than one time.
The amygdala is particularly implicated in the processing of fear stimuli, receiving early (<120 ms) subcortical input as well as late (~170 ms) cortical input from the temporal lobes.
Amygdala response to fear - special for faces?

The Amygdala Response to Emotional Stimuli: A Comparison of Faces and Scenes
Hariri et al., 2002

Blocked design with a matching task (getting rid of unwanted activations).

Preferential right amygdala response to faces (faces > IAPS scenes).
Emotions - not just an ugly face

Hadjikhani and de Gelder, 2003; Adolphs and Tranel, 2003

Controls perform better when faces are present.
Bilateral amygdala patients perform worse when faces are present, and are often better at negative stimuli without faces.
Emotions - the importance of the eyes

(break)

The contribution of the eyes to facial expressions of fear

What emotion do these eyes depict?

Emotions - the importance of the eyes
Whalen et al., 2004

The amygdala is responsive to large eye whites in fear (and surprise) expressions.
Amygdala activation (% signal change) rises above the fixation baseline for non-inverted (eye-white) fearful eyes.
Emotions - the importance of the eyes

The amygdala, fear and the eyes: Adolphs et al., 2005; SM bubble analysis.

SM's eye fixation (or lack of it).
Emotions - the importance of the eyes

When told specifically to look at the eyes, SM improves, but only while given this instruction.
Emotions - amygdalae are not simply eye detectors

The amygdalae are not just eye detectors - they may direct attention to relevant stimuli, acting as a biological relevance detector.

Some emotions are easy to tell from the eyes, others from the mouth. We need more than just the eyes to determine emotional and social relevance.

Hybrid faces from Vuilleumier, 2005.
Yuck! Disgust and the insula (and basal ganglia)

Animal studies:
insula = gustatory cortex
impaired taste aversion in rats

Human neuropsychology:
patient NK
Huntington's disease
Tourette's and OCD
electrical stimulation = nausea
repeated exposure leads to habituation

Human imaging:
Philips et al.
Wicker et al.
Disgust - evidence from imaging

Both of Us Disgusted in My Insula: The Common Neural Basis of Seeing and Feeling Disgust
Wicker et al., 2003

1. Participants observed actors smelling and reacting to bad, nice and neutral odours.
2. Participants smelt bad and nice odours themselves (+ rest).

Separate visual and olfactory runs; overlay analysis.
Disgust vs. Fear summary

Fear: the amygdala. Activated by fear-inducing stimuli; habituates to fear. Removal or damage disproportionately impairs fear recognition and feelings of fear.

Disgust: the insula and basal ganglia. Activated by facial expressions of disgust; the insula habituates to disgust. Removal, damage or degeneration of either structure disproportionately impairs disgust recognition and feelings of disgust.

A double dissociation: conclude that the neural mechanisms for fear and disgust are anatomically and functionally distinct.
Other basic emotions - implicated brain regions

Green = neutral; red = anger; purple = fear; yellow = happy; blue = sad.

Duvernoy 1991, but see Kessler/West et al., 2001
Other basic emotions - implicated brain regions

Tissue loss associated with specific emotion recognition impairment.
Rosen et al., 2006

In 50 patients with neurodegenerative dementia, impaired recognition of negative emotions (red, rlITG/rMTG), and in particular sadness (green, rSTG), correlated with tissue loss in right lateral inferior temporal and right middle temporal regions.

This reflects these areas' role in the visual processing of negative emotions.
How does knowledge about brain activation help social psychologists?

Recent Research, June 2006
Is emotion processing affected by advancing age? An event-related brain potential study

Rationale
Age is related to decreasing cognitive function, especially frontal functions.
Emotional intensity is a frontal function.
Are older people impaired at recognising emotion intensity?

Methods
Investigated using ERP (EEG) and analysed by ANOVA.

Results
In the old group there was a delay in early discrimination processing compared with the young group, but no difference in emotion discrimination.
Recent Research, January 2007
Perceiving fear in dynamic body expressions

Rationale
Emotional body language is important when the face cannot be seen.
It is important for survival, so it should be fast, automatic, and employ a dedicated brain network.
We know which parts of the brain are active for static emotion, and that other parts of the brain are active for body motion.
How do these interact for emotive whole-body dynamic stimuli?

Methods
Used event-related fMRI and video clips.
Fear and neutral body movements, with scrambled static and dynamic video images as controls.
The task was to press a button when an inverted image was seen (therefore incidental emotion processing was being measured).

Analyses
1. Main effect of bodies vs. scrambled stimuli: (Fs+Fd+Ns+Nd) − 2(Sd+Ss).
2. Main effect of fear vs. neutral bodies: (Fs+Fd) − (Ns+Nd).
3. Main effect of dynamic vs. static bodies: (Fd+Nd) − (Fs+Ns).
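The three contrasts above can be written as weight vectors over the six conditions and applied as weighted sums of condition means. The labels are my reading of the abbreviations (F = fear, N = neutral, S = scrambled; s = static, d = dynamic), and the condition means below are hypothetical, purely to exercise the code:

```python
CONDITIONS = ['Fs', 'Fd', 'Ns', 'Nd', 'Ss', 'Sd']

# Each contrast's weights sum to zero, so the comparisons are balanced.
CONTRASTS = {
    'bodies_vs_scrambled': {'Fs': 1, 'Fd': 1, 'Ns': 1, 'Nd': 1, 'Ss': -2, 'Sd': -2},
    'fear_vs_neutral':     {'Fs': 1, 'Fd': 1, 'Ns': -1, 'Nd': -1, 'Ss': 0, 'Sd': 0},
    'dynamic_vs_static':   {'Fs': -1, 'Fd': 1, 'Ns': -1, 'Nd': 1, 'Ss': 0, 'Sd': 0},
}

def apply_contrast(weights, means):
    """Weighted sum of condition means for one contrast."""
    return sum(weights[c] * means[c] for c in CONDITIONS)

# Hypothetical condition means (arbitrary units):
means = {'Fs': 5.0, 'Fd': 7.0, 'Ns': 4.0, 'Nd': 6.0, 'Ss': 3.0, 'Sd': 3.0}
print(apply_contrast(CONTRASTS['fear_vs_neutral'], means))  # 2.0
```

A positive value for a contrast indicates greater signal for the positively weighted conditions, which is the logic each "main effect" above relies on.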

Results
The amygdala is active for social stimuli.
It is not bothered whether the stimulus is static or dynamic, but it is more bothered when the stimulus is fearful.
Other brain regions are bothered that the stimulus is dynamic (and fearful); these regions will be covered in later lectures.
Recent Research, June 2007
Attention to the person or the emotion: underlying activations in MEG

Rationale
Facial emotion processing is fast (100 ms) and automatic, and occurs regardless of whether you attend to the face or not.
Facial identity processing is also fast (though slower) and, according to most models, occurs in parallel.
But there is some evidence from schizophrenia suggesting that these parallel (and therefore separate) brain regions interact.
What happens to this interaction when you attend to either emotion or identity?

Methods and Results
Used MEG and happy/fear/neutral faces.
Identity task: press a button when two identities are the same.
Emotion task: press a button when two emotions are the same.

90 ms orbitofrontal response to emotion regardless of attention.
170 ms right insula response when attending to emotion.
Also a 220 ms activation increase for areas associated with identity processing.

Conclusions
So there you go.
Recent Research, Oct 2007
Impaired facial emotion recognition and reduced amygdalar volume in schizophrenia

Rationale
Amygdala volume is known to be reduced in schizophrenia.
Emotion recognition is known to be impaired in schizophrenia.
The direct link between the two had not been (properly) studied before.

Methods
Used 20 schizophrenia patients and 20 controls, 3-T MRI, and a facial emotion intensity recognition task.

Results
(1) The schizophrenia patients had smaller amygdalar volumes than the healthy controls;
(2) the patients showed impairment in recognizing facial emotions, specifically anger, surprise, disgust, and sadness;
(3) the left amygdala volume reduction in these patients was associated with impaired recognition of sadness in facial expressions.
Summary

Distinct neural pathways underlie the processing of signals of fear (amygdala) and disgust (insula/basal ganglia) in humans.

This dissociation can be related to the adaptive significance of these emotions as responses to critical forms of threat associated with external (fear) and internal (disgust) defence systems.

According to LeDoux, social neuroscience has been able to make progress in the field of emotion by:
focusing on a psychologically well-defined aspect of emotion
using an experimental approach to emotion that simplifies the problem in such a way as to make it tractable
circumventing vague and poorly defined aspects of emotion
removing subjective experience as a roadblock to experimentation.
Next week
Emotion recognition from auditory cues and theories of emotion
