DOI: 10.1046/j.0956-7976.2003.psci_1479.x

Perceived Gaze Direction and the Processing of Facial Displays of Emotion


Author(s): Reginald B. Adams Jr. and Robert E. Kleck
Source: Psychological Science, Vol. 14, No. 6 (Nov., 2003), pp. 644-647
Published by: Sage Publications, Inc. on behalf of the Association for Psychological Science
Stable URL: http://www.jstor.org/stable/40063926
Accessed: 03-12-2015 15:16 UTC


PSYCHOLOGICAL
SCIENCE

Research Report
PERCEIVED GAZE DIRECTION AND THE PROCESSING OF
FACIAL DISPLAYS OF EMOTION
Reginald B. Adams, Jr., and Robert E. Kleck
Dartmouth College

Abstract - There is good reason to believe that gaze direction and facial displays of emotion share an information value as signals of approach or avoidance. The combination of these cues in the analysis of social communication, however, has been a virtually neglected area of inquiry. Two studies were conducted to test the prediction that direct gaze would facilitate the processing of facially communicated approach-oriented emotions (e.g., anger and joy), whereas averted gaze would facilitate the processing of facially communicated avoidance-oriented emotions (e.g., fear and sadness). The results of both studies confirmed the central hypothesis and suggest that gaze direction and facial expression are combined in the processing of emotionally relevant facial information.

An attraction to the eye region of the face may be innately prepared (e.g., Baron-Cohen, 1997; Driver et al., 1999; Hess & Petrovich, 1987). Compelling cross-species evidence for this contention is apparent in the eyespot configurations that appear on lower-order animals such as birds, reptiles, fish, and insects (Argyle & Cook, 1976). These spots are thought to mimic the eyes of larger animals, thereby warding off potential predators. In humans, both adults and infants prefer to look at the eyes compared with other facial features (Janik, Wellens, Goldberg, & Dell'Osso, 1978; Morton & Johnson, 1991), and are particularly sensitive to the gaze direction displayed by others (Baron-Cohen, 1997; Driver et al., 1999; Macrae, Hood, Milne, Rowe, & Mason, 2002).

The ability to detect the direction of gaze displayed by another is believed to play a pivotal role in the development of person and affective construal processes (Baron-Cohen, 1997; Macrae et al., 2002; Perrett & Emery, 1994). A particular direction of gaze, however, can convey multiple social meanings. Direct gaze in both humans and nonhuman primates, for example, can communicate threat (Argyle & Cook, 1976; Redican, 1982) or friendliness (Kleinke, 1986; van Hoof, 1972). Thus, other contextual cues must be utilized when reading social meaning into the behavior of the eyes. Argyle and Cook (1976) have noted that "if the suitable experimental situations could be devised . . . [subjects] who are exposed to the same gaze, but with different context cues, would react differently, and evaluate the looker and the looker's intentions quite differently" (p. 96).

Facial expressions can offer critical contextual information that is either consistent or inconsistent with the behavioral intentions communicated by a specific gaze behavior. Fundamentally, the behavioral intentions to approach and avoid drive biological behavior (e.g., Baron-Cohen, 1997; Brothers & Ring, 1992; Davidson & Hugdahl, 1995). Both gaze behavior and emotion have been found to be associated with the behavioral motivations to approach or avoid (see Argyle & Cook, 1976; Davidson & Hugdahl, 1995; Harmon-Jones & Sigelman, 2001). Positive emotions, anger, and direct gaze, for example, are associated with approach motivation. Negative emotions (other than anger) and averted gaze are associated with avoidance motivation. Thus, as a signaling system, facial expressions of emotion and gaze behavior may combine to signal these basic behavioral tendencies. Given the importance of perception of gaze direction in the affective construal process, the perceptual primacy of the eyes over other facial cues, and the shared signal value of gaze direction and facial expressions of emotion (approach-avoidance), there is good reason to believe that gaze direction might influence how efficiently facial expressions of emotion are processed by perceivers.

Surprisingly, the interaction between gaze direction and facial expressions of emotion in human perception has remained a virtually uncharted area of inquiry in the analysis of social communication. Ample evidence supports the contention that certain patterns of gaze behavior and facial expressions of emotion co-occur in both humans (e.g., Argyle & Cook, 1976; Fehr & Exline, 1987) and nonhuman primates (e.g., Hinde & Rowell, 1962; Redican, 1982) in a manner consistent with a shared underlying approach-avoidance signal value. Showing that behaviors generally co-occur, however, does not establish their mutual effect on perceptual processing. In the current research, we aimed to clarify the role of gaze direction in emotion perception by investigating its influence on how efficiently emotion information is recognized in the face. Direct gaze was predicted to facilitate the processing of facially communicated approach-oriented emotions (e.g., anger and joy), whereas averted gaze was predicted to facilitate the processing of facially communicated avoidance-oriented emotions (e.g., fear and sadness).

Address correspondence to Reginald B. Adams, Jr., Department of Psychological and Brain Sciences, Dartmouth College, 6207 Moore Hall, Hanover, NH 03755; e-mail: rba@alum.dartmouth.org.

Copyright © 2003 American Psychological Society

STUDY 1

Method

Participants

Thirteen male and 19 female undergraduate students were recruited. Participants received $10 and came to the laboratory in small groups of no more than 6 people.

Materials

Pure expressions. Facial photographs of 15 male and 15 female stimulus persons showing highly recognizable expressions of anger and fear (referred to here as "pure" expressions) were presented. For each stimulus person, we included one photograph showing a pure expression of anger and one photograph showing a pure expression of fear. These photographs were selected from the Pictures of Facial Affect (Ekman & Friesen, 1978), a set developed by Kirouac and Doré
(1984), the Montreal Set of Facial Displays of Emotion (Beaupré, Cheung, & Hess, 2000), and a set developed by us (Adams & Kleck, 2001). All stimulus persons were of European descent. Because each face was presented twice in the averted-gaze condition (left and right), each was also presented twice in the direct-gaze condition to balance out the design. Thus, a total of 240 pure expressions of anger and fear were included in this study. Gaze direction was manipulated using Adobe Photoshop.

Blended expressions. Eight male and 8 female "blended" target faces were also used in this study. Each face displayed anger and fear, which were blended using a morphing algorithm included in the Morph 2.5 software for the Macintosh. Exemplar faces were selected from the Pictures of Facial Affect (Ekman & Friesen, 1978) and the Montreal Set of Facial Displays of Emotion (Beaupré et al., 2000). The anger and fear expressions for each stimulus person were blended at approximately equal levels. The resulting images were inspected by an expert in the Facial Action Coding System (FACS; see Ekman & Friesen, 1978) and were verified to be physically viable, ecologically valid expressions. Each blended expression was presented in three gaze conditions (left, right, and direct), and the direct-gaze face for each stimulus person appeared twice to balance out the design. Thus, 64 presentations of blended expressions were randomly intermingled among the presentations of the 240 pure emotional expressions.

All pure and blended expressions were digitized, cropped in order to display only the head and neck of each individual, and presented in black and white on a computer screen at an approximate size of 4 × 3 in.
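The balanced design described above fixes the trial counts exactly. As a quick arithmetic check (the variable names here are illustrative, not from the original report), the totals work out as follows:

```python
# Stimulus counts implied by the Materials section (variable names are
# my own, chosen for illustration).
pure_persons = 15 + 15         # male + female stimulus persons
pure_emotions = 2              # anger and fear
presentations_per_face = 4     # averted left, averted right, direct twice
pure_total = pure_persons * pure_emotions * presentations_per_face

blended_persons = 8 + 8        # male + female blended target faces
blended_total = blended_persons * presentations_per_face

print(pure_total)                  # 240 pure expressions
print(blended_total)               # 64 blended presentations
print(pure_total + blended_total)  # 304 trials in all
```

The 304-trial total is the figure the Study 2 Procedure later contrasts with its own 240 stimuli.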

Procedure

Each participant was seated approximately 24 in. in front of a 15-in. monitor with a serial mouse for data acquisition with millisecond precision. Stimulus trials were presented using SuperLab Pro (see Haxby, Parasuraman, Lalonde, & Abboud, 1993). Participants were instructed to indicate via a right or left mouse click whether each face displayed anger or fear. Each face was presented in the center of the computer screen. Each trial began with a 500-ms fixation point, which was immediately replaced by a blank screen for 50 ms before the onset of the stimulus face. The face remained on the screen until a response was made. Participants were asked to label each face as quickly and as accurately as possible.

Results

Pure expressions

Prior to analyses, the data were trimmed and log-transformed. Responses more than 3 standard deviations above or below the mean were excluded from analysis (0.76%), as were trials resulting in incorrect responses (10.1%). For ease of interpretation, we converted data back into milliseconds for reporting the means and standard errors.

The central hypothesis was tested with a 2 (anger vs. fear expression) × 2 (direct vs. averted gaze) repeated measures analysis of variance. A significant main effect of emotion emerged, F(1, 31) = 8.05, p = .01, r = .45; anger faces were correctly labeled more quickly (M = 888.2 ms, SE = 23.9 ms) than were fear faces (M = 917.9 ms, SE = 25.5 ms). This main effect was qualified by the predicted interaction between emotion and gaze direction, F(1, 31) = 40.37, p < .0001, r = .75 (see Table 1). As expected, anger expressions were more quickly decoded when displayed in conjunction with direct than with averted gaze, t(31) = 4.88, p < .0001, r = .66. Fear expressions, in contrast, were more quickly decoded when displayed in conjunction with averted than with direct gaze, t(31) = -5.27, p < .0001, r = .69.

Table 1. Mean response latencies (in milliseconds) to correctly label anger and fear expressions, as a function of gaze direction

                        Response latency      Standard error
Type of emotional       Direct    Averted     Direct    Averted
expression              gaze      gaze        gaze      gaze
Anger                   862.3     914.1       23.5      25.6
Fear                    944.5     891.2       27.5      24.4

Blended expressions

The use of emotion-blended expressions allowed us to test whether gaze direction could shift the relative perceptual dominance of one emotion over the other. Using the proportion of fear versus anger labels as the dependent variable of interest, we computed a direct t-test comparison. Blended expressions were given approximately equal numbers of fear labels and anger labels when displaying direct gaze (M = .51, SE = .025), whereas they were given more fear labels than anger labels when displaying averted gaze (M = .68, SE = .028), t(31) = 5.86, p < .0001, r = .72.

Discussion

Study 1 offers preliminary evidence for the role of gaze direction in facilitating the processing of facial displays of emotion. The reaction time data for labeling pure anger and fear expressions showed that gaze directly influences the time it takes to make correct emotion judgments. The categorical index for the blended expressions showed that gaze direction can shift the relative perceptual dominance of one emotion over another. These findings suggest that gaze direction influences the processing of facial anger and fear displays, in terms of both processing speed and perceptual interpretation. To replicate these findings and firmly establish that these processing differences are not restricted to threat-related emotional expressions (perhaps as part of an early-warning mechanism), we conducted Study 2 using different approach- and avoidance-oriented emotional expressions, joy and sadness.

STUDY 2

Method

Participants

Seventeen male and 11 female undergraduate students were recruited. Participants received either partial course credit or $7. They came to the laboratory in small groups of no more than 6 people.

Materials

As in Study 1, facial stimuli displaying pure facial expressions (in this case, joy or sadness) were used. All stimuli were selected from the


same facial sets and prepared according to the same parameters as described for Study 1. Blended expressions combining joy and sadness expressions, however, did not yield suitable stimuli, and therefore were not included in this study. Notably, researchers have historically had difficulty blending expressions of positive and negative emotions (Nummenmaa, 1990).

Procedure

The task was identical to the task in Study 1, except that joy and sadness emotion judgments were made and participants saw a total of 240 rather than 304 stimuli.

Results

The dependent measure of interest was the mean latency of response for correctly labeling faces as displaying either joy or sadness. Prior to analyses, the data were trimmed and log-transformed. Outliers (1.2%) and incorrect responses (5.1%) were dropped from the analyses. As for Study 1, the means and standard errors reported here were transformed back to milliseconds.

To test the hypothesis that gaze would influence the speed with which facial expressions were labeled, we computed a 2 (joy vs. sadness expression) × 2 (direct vs. averted gaze) repeated measures analysis of variance. A main effect for emotion emerged, F(1, 27) = 15.26, p < .0001, r = .6; joyous faces (M = 609.5 ms, SE = 16.0 ms) were correctly labeled more quickly than sad faces (M = 633.9 ms, SE = 15.2 ms). The predicted interaction between emotion and gaze direction was found, F(1, 27) = 20.97, p < .0001, r = .66 (see Table 2). As expected, joy expressions were more quickly decoded when displayed with direct than averted gaze, t(27) = 3.51, p < .01, r = .56. Sadness expressions were more quickly decoded when displayed with averted than direct gaze, t(27) = -2.63, p < .02, r = .45.

Discussion

The results for joy and sadness expressions are consistent with those for anger and fear expressions, reported in Study 1. The latencies to correctly label the emotional expressions varied as a joint function of gaze direction and whether the facial display in question represented an approach- or avoidance-oriented emotion.

GENERAL DISCUSSION

In the current studies, anger and joy expressions (which are approach oriented) were more quickly labeled when presented with direct than with averted gaze, whereas fear and sadness expressions (which are avoidance oriented) were more quickly labeled when presented with averted than with direct gaze. The finding that gaze and facial expression are combined in the perceptual processing and interpretation of emotion suggests a process that is potentially highly evolved. As noted, a fundamental dimension of behavioral intention is approach and avoidance. The ability to detect another's intention to either approach or avoid is seen by many researchers as a principal mediating factor governing social interaction. Given that both emotion expression and gaze behavior reflect these underlying approach-avoidance motivational intentions, the integration of these cues in social perception likely subserves an adaptive function.

The present findings might seem counterintuitive given recent research on gaze direction. For instance, Macrae et al. (2002) used a gender detection task to test the effects of gaze direction on the processing efficiency of person construal. They found that participants were faster to correctly label the gender of faces displaying direct gaze than faces displaying averted or closed eyes. In addition, Driver et al. (1999) used a letter discrimination task to test the effects of gaze direction as an attentional cuing device. They found that the gaze direction of a stimulus face triggers a reflexive shift of attention in the same direction in the observer. From these studies, one might conclude that direct gaze ought to facilitate the processing of all facial displays of emotion as a function of where the observer's attentional resources are allocated; direct gaze seems to shift attentional resources toward the face, whereas averted gaze appears to shift attentional resources away. The current studies, however, demonstrate that the influence of gaze direction on the perception of facial displays of emotion varies depending on the motivational orientation associated with the emotion being expressed.

Although many aspects of facial information appear to be independently processed (e.g., lip reading, emotional expression, gender, and age; Bruce & Young, 1986; Young, 1998), the current findings suggest that gaze direction and facial expressions of emotion are not. By merging the study of facial expressions with the study of perception of gaze direction, the current research confirms that these cues interact meaningfully in the perceptual processing of emotionally relevant facial information. This finding is important for a number of reasons. It suggests that gaze direction is an important cue in the perceptual processing of facial displays of emotion, which has not been previously demonstrated in the research literature. It also suggests that the effects of gaze direction on the processing of emotion depend on the specific type of emotion displayed by the face. An understanding of the precise nature of this interaction and the brain mechanisms that support it awaits further research developments.

Table 2. Mean response latencies (in milliseconds) to correctly label joy and sadness expressions, as a function of gaze direction

                        Response latency      Standard error
Type of emotional       Direct    Averted     Direct    Averted
expression              gaze      gaze        gaze      gaze
Joy                     600.8     618.3       15.5      17.0
Sadness                 641.4     626.4       15.2      15.8
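The reaction-time preprocessing reported in both Results sections (exclusion of responses beyond 3 standard deviations, log-transformation, and conversion back to milliseconds for reporting) can be sketched as follows. This is an illustrative reconstruction on invented data, not the authors' analysis code, and the function name is my own:

```python
import math
import statistics

def trimmed_latency_mean(rts_ms, sd_cutoff=3.0):
    """Drop latencies more than sd_cutoff standard deviations from the
    mean, average the survivors in log space, and convert the result
    back to milliseconds (i.e., report a geometric mean)."""
    m = statistics.mean(rts_ms)
    sd = statistics.stdev(rts_ms)
    kept = [rt for rt in rts_ms if abs(rt - m) <= sd_cutoff * sd]
    log_mean = statistics.mean(math.log(rt) for rt in kept)
    return math.exp(log_mean)

# Invented latencies for one condition, with one implausibly slow trial.
latencies = [880, 900, 920] * 7 + [5000]
print(round(trimmed_latency_mean(latencies)))  # ~900 ms; the 5,000-ms trial is excluded
```

Averaging in log space before back-transforming downweights the long right tail typical of reaction-time distributions, which is presumably why the authors report means "converted back into milliseconds."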

Acknowledgments - The authors would like to thank Katharine D. Adams and Abigail A. Marsh for helpful comments on an earlier draft; Brent F.
Jones for help creating the stimuli and implementing the studies; and
Punam A. Keller, C. Neil Macrae, Jennifer A. Richeson, and George L.
Wolford for suggestions regarding experimental design and data analysis.
This research was supported in part by a Doctoral Dissertation Improvement Grant from the National Science Foundation (Award No. 0121947) to
Reginald B. Adams, Jr., and by a Rockefeller Reiss Family Senior Faculty
grant to Robert E. Kleck.

REFERENCES

Adams, R.B., Jr., & Kleck, R.E. (2001). [Young adult facial displays]. Unpublished set of photographs.
Argyle, M., & Cook, M. (1976). Gaze and mutual gaze. New York: Cambridge University Press.
Baron-Cohen, S. (1997). Mindblindness: An essay on autism and theory of mind. Cambridge, MA: MIT Press.
Beaupré, M., Cheung, N., & Hess, U. (2000, October). La reconnaissance des expressions émotionnelles faciales par des décodeurs africains, asiatiques et caucasiens. Poster presented at the annual meeting of the Société Québécoise pour la Recherche en Psychologie, Hull, Quebec, Canada.
Brothers, L., & Ring, B. (1992). A neuroethological framework for the representation of minds. Journal of Cognitive Neuroscience, 4, 107-118.
Bruce, V., & Young, A. (1986). Understanding face recognition. British Journal of Psychology, 77, 305-327.
Davidson, R.J., & Hugdahl, K. (Eds.). (1995). Brain asymmetry. Cambridge, MA: MIT Press.
Driver, J., Davis, G., Ricciardelli, P., Kidd, P., Maxwell, E., & Baron-Cohen, S. (1999). Gaze perception triggers reflexive visuospatial orienting. Visual Cognition, 6, 509-540.
Ekman, P., & Friesen, W.V. (1978). The Facial Action Coding System: A technique for the measurement of facial movement. Palo Alto, CA: Consulting Psychologists Press.
Fehr, B.J., & Exline, R.V. (1987). Social visual interaction: A conceptual and literature review. In A.W. Siegman & S. Feldstein (Eds.), Nonverbal behavior and communication (pp. 225-325). Hillsdale, NJ: Erlbaum.
Harmon-Jones, E., & Sigelman, J. (2001). State anger and prefrontal brain activity: Evidence that insult-related relative left-prefrontal activation is associated with experienced anger and aggression. Journal of Personality and Social Psychology, 80, 797-803.
Haxby, J.V., Parasuraman, R., Lalonde, F., & Abboud, H. (1993). SuperLab: General-purpose Macintosh software for human experimental psychology and psychological testing. Behavior Research Methods, Instruments, & Computers, 25, 400-405.
Hess, E.H., & Petrovich, S.B. (1987). Pupillary behavior in communication. Hillsdale, NJ: Erlbaum.
Hinde, R.A., & Rowell, T.E. (1962). Communication by posture and facial expression in the rhesus monkey. Proceedings of the Zoological Society of London, 138, 1-21.
Janik, S.W., Wellens, A.R., Goldberg, M.L., & Dell'Osso, L.F. (1978). Eyes as the center of focus in the visual examination of human faces. Perceptual & Motor Skills, 47, 857-858.
Kirouac, G., & Doré, F.Y. (1984). Judgment of facial expressions of emotion as a function of exposure time. Perceptual & Motor Skills, 59, 147-150.
Kleinke, C.L. (1986). Gaze and eye contact: A research review. Psychological Bulletin, 100, 78-100.
Macrae, C.N., Hood, B.M., Milne, A.B., Rowe, A.C., & Mason, M.F. (2002). Are you looking at me? Gaze and person perception. Psychological Science, 13, 460-464.
Morton, J., & Johnson, M. (1991). CONSPEC and CONLERN: A two-process theory of infant face recognition. Psychological Review, 98, 164-181.
Nummenmaa, T. (1990). Sender repertoires of pure and blended facial expressions of emotion. Scandinavian Journal of Psychology, 31, 161-180.
Perrett, D.I., & Emery, N.J. (1994). Understanding the intentions of others from visual signals: Neurophysiological evidence. Cognitive/Current Psychology of Cognition, 13, 683-694.
Redican, W.K. (1982). An evolutionary perspective on human facial displays. In P. Ekman (Ed.), Emotion in the human face (pp. 212-280). New York: Cambridge University Press.
van Hoof, J.A.R.A.M. (1972). A comparative approach to the phylogeny of laughter and smiling. In R.A. Hinde (Ed.), Non-verbal communication (pp. 209-241). New York: Cambridge University Press.
Young, A.W. (1998). Face and mind. Oxford, England: Oxford University Press.

(Received 10/16/02; Revision accepted 1/10/03)
