
Cognition and Emotion

ISSN: 0269-9931 (Print) 1464-0600 (Online) Journal homepage: http://www.tandfonline.com/loi/pcem20

Categorical and dimensional perceptions in decoding emotional facial expressions

Tomomi Fujimura, Yoshi-Taka Matsuda, Kentaro Katahira, Masato Okada & Kazuo Okanoya

To cite this article: Tomomi Fujimura, Yoshi-Taka Matsuda, Kentaro Katahira, Masato Okada & Kazuo Okanoya (2012) Categorical and dimensional perceptions in decoding emotional facial expressions, Cognition and Emotion, 26:4, 587-601, DOI: 10.1080/02699931.2011.595391

To link to this article: http://dx.doi.org/10.1080/02699931.2011.595391

Copyright Psychology Press Ltd

Published online: 09 Aug 2011.

2012, 26 (4), 587-601

Categorical and dimensional perceptions in decoding emotional facial expressions

Tomomi Fujimura1,2, Yoshi-Taka Matsuda1,2, Kentaro Katahira1,3, Masato Okada1,3, and Kazuo Okanoya1,2,4

1 Japan Science and Technology Agency, ERATO, Okanoya Emotional Information Project, Saitama, Japan
2 RIKEN Brain Science Institute, Saitama, Japan
3 Graduate School of Frontier Sciences, The University of Tokyo, Kashiwa, Chiba, Japan
4 Department of Life Sciences, Graduate School of Arts and Sciences, The University of Tokyo, Tokyo, Japan

We investigated whether categorical perception and dimensional perception can co-occur while decoding emotional facial expressions. In Experiment 1, facial continua with endpoints consisting of four basic emotions (i.e., happiness-fear and anger-disgust) were created by a morphing technique. Participants rated each facial stimulus using a categorical strategy and a dimensional strategy. The results show that the happiness-fear continuum was divided into two clusters based on valence, even when using the dimensional strategy. Moreover, the faces were arrayed in order of the physical changes within each cluster. In Experiment 2, we found a category boundary within other continua (i.e., surprise-sadness and excitement-disgust) with regard to the arousal and valence dimensions. These findings indicate that categorical perception and dimensional perception co-occurred when emotional facial expressions were rated using a dimensional strategy, suggesting a hybrid theory of categorical and dimensional accounts.

Keywords: Hybrid theory of emotion; Dimension; Category; Facial expressions.

In recent years, psychology has been dominated by two theories of emotion: a categorical theory and a dimensional theory. The categorical theory proposes the presence of six basic, distinct, and universal emotions (Ekman, 1992; Ekman & Friesen, 1971, 1976; Ekman, Sorenson, & Friesen, 1969): happiness, anger, sadness, surprise, disgust, and fear (Ekman & Friesen, 1971; Johnson-Laird & Oatley, 1992; Tomkins & McCarter, 1964), whereas the dimensional theory proposes fundamental dimensions that constitute emotional spaces (Russell, 1980; Russell & Bullock, 1985). Dimensional theorists posit two fundamental dimensions: valence, which represents the hedonic tone or pleasantness-unpleasantness continuum, and level of arousal (Russell, 1980; Russell & Bullock, 1985) or tension (Schlosberg, 1954), which refers to the level of energy.

Correspondence should be addressed to: Kazuo Okanoya, Japan Science and Technology Agency, ERATO, Okanoya Emotional Information Project, RIKEN, 2-1 Hirosawa, Wako-shi, Saitama, 351-0198, Japan. E-mail: okanoya@brain.riken.jp

© 2012 Psychology Press, an imprint of the Taylor & Francis Group, an Informa business 587
http://www.psypress.com/cogemotion http://dx.doi.org/10.1080/02699931.2011.595391

Research on facial expressions, a powerful medium for emotional communication, has demonstrated categorical and dimensional perception in decoding facial expressions. For example, based on the categorical-perception effect (Harnad, 1987), Etcoff and Magee (1992) explored categorical perception of facial expressions by morphing images from two face drawings into prototypes representing the six basic emotions. Two tasks, identification and discrimination, were used to assess categorical perception. During the identification task, participants identified each stimulus from two alternative options derived from prototypical facial expressions. During the discrimination task, they discriminated pairs of stimuli along a continuum based on the physical features of faces. Etcoff and Magee (1992) found that participants were able to discriminate particular pairs of stimuli more accurately than other pairs of stimuli, and they identified a categorical boundary that separated the two. That is, they found evidence for a human ability to categorise emotional facial expressions, which was particularly robust in the case of facial expressions depicted in photographs (Calder, Young, Perrett, Etcoff, & Rowland, 1996; Young et al., 1997). In recent years, the nature of categorical perception of facial expressions has been further explored. Roberson and Davidoff (2000) found that verbal interference during a discrimination task diminished the categorical-perception effect, but visual interference did not, indicating that categorical perception involves a verbal code representing target expressions. As this finding has been replicated (Roberson, Damjanovic, & Pilling, 2007), it appears that the categorical-perception effect, in which cross-category pairs are discriminated better than are within-category pairs, emerges from the generation of verbal coding (see Roberson, Damjanovic, & Kikutani, 2010, for a review).

Contrary to accounts focusing on categorical perception, several studies have reported that emotional facial expressions were recognised based on dimensions of emotional space. Takehara and Suzuki (1997) collected psychological ratings in response to morphed facial images of a real person. They analysed the data using multidimensional scaling, which enables visualisation of similarities or dissimilarities in the data by locating the more similar stimuli closer to one another on a plot. Takehara and Suzuki (1997) indicated that each original facial expression created a circumplex structure that included both valence and arousal dimensions. Notably, they also found that the morphed images did not cluster into categories, but were located intermediately between the two original facial expressions. This result suggests that facial expressions are recognised dimensionally. The dimensional account of facial expression recognition has been replicated (Katsikitis, 1997; Takehara & Suzuki, 2001).

Thus, evidence supporting both categorical and dimensional approaches to facial expression recognition has been produced. In response to the conflict between these approaches, a hybrid of the categorical and dimensional theories has been proposed (Christie & Friedman, 2004; Panayiotou, 2008; Russell, 2003). According to the hybrid theory, people can use both categorical perception and dimensional perception to decode facial expressions. However, the relative dominance of categorical or dimensional perception and the ways in which these modes interact with each other remain unclear.

Interpretations of previous research findings showing that either categorical or dimensional perception is dominant should consider the response formats used for the psychological ratings and the methods used for the data analysis. For example, identification tasks require participants to classify a facial expression into one emotional category. Discrimination tasks can assess the ability to detect a difference between two facial expressions. These explicit strategies to evaluate facial expressions may lead to evidence of categorical perception. On the other hand, multidimensional scaling is appropriate to an alternative approach in which emotional space is seen as derived from similarities of facial stimuli, and valence and arousal dimensions are viewed as arbitrarily determined (Katsikitis, 1997; Takehara & Suzuki, 1997). Acknowledgement that the response format and the method of data analysis may influence

588 COGNITION AND EMOTION, 2012, 26 (4)


evidence for categorical or dimensional perception is crucial in evaluating research on the recognition of facial expressions.

The present study investigated how a hybrid of categorical and dimensional theories can be applied to the recognition of facial expressions. The facial stimuli were created by a morphing technique so that they could be interpreted from both categorical and dimensional perspectives. In Experiment 1, two facial continua represented changes in valence and arousal. The endpoints of these continua were members of the six basic emotions (i.e., happiness-fear for valence and anger-disgust for arousal). In Experiment 2, two additional facial continua, surprise-sadness and excitement-disgust, were constructed from high- and low-arousal expressions to represent a fuller range of the arousal dimension. In terms of response formats, we employed identification and discrimination (e.g., Young et al., 1997) tasks for the categorical strategy and the Affect Grid (Russell, Weiss, & Mendelsohn, 1989) for the dimensional strategy. The Affect Grid is a means of assessing affect along the dimensions of valence and arousal.

To test the hybrid theory, we examined whether categorical or dimensional perception was totally dependent on response format. If so, category boundaries within each continuum would be found only by use of the identification and discrimination tasks, and each stimulus would be located along the continuum defined by the valence or arousal dimension of the Affect Grid. On the other hand, if categorical perception and dimensional perception can occur irrespective of response format, it is possible that the two co-occur within one strategy. That is, if people use both categorical perception and dimensional perception simultaneously, a continuum of facial stimuli should be divided into two clusters, and facial stimuli should be simultaneously arranged within each cluster according to their emotional intensity on the Affect Grid. With respect to the identification and discrimination tasks, we should find an explicit categorical boundary on the facial-stimuli continuum. In addition, task performance in response to each stimulus within each cluster should show linear changes, indicating that dimensional perception partly contributed to a categorical strategy. Resolution of these issues can provide useful evidence supporting the existence of a hybrid of categorical and dimensional theories.

EXPERIMENT 1

Experiment 1 investigated whether categorical or dimensional perception occurs irrespective of the response format (i.e., the categorical or dimensional strategy). To allow the two types of perception to be demonstrated, facial continua representing valence and arousal were created from facial depictions of four basic emotions.

Method

Participants
Twenty-two adults (12 men and 10 women; Mage = 33.39, SD = 4.17 years) participated in the study. They were recruited by advertisements placed by an intermediary company, and their occupational backgrounds varied widely. They received a reward for participating in the experiment. All of the participants were native Japanese speakers and had normal or corrected-to-normal vision.

Facial stimuli
The emotional faces were depicted by two models: a man selected from Pictures of Facial Affect (Ekman & Friesen, 1976) and a woman drawn from examples of facial stimuli (Russell, 1997). To create continua that changed according to valence and arousal, we chose expressions of happiness and fear and of anger and disgust as endpoints. These are also prototypical facial expressions that are included in the six basic emotions (Ekman, 1992; Ekman & Friesen, 1971). The disgusted expression portrayed by the woman was originally defined as sadness or fatigue by Russell (1997). However, this face was identified by Japanese participants as demonstrating disgust (Ogawa, Fujimura, & Suzuki, 2005). Therefore, we decided to define the face


as disgust in this study because all of our participants were native Japanese speakers.

To represent the emotional continua according to valence and arousal, seven morphed images were created for each pair of faces. Each continuum consisted of nine photographs of faces (i.e., the original faces and morphed images; see Figure 1). For example, the happiness-fear continuum included faces blending the two emotions in the following proportions: 100:0 (happiness 100%, fear 0%), 87.5:12.5 (happiness 87.5%, fear 12.5%), 75:25 (happiness 75%, fear 25%), 62.5:37.5 (happiness 62.5%, fear 37.5%), 50:50 (happiness 50%, fear 50%), 37.5:62.5 (happiness 37.5%, fear 62.5%), 25:75 (happiness 25%, fear 75%), 12.5:87.5 (happiness 12.5%, fear 87.5%), and 0:100 (happiness 0%, fear 100%). We defined the facial stimuli using 12.5% increments in the degree of morphing with respect to happiness or anger. The happiness-fear continuum represents change in the pleasantness-unpleasantness continuum, and the anger-disgust continuum represents change in the arousal-sleepiness continuum.

Apparatus
Experimental events were controlled by a program written in Inquisit 3.0 (Millisecond) and were implemented on a computer (Vostro 420, Dell) using the Microsoft Windows operating system, Windows XP. Stimuli were presented on a 19-inch LCD monitor (E1902S, Iiyama; 1024 × 768 pixels, 75 Hz refresh rate) and subtended a visual angle of about 10.08 × 7.38°.

Procedure
Participants rated facial stimuli using three types of tasks: the identification task, the ABX discrimination task, and the Affect Grid. No time restrictions were applied. Four training trials were conducted before each task. The order of the three tasks was counterbalanced across participants.

Identification task. Participants were asked to identify a facial expression by choosing between the two emotions on the endpoints of the continuum to which the depiction belonged. For example, participants identified the happiness 87.5% stimulus as either happiness or fear. Each trial began with a 250 ms presentation of a fixation point, followed by a 250 ms blank screen, and then a 300 ms facial stimulus. After a 250 ms mask consisting of a cluster of asterisks, two emotional words were presented, and participants chose one word by pressing the assigned button. Each face was presented eight times in random order, yielding a total of 288 trials. These trials were divided into four blocks on the basis of two models and the two continua (i.e.,


Figure 1. Facial stimuli used in this experiment. The original faces were drawn from Ekman and Friesen (1976). [Panels show the nine-step happiness-fear and anger-disgust continua, with each emotion varying from 0% to 100% in 12.5% increments.]
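The blend proportions above follow a simple linear rule. As an illustration (our own sketch, not the authors' materials; the variable names are ours), the nine morph weights per continuum can be generated as:

```python
# Hypothetical sketch of the morph weights described in the text:
# nine steps per continuum, in 12.5% increments.
levels = [i * 12.5 for i in range(9)]            # 0.0, 12.5, ..., 100.0
happiness_fear = [(h, 100 - h) for h in levels]  # (happiness %, fear %)
```

The same weights apply to the anger-disgust pair, with anger taking the role of happiness.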

590 COGNITION AND EMOTION, 2012, 26 (4)


happiness-fear and anger-disgust). The order of the four blocks was randomly determined across participants.

ABX discrimination task. The ABX discrimination task required participants to discriminate between faces on a continuum. Each trial began with a fixation point presented for 250 ms, followed by a blank screen lasting 250 ms, and then three successive images of faces. The first (A) and second (B) faces were presented for 300 ms each, and the third (X) face was presented for 1000 ms. The blank interval between A and B was 250 ms, that between B and X was 1000 ms, and that after X was 250 ms. Participants were asked to press a response button to indicate whether X matched A or B.

In each trial, facial stimuli A and B differed by two steps on a continuum, yielding a 25% gap between paired faces (e.g., happiness 100% and happiness 75%), resulting in seven possible pairs on each continuum. The third face, X, was always identical to either A or B. Four presentation orders were possible: (ABA), (ABB), (BAA), and (BAB). The same order was presented twice for each pair, yielding a total of 56 trials for each continuum. One block consisted of pairs from one continuum, resulting in a total of four blocks. The order of trials within a block and the blocks themselves were randomised across participants.

Affect Grid. The 9 × 9 Affect Grid assesses affect along the dimensions of valence and arousal (Russell et al., 1989). Participants were asked to rate the emotion expressed by a face by using a computer mouse to select the appropriate location on a two-dimensional square representing emotional space. Each trial began with a 250 ms

Figure 2. (a) The mean identification rates for the happiness-fear and anger-disgust continua. These rates show the frequency of the identification of happiness or fear and anger or disgust. Labels along the x-axis indicate the percentage of a particular emotion in facial stimuli. For example, Ha 87.5 means that happiness represents 87.5% of the face and that fear represents 12.5% of the face. (b) The mean of the observed discrimination rates and the predicted data for each continuum. These rates were based on the frequency of correct responses in the ABX discrimination task. The labels show which facial stimuli were paired. For example, Ha 50-25 indicates a trial in which happiness 50% and happiness 25% were presented as A and B, respectively.



fixation point, followed by a 250 ms blank screen, we obtained the predicted performance on the
and then a facial stimulus presented for 300 ms. discrimination task. If these predicted values
Following a mask of asterisks lasting 250 ms, the correlated with the observed ABX discrimination
Affect Grid was displayed until the participant data, we could conclude that categorical perception
responded. Each facial stimulus was presented occurred within that continuum.
twice in random order, yielding a total of 72 trials The bottom portion of Figure 2 shows the
that were divided into two blocks based on our predictions and the mean actual correct rates for
models. The order of blocks was counterbalanced the discrimination task. As the observed and
across participants. predicted curves seem to fit, correlations between
the observed and predicted results for each con-
tinuum were significant, happinessfear: r .90,
t(5) 4.62, pB.01; angerdisgust: r .85,
The upper portion of Figure 2 shows the mean t(5) =3.67, pB.01. That is, categorical perception
percentages for two identified emotions on each made at least some contribution to responses to
continuum: happiness or fear and anger or disgust. the facial stimuli within each continuum. If
Visual inspection shows that identification rates categorical perception occurs for each continuum,
were nonlinearly distributed, indicating an abrupt participants should discriminate better between a
category shift. In terms of the happinessfear pair of facial stimuli that cross a categorical
continuum, happiness 62.5% or 50% seemed to boundary than between pairs of facial stimuli
constitute the category boundary between happi- within the same category. To confirm this hypoth-
ness and fear. Identification rates for the anger esis, the peak correct rate was contrasted with
disgust continuum also showed patterns similar the mean of the correct rates on all the other pairs.
to those manifested in response to the happiness A t-test revealed that the correct rate for the
fear continuum. happiness 75% and 50% pair was significantly
To assess the occurrence of categorical percep- higher than for other pairs on the happinessfear
tion, we applied a method used in previous studies continuum, t(21) 3.12, pB.01. On the anger
(Calder et al., 1996; Young et al., 1997). First, we disgust continuum, performance for anger 75%
predicted subjects performance in the ABX dis- and 50% was also significantly better than that for
crimination task on the basis of the identification all other pairs, t(21) =4.83, pB.01. These results
and ABX discrimination data. This approach indicate that happiness 62.5% and anger 62.5%
assumes that two factors determine the ability to may constitute a category boundary for each
discriminate between two facial expressions: the continuum.
physical differences between pairs of facial stimuli The mean scores on the Affect Grid are
irrespective of their expressions, and the contribu- shown in Figure 3a. Facial stimuli within the
tion of the categorical perception of facial expres- happinessfear or the angerdisgust continuum
sions. To estimate the first factor, we employed changed in accordance with valence or arousal.
the mean of the discrimination rates for the pairs Notably, the happinessfear continuum seemed
at the endpoints of each continuum. Categorical to show a gap between happiness 62.5% and
perception did not contribute significantly to 50% and happiness 50% and 37.5%, indicating
these results because these stimuli were near the likelihood of a category boundary. To visualise
prototypical facial expressions. For the second the distributions of the data on the Affect Grid,
factor, we calculated the differences between the frequency histograms for the happinessfear and
identification rates for the two relevant stimuli the angerdisgust continua, in which each facial
in each pair and multiplied the difference by stimulus was rated on a grid of a dimension,
0.25 (a constant). By summing the two estimates, are shown in Figure 3b. The rating data for each



Figure 3. The results for Experiment 1. (a) The mean ratings on the Affect Grid. (b) Histograms showing how frequently each facial stimulus was assigned on a grid for the valence or the arousal scale. The data were obtained from all participants and were averaged for each type of stimulus across all participants. Solid lines show normal distributions. (c) The mean distances between faces using a two-step method of the valence ratings for the happiness-fear continuum and the arousal ratings for the anger-disgust continuum. The label 100-75 refers to the distance between happiness (or anger) 100% and happiness (or anger) 75%, respectively, on valence or arousal ratings.
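The two-factor prediction for the ABX task can be sketched as follows. This is our own reconstruction of the computation described in the text, not the authors' code (function and variable names are ours, and the sample numbers in the usage are made up): the physical factor is the mean observed discrimination rate for the endpoint pairs, and the categorical factor is the identification-rate difference within each pair scaled by the 0.25 constant.

```python
import numpy as np

def predicted_abx(ident_rates, endpoint_pair_rates):
    """Predicted ABX accuracy for the seven two-step pairs of a 9-step continuum.

    ident_rates: proportion of trials each of the nine morphs was identified
        as the first endpoint emotion (e.g., happiness).
    endpoint_pair_rates: observed discrimination rates for the pairs at the
        ends of the continuum; their mean estimates the physical factor.
    """
    physical = float(np.mean(endpoint_pair_rates))
    preds = []
    for a in range(7):                      # pairs (0,2), (1,3), ..., (6,8)
        b = a + 2
        categorical = 0.25 * abs(ident_rates[a] - ident_rates[b])
        preds.append(physical + categorical)
    return preds
```

On this sketch, a pair straddling a sharp identification boundary gains up to 0.25 over the physical baseline, which would reproduce the peaked predicted curves of Figure 2b.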



morphing of the stimuli were averaged for each participant. We applied a Gaussian-mixture model (GMM; McLachlan & Basford, 1988), which uses models of probabilities to account for clustering in the distribution of data. The Bayesian information criterion (BIC) was used to evaluate the fitness of a model. Smaller BIC values indicate more appropriate models. We found two normal distributions in the data for the happiness-fear continuum (single Gaussian distribution: BIC = 852.23; two Gaussian distributions: BIC = 835.37). On the other hand, one normal distribution appeared for the anger-disgust continuum (single Gaussian distribution: BIC = 596.73; two Gaussian distributions: BIC = 604.06). Valence was divisible into two clusters, whereas arousal was represented on a continuum.

To verify the occurrence of categorical perception even in the presence of ratings based on continua in emotional space, we calculated the distance between facial stimuli differing by two steps on each continuum with respect to valence or arousal ratings. Figure 3c shows the mean distance of the pairs of facial stimuli with respect to valence ratings on the happiness-fear continuum and with respect to arousal ratings on the anger-disgust continuum. We conducted an analysis identical to that used in the ABX discrimination task to confirm whether the distance between happiness 62.5% and 37.5% on the valence rating was the largest on the happiness-fear continuum. The peak distance was compared to the mean distances of all the other pairs combined. A t-test showed that the distance between happiness 62.5% and 37.5% was significantly larger than was the average distance across other pairs on the happiness-fear continuum, t(21) = 4.69, p < .01. Moreover, we found a significant difference in the distance on the arousal dimension between surprise (excitement) 62.5% and 37.5% and the average distances between all other pairs, surprise-sadness: t(21) = 4.17, p < .01; excitement-disgust: t(21) = 2.98, p < .01. These findings indicate that the fifty-fifty faces within the surprise-sadness and the excitement-disgust continua were category boundaries on the arousal as well as on the valence dimension.

Discussion

Consistent with previous studies (Calder et al., 1996; Young et al., 1997), we found a category boundary on the happiness-fear and anger-disgust continua in the identification and discrimination tasks. The identification performance on both continua of stimuli showed a rapid shift in the middle of each continuum to the prototypical facial expressions, indicating non-linear change. If psychological ratings reflected the physical features of facial stimuli, identification rates for each facial stimulus would correspond to the intensity of emotion depicted in the facial stimulus. However, we found a steep shift in the identification rates on the happiness-fear and the anger-disgust continua, which likely constituted category boundaries. The discrimination data show that participants discriminated better between a pair of facial stimuli that crossed a category boundary than between those within a category boundary. Consequently, categorical perception contributed to the discrimination of facial expressions over and above the contribution of physical differences. Moreover, significant correlations between the observed and the predicted discrimination rates for both continua emerged. These results provide evidence of the categorical-perception effect.

With respect to the Affect Grid ratings, we found that the happiness-fear continuum was divided into two clusters in terms of valence ratings. The gap between happiness 62.5% and 37.5% was larger than that between other pairs, indicating that happiness 50% was a boundary between pleasantness and unpleasantness. In addition, the distribution of valence ratings on the continuum was divided into two clusters based on the fit of two normal distributions. These results provide evidence that categorical perception occurred in response to the happiness-fear continuum even when using a dimensional strategy. This is the first finding to indicate that categorical perception contributes to ratings of facial expressions even when an explicit dimensional strategy such as the Affect Grid is used. This suggests that the relative dominance of categorical or




Figure 4a. The results of Experiment 2. (a) The mean ratings on the Affect Grid.

dimensional perception does not totally depend on the response format (i.e., on a categorical or a dimensional strategy).

Given that categorical perception was totally dominant on the happiness-fear continuum, even with a dimensional strategy, the morphed facial stimuli around the prototypical faces likely merged (i.e., happiness and fear). However, our results also indicate that the facial stimuli were arranged in the correct order, which corresponded to the physical changes in each cluster. Constant physical differences in the facial continuum reflected the intensity of happiness and fear expressed in the facial stimuli. That is, participants may have rated facial stimuli by clustering them into pleasant and unpleasant and then evaluated their emotional intensity with regard to valence. This suggests that categorical perception and dimensional perception co-occur in valence ratings of facial expressions. Consequently, our findings support a hybrid theory that combines dimensional and categorical accounts (Christie & Friedman, 2004; Panayiotou, 2008; Russell, 2003).

However, the anger-disgust continuum showed no boundary across clusters, suggesting that arousal level, in contrast to valence, is continuously represented. There are two possible explanations for the discrepancy between the results for the happiness-fear continuum and those for the anger-disgust continuum. First, happy faces have superiority in identification and discrimination tasks. Of the six basic emotions, happy faces are the most likely to be recognised with considerable accuracy (see Ekman, 1994, for a review). Furthermore, Calvo and Marrero (2009) reported that a happy face had a greater advantage in a visual search task compared with the five other basic emotions. Taken together, these findings suggest that it is possible that the happiness-fear pairs were discriminated better than the anger-disgust pairs due to the superiority of a happy face. Second, the anger-disgust continuum might be insufficient to fully represent the arousal dimension. The results of the Affect Grid indicate that the facial stimuli constituting the anger-disgust continuum were distributed within a relatively narrow range compared with those making up the happiness-fear continuum. Therefore, it is possible that the arousal dimension represented by the anger-disgust continuum was ineffective in eliciting categorical perception in a dimensional strategy. To test this possibility, we used two additional continua that encompassed the arousal dimension better than the anger-disgust continuum.
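The one-versus-two-Gaussian comparison used throughout the Results can be sketched as below. This is a minimal numpy reconstruction, not the authors' implementation (they follow McLachlan & Basford, 1988, and the function names here are ours): fit a single Gaussian in closed form, fit a two-component mixture with a bare-bones EM loop, and compare BIC values, where the lower BIC indicates the better model.

```python
import numpy as np

def bic_one_gaussian(x):
    # Closed-form fit; 2 free parameters (mean, variance).
    mu, var = x.mean(), x.var()
    ll = -0.5 * np.sum(np.log(2 * np.pi * var) + (x - mu) ** 2 / var)
    return 2 * np.log(x.size) - 2 * ll

def bic_two_gaussians(x, iters=200):
    # Bare-bones EM for a two-component mixture; 5 free parameters
    # (two means, two variances, one mixing weight).
    mu = np.percentile(x, [25, 75]).astype(float)
    var = np.array([x.var(), x.var()])
    w = np.array([0.5, 0.5])
    for _ in range(iters):
        dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
               / np.sqrt(2 * np.pi * var)
        resp = dens / dens.sum(axis=1, keepdims=True)   # E-step
        nk = resp.sum(axis=0)
        w = nk / x.size                                  # M-step
        mu = (resp * x[:, None]).sum(axis=0) / nk
        var = (resp * (x[:, None] - mu) ** 2).sum(axis=0) / nk
    dens = w * np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) \
           / np.sqrt(2 * np.pi * var)
    ll = np.sum(np.log(dens.sum(axis=1)))
    return 5 * np.log(x.size) - 2 * ll
```

On clearly bimodal ratings the two-component model attains the lower BIC, mirroring the happiness-fear valence result; on unimodal ratings the extra parameters are penalised, as with the anger-disgust arousal data.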




Figure 4b. (b) Histograms showing how frequently each facial stimulus was assigned on a grid for the valence and the arousal scale. The upper panels show the distribution of data for the surprise-sadness continuum. The lower panels show the data for the excitement-disgust continuum. Solid lines show normal distributions. (c) The mean distances between two-step faces in terms of the valence and arousal ratings. The label 100-75 refers to the distance between surprise (or excitement) 100% and surprise (or excitement) 75% in valence or arousal ratings.

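The two-step distance analysis behind Figures 3c and 4c, with its peak-versus-rest contrast, could be run as in the sketch below. This is our own hedged reconstruction, not the authors' code (names are ours, and the t statistic is computed as a standard paired test of the peak pair against the mean of the remaining pairs).

```python
import numpy as np

def two_step_distances(ratings):
    # ratings: one participant's valence (or arousal) rating for each of
    # the nine morph steps; returns the seven two-step pair distances.
    return np.array([abs(ratings[i] - ratings[i + 2]) for i in range(7)])

def peak_vs_rest(dist_matrix):
    # dist_matrix: (n_participants, 7) distances. Paired t-test of the
    # peak pair against the mean of the remaining pairs, in the spirit of
    # the reported t(21) contrasts.
    peak = int(np.argmax(dist_matrix.mean(axis=0)))
    rest = np.delete(dist_matrix, peak, axis=1).mean(axis=1)
    diff = dist_matrix[:, peak] - rest
    n = diff.size
    t = diff.mean() / (diff.std(ddof=1) / np.sqrt(n))
    return t, n - 1
```

With 22 participants this yields 21 degrees of freedom, matching the t(21) values reported for both experiments; a large positive t indicates that one pair's distance stands out as a boundary.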


EXPERIMENT 2

Experiment 2 examined whether categorical perception occurs in the context of a dimensional strategy when facial expressions are rated in terms of arousal as well as valence. To encompass a full range of arousal, we created two facial continua. One blended surprised and sad faces, which represented extremely high- and low-arousal facial expressions. The other was constructed from excitement and disgust faces, which represented high-arousal positive expressions and moderate-arousal negative expressions and were therefore located in opposite quadrants of the emotional space.

Method

Twenty-two adults (8 men and 14 women; Mage = 29.59 years, SD = 4.81) who were staff members at the RIKEN institute participated in the study. All of the participants were native Japanese speakers and had normal or corrected-to-normal vision.

Facial stimuli
We created two facial continua using the same morphing technique as in Experiment 1. One continuum was derived from the original surprised and sad faces posed by the male model in Experiment 1. The other was prepared by blending the excitement and disgust faces of the female model in Experiment 1. Each continuum consisted of nine photographs of faces, including the two original faces. We defined the facial stimuli using 12.5% increments in the degree of morphing with respect to surprise or excitement. That is, surprise 62.5% means that the face consisted of 62.5% surprise elements and 37.5% sadness elements.

Procedure
The apparatus and presentation settings were identical to those used in Experiment 1. Participants were asked to rate the emotions expressed by the faces using the Affect Grid. The timing and order of presentation were the same as in Experiment 1. Each facial stimulus was presented four times in random order, yielding a total of 72 trials that were divided into two blocks.

Results

The mean scores on the Affect Grid for the surprise–sadness and the excitement–disgust continua are shown in Figure 4a. Both facial continua formed clusters at the ends of the continuum, providing evidence of categorical perception.

Turning to the distributions of the data on the Affect Grid, frequency histograms of the valence and arousal ratings for the surprise–sadness and the excitement–disgust continua are shown in Figure 4b. We applied a GMM to estimate clustering in the distribution of data on both the valence and the arousal dimensions. For the surprise–sadness continuum, we found two normal distributions for the arousal data (single Gaussian distribution: BIC = 853.15; two Gaussian distributions: BIC = 813.83) but only one normal distribution for the valence data (single Gaussian distribution: BIC = 630.21; two Gaussian distributions: BIC = 632.14). For the excitement–disgust continuum, two normal distributions yielded a better fit than a single normal distribution for the valence data (single Gaussian distribution: BIC = 942.92; two Gaussian distributions: BIC = 815.10), but a single normal distribution was a better fit for the arousal data (single Gaussian distribution: BIC = 771.20; two Gaussian distributions: BIC = 772.27). Thus, the valence ratings for the excitement–disgust continuum and the arousal ratings for the surprise–sadness continuum were divisible into two clusters.

To assess the occurrence of categorical perception in the valence and arousal ratings, we calculated the distance between facial stimuli differing by two steps on each continuum, as in Experiment 1. Figure 4c shows the mean distances of the pairs of facial stimuli within the surprise–sadness and the excitement–disgust continua. For both continua, the distance between surprise (excitement) 62.5% and 37.5% was larger than that between all other pairs for both the valence and the arousal ratings. That is, a fifty-fifty face on each continuum was the likely category boundary for the valence and arousal ratings.
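The mixture-model comparison above can be sketched in plain Python. This is an illustrative re-implementation, not the authors' analysis code: it fits a single Gaussian by maximum likelihood and a two-component mixture by a simple EM loop, then compares BIC = k·ln(n) − 2·ln(L), where lower BIC indicates a better fit. In practice one would use a packaged mixture-model routine (cf. McLachlan & Basford, 1988); the ratings below are synthetic, not the published data.

```python
import math
import random

def normal_pdf(x, mu, var):
    return math.exp(-(x - mu) ** 2 / (2.0 * var)) / math.sqrt(2.0 * math.pi * var)

def bic_one_gaussian(data):
    """BIC for a single Gaussian fitted by maximum likelihood (k = 2 parameters)."""
    n = len(data)
    mu = sum(data) / n
    var = sum((x - mu) ** 2 for x in data) / n
    log_lik = sum(math.log(normal_pdf(x, mu, var)) for x in data)
    return 2 * math.log(n) - 2 * log_lik

def bic_two_gaussians(data, n_iter=300):
    """BIC for a two-component Gaussian mixture fitted by EM
    (k = 5: two means, two variances, one mixing weight)."""
    n = len(data)
    lo, hi = min(data), max(data)
    mu = [lo + 0.25 * (hi - lo), lo + 0.75 * (hi - lo)]
    var = [((hi - lo) / 4.0) ** 2 + 1e-6] * 2
    w = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibility of each component for each rating
        resp = []
        for x in data:
            p = [w[k] * normal_pdf(x, mu[k], var[k]) for k in (0, 1)]
            s = p[0] + p[1]
            resp.append([p[0] / s, p[1] / s])
        # M-step: re-estimate weights, means, and variances
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / n
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = max(sum(r[k] * (x - mu[k]) ** 2
                             for r, x in zip(resp, data)) / nk, 1e-6)
    log_lik = sum(math.log(w[0] * normal_pdf(x, mu[0], var[0]) +
                           w[1] * normal_pdf(x, mu[1], var[1])) for x in data)
    return 5 * math.log(n) - 2 * log_lik

# Synthetic ratings on the 0-10 Affect Grid scale
random.seed(0)
bimodal = ([random.gauss(2.0, 0.7) for _ in range(100)] +
           [random.gauss(8.0, 0.7) for _ in range(100)])   # two clusters
unimodal = [random.gauss(5.0, 1.0) for _ in range(200)]    # one cluster

two_wins = bic_two_gaussians(bimodal) < bic_one_gaussian(bimodal)    # True
one_wins = bic_one_gaussian(unimodal) < bic_two_gaussians(unimodal)  # True
```

A rating dimension is then called "divisible into two clusters" exactly when the two-component fit yields the lower BIC, as with the arousal ratings of the surprise–sadness continuum above.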
To confirm whether a category boundary existed for valence and for arousal, the maximum-distance pairs for the valence and the arousal ratings were compared with the mean distances of all the other pairs combined. For the valence ratings, t-tests showed that the distance between surprise 62.5% and 37.5% was significantly larger than the average distance across the other pairs on the surprise–sadness continuum, t(21) = 2.30, p < .05. The excitement–disgust continuum yielded identical results, t(21) = 6.69, p < .01. Moreover, we found a significant difference on the arousal dimension between the distance separating surprise (excitement) 62.5% and 37.5% and the average distance across all other pairs, surprise–sadness: t(21) = 4.17, p < .01; excitement–disgust: t(21) = 2.98, p < .01. These findings indicate that the fifty-fifty faces within the surprise–sadness and the excitement–disgust continua were category boundaries on the arousal as well as on the valence dimension.

Discussion

We found distinct boundaries in both the surprise–sadness and the excitement–disgust continua on the Affect Grid ratings. The results of the analysis of distance on the Affect Grid indicate that the facial stimuli on each continuum could be divided into two clusters on the valence dimension, pleasantness and unpleasantness. For both continua, the facial expression straddling the divide was a fifty-fifty face (i.e., surprise 50% or excitement 50%). Moreover, a category boundary was also demonstrated in the arousal ratings for both continua. These results indicate that categorical perception occurred on the arousal dimension for the surprise–sadness and the excitement–disgust continua. Thus, categorical perception probably contributes to the rating of facial expressions when using a dimensional strategy, not only for valence but also for arousal.

However, although the differences were subtle, the BIC showed that for the valence ratings on the surprise–sadness continuum and for the arousal ratings on the excitement–disgust continuum, a single normal distribution yielded a better fit than two normal distributions. That is, these continua did not have two clear clusters as defined by normal distributions in the valence or arousal ratings. There are some possible reasons for this finding. First, this pattern of results may reflect the range of the data. For the surprise–sadness continuum, the arousal ratings were distributed over a wider range than were the valence ratings, whereas for the excitement–disgust continuum, the reverse was true. Given these findings, it is possible that the range of the data with regard to valence for the surprise–sadness continuum and arousal for the excitement–disgust continuum was too narrow to form two normal distributions. Second, the hallmarks of the categorical-perception effect captured by the distance analysis on the Affect Grid and by the GMM were fundamentally different. The analysis of the Affect Grid ratings tested whether a category boundary existed on the continuum by finding the specific pair of faces that were farthest apart. The GMM, on the other hand, assessed whether a single normal distribution or two normal distributions better fitted the distribution of the ratings on the Affect Grid; that is, the GMM assumed that all of the data formed clusters. Therefore, if the ratings converge at the centre of the distribution of data, corresponding to a category boundary, a single normal distribution will fit better than two normal distributions. This may explain why the arousal ratings of the excitement–disgust continuum were not divided into two clusters by the GMM. Nevertheless, the fact that a facial continuum can be divided by a category boundary is evidence for the categorical-perception effect (Calder et al., 1996; Etcoff & Magee, 1992; Young et al., 1997). Hence, our results showing that a boundary existed on the Affect Grid ratings are sufficient to indicate that a categorical-perception effect did occur regarding arousal for the excitement–disgust continuum and valence for the surprise–sadness continuum.
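The boundary-finding step of the distance analysis reduces to a simple rule: the pair of faces two morph steps apart whose mean ratings lie farthest apart brackets the category boundary. A minimal sketch follows; the rating values are invented for illustration, not the published means.

```python
# Morph levels (% surprise or excitement) and made-up mean ratings on one
# Affect Grid dimension (0-10), rising sharply around the fifty-fifty face.
levels = [0.0, 12.5, 25.0, 37.5, 50.0, 62.5, 75.0, 87.5, 100.0]
mean_rating = [2.1, 2.2, 2.4, 2.6, 4.9, 7.3, 7.5, 7.6, 7.8]

def two_step_distances(ratings):
    """Absolute rating difference between stimuli two morph steps apart
    (the quantity plotted in Figure 4c)."""
    return [abs(ratings[i] - ratings[i + 2]) for i in range(len(ratings) - 2)]

def likely_boundary(ratings, levels):
    """Return the two-step pair with the largest distance; the category
    boundary lies between these two morph levels."""
    d = two_step_distances(ratings)
    i = d.index(max(d))
    return (levels[i], levels[i + 2])

pair = likely_boundary(mean_rating, levels)  # (37.5, 62.5): boundary near 50%
```

This mirrors the finding that the 62.5%–37.5% pair showed the largest distance, placing the boundary at the fifty-fifty face; the study then tests that maximum against the mean distance of the remaining pairs with a t-test.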

Furthermore, the continua showing the widest range on each dimension (i.e., the surprise–sadness continuum for arousal and the excitement–disgust continuum for valence in Experiment 2) were separated by category boundaries and exhibited two clusters defined by normal distributions. This is robust evidence that categorical perception emerged for the emotional dimensions.

Additionally, the facial stimuli were arranged according to their emotional intensity in terms of the valence and arousal dimensions, even within categories. Consequently, categorical perception and dimensional perception co-occurred when facial expressions of emotion were rated on the Affect Grid. This suggests that a hybrid theory combining categorical and dimensional accounts is applicable to arousal ratings as well as to valence ratings.

GENERAL DISCUSSION

The current study revealed that categorical perception occurs even when using a dimensional strategy. A category boundary was found within the happiness–fear continuum with respect to the valence dimension (Experiment 1) and within the surprise–sadness and excitement–disgust continua with regard to both the valence and arousal dimensions (Experiment 2). This suggests that the mode of perception for emotional facial expressions is not limited by a given response format: categorical perception is not wholly tied to a categorical strategy (i.e., an identification or discrimination task). Furthermore, the facial stimuli were arranged in sequence according to their physical changes even within a category. This indicates that people can use categorical perception and dimensional perception simultaneously, suggesting a hybrid theory of emotion (Christie & Friedman, 2004; Panayiotou, 2008; Russell, 2003).

One possible explanation for the categorical-perception effect within a dimensional strategy is the role that language plays when rating facial expressions. Previous research has suggested that verbalisation contributes significantly to the categorical-perception effect (Roberson et al., 2007, 2010; Roberson & Davidoff, 2000). In particular, verbal coding of a facial expression occurred even when participants were not required to label the stimulus (Roberson & Davidoff, 2000). Consistent with this notion, it is possible that verbal labelling (e.g., "happiness" or "fear") of the facial expressions arose automatically and led to categorical perception, even though the Affect Grid rating did not explicitly require participants to label the facial expressions.

Additionally, there is evidence for the robustness of the categorical-perception effect within a dimensional strategy. In the current study, all of the participants were native Japanese speakers, who are considered to have more varied interpretations of facial expressions than do Western individuals (Russell, Suzuki, & Ishida, 1993). That is, compared with Western individuals, it is more difficult for Japanese individuals to attribute a single label to a specific facial expression. It is therefore remarkable that a categorical-perception effect arose for Japanese participants, given that categorical perception partly relies on one-to-one labelling between a face and an emotion word. This finding provides robust evidence for categorical perception when a dimensional strategy is used to rate facial expressions of emotion.

In summary, the present study produced two major findings. First, we found a categorical-perception effect even in the context of a dimensional strategy, the Affect Grid. Therefore, the relative dominance of categorical perception and dimensional perception appears not to be totally dependent on the response format. To obtain a category boundary that straddles two emotions in a dimensional strategy, a facial continuum would need to encompass the emotional dimensions, such as the excitement–disgust continuum for valence and the surprise–sadness continuum for arousal. Second, we found that categorical perception and dimensional perception co-occurred in ratings of facial expressions. Each continuum was divided into two categories with respect to valence and arousal, but the morphed images were arranged in the correct order, corresponding to their physical changes. This finding supports a hybrid theory of categorical perception and



dimensional perception. In sum, categorical and dimensional accounts of emotion are not fundamentally contradictory but might be complementary. Future research should investigate the interaction between categorical perception and dimensional perception using various methodologies to assess emotional responses and experiences.

Manuscript received 11 October 2010
Revised manuscript received 22 May 2011
Manuscript accepted 25 May 2011
First published online 8 August 2011

REFERENCES

Calder, A. J., Young, A. W., Perrett, D. I., Etcoff, N. L., & Rowland, D. (1996). Categorical perception of morphed facial expression. Visual Cognition, 3, 81–117.
Calvo, M. G., & Marrero, H. (2009). Visual search of emotional faces: The role of affective content and featural distinctiveness. Cognition and Emotion, 23, 782–806.
Christie, I. C., & Friedman, B. H. (2004). Autonomic specificity of discrete emotion and dimensions of affective space: A multivariate approach. International Journal of Psychophysiology, 51, 143–153.
Ekman, P. (1992). An argument for basic emotions. Cognition and Emotion, 6, 169–200.
Ekman, P. (1994). Strong evidence for universals in facial expressions: A reply to Russell's mistaken critique. Psychological Bulletin, 115, 268–287.
Ekman, P., & Friesen, W. V. (1971). Constants across cultures in the face and emotion. Journal of Personality and Social Psychology, 17, 124–129.
Ekman, P., & Friesen, W. V. (1976). Pictures of facial affect. Palo Alto, CA: Consulting Psychologists Press.
Ekman, P., Sorenson, E. R., & Friesen, W. V. (1969). Pan-cultural elements in facial displays of emotions. Science, 164, 86–88.
Etcoff, N. L., & Magee, J. J. (1992). Categorical perception of facial expressions. Cognition, 44, 227–240.
Harnad, S. (1987). Categorical perception. Cambridge, UK: Cambridge University Press.
Johnson-Laird, P. N., & Oatley, K. (1992). Basic emotions, rationality, and folk theory. Cognition and Emotion, 6, 201–223.
Katsikitis, M. (1997). The classification of facial expressions of emotion: A multidimensional-scaling approach. Perception, 26, 613–626.
McLachlan, G. J., & Basford, K. E. (1988). Mixture models: Inference and applications to clustering. New York, NY: Marcel Dekker.
Ogawa, T., Fujimura, T., & Suzuki, N. (2005). Perception of facial expressions of emotion under brief exposure duration. Japanese Journal of Research on Emotions, 12, 1–11.
Panayiotou, G. (2008). Emotional dimensions reflected in ratings of affective scripts. Personality and Individual Differences, 44, 1795–1806.
Roberson, D., Damjanovic, L., & Kikutani, M. (2010). Show and tell: The role of language in categorizing facial expressions of emotion. Emotion Review, 2.
Roberson, D., Damjanovic, L., & Pilling, M. (2007). Categorical perception of facial expressions: Evidence for a category adjustment model. Memory & Cognition, 35, 1814–1829.
Roberson, D., & Davidoff, J. (2000). The categorical perception of colors and facial expressions: The effect of verbal interference. Memory & Cognition, 28, 977–986.
Russell, J. A. (1980). A circumplex model of affect. Journal of Personality and Social Psychology, 39, 1161–1178.
Russell, J. A. (1997). Reading emotions from and into faces. In J. A. Russell & J. M. Fernandez-Dols (Eds.), The psychology of facial expression (pp. 295–320). Paris, France: Cambridge University Press.
Russell, J. A. (2003). Core affect and the psychological construction of emotion. Psychological Review, 110, 145–172.
Russell, J. A., & Bullock, M. (1985). Multidimensional scaling of emotional facial expressions: Similarity from preschoolers to adults. Journal of Personality and Social Psychology, 48, 1290–1298.
Russell, J. A., Suzuki, N., & Ishida, N. (1993). Canadian, Greek, and Japanese freely produced emotion labels for facial expressions. Motivation and Emotion, 17, 337–351.
Russell, J. A., Weiss, A., & Mendelsohn, G. A. (1989). Affect Grid: A single-item scale of pleasure and arousal. Journal of Personality and Social Psychology, 57, 493–502.
Schlosberg, H. (1954). Three dimensions of emotion. Psychological Review, 61, 81–88.
Takehara, T., & Suzuki, N. (1997). Morphed images of basic emotional expressions: Ratings on Russell's bipolar field. Perceptual and Motor Skills, 85, 1003–1010.
Takehara, T., & Suzuki, N. (2001). Differential processes of emotion space over time. North American Journal of Psychology, 3, 217–228.
Tomkins, S. S., & McCarter, R. (1964). What and where are the primary affects? Some evidence for a theory. Perceptual and Motor Skills, 18, 119–158.
Young, A. W., Rowland, D., Calder, A. J., Etcoff, N. L., Seth, A., & Perrett, D. I. (1997). Facial expression megamix: Tests of dimensional and category accounts of emotion recognition. Cognition, 63, 271–313.