Affective Computing and Autism

RANA EL KALIOUBY,a ROSALIND PICARD,a AND SIMON BARON-COHENb

a Massachusetts Institute of Technology, Cambridge, Massachusetts 02142-1308, USA
b University of Cambridge, Cambridge CB3 0FD, United Kingdom

ABSTRACT: This article highlights the overlapping and converging goals and challenges of autism research and affective computing. We propose that a collaboration between autism research and affective computing could lead to several mutually beneficial outcomes—from developing new tools to assist people with autism in understanding and operating in the socioemotional world around them, to developing new computational models and theories that will enable technology to be modified to provide an overall better socioemotional experience to all people who use it. This article describes work toward this convergence at the MIT Media Lab, and anticipates new research that might arise from the interaction between research into autism, technology, and human socioemotional intelligence.

KEYWORDS: autism; Asperger syndrome (AS); affective computing; affective sensors; mindreading software

AFFECTIVE COMPUTING AND AUTISM

Autism is a set of neurodevelopmental conditions characterized by social interaction and communication difficulties, as well as unusually narrow, repetitive interests (American Psychiatric Association 1994). Autism spectrum conditions (ASC) comprise at least four subgroups: high-, medium-, and low-functioning autism (Kanner 1943) and Asperger syndrome (AS) (Asperger 1991; Frith 1991). Individuals with AS have average or above-average IQ and no language delay. In the other three autism subgroups there is invariably some degree of language delay, and the level of functioning is indexed by overall IQ. Individuals diagnosed on the autistic spectrum often exhibit a “triad of strengths”: good attention to detail; deep, narrow interest; and islets of ability (Baron-Cohen 2004). In this article we consider how such strengths could be harnessed through the use of technologies to navigate the social world.

Address for correspondence: Rosalind W. Picard, Sc.D., FIEEE, MIT Media Laboratory, E15-448,
20 Ames Street, Cambridge, MA 02142-1308. Voice: 617-253-0611; fax: 617-253-5922.
e-mail: picard@media.mit.edu

Ann. N.Y. Acad. Sci. 1093: 228–248 (2006). © 2006 New York Academy of Sciences.
doi: 10.1196/annals.1382.016


Autism remains a behaviorally specified condition, the diagnosis relying on interviews and/or direct observations (LeCouteur et al. 1989; Lord et al. 1989, 1994, 2000). The diagnostic criteria include a “marked impairment in the use of nonverbal behaviors, such as eye-to-eye gaze, facial expression, body posture, and gestures to regulate social interaction,” and rely on the clinician’s judgment about the individual’s ability to engage in social interactions, process social information, and deal with social anxiety. Interventions, too, are mostly behavioral and are aimed at addressing the social interaction and communication difficulties in autism.
One of the central psychological themes in autism research is that of em-
pathizing. Often characterized as the ability to “put oneself into another’s
shoes,” empathizing is the capacity to attribute mental states, such as feel-
ings, thoughts, and intentions to other people, and to respond to their mental
states with an appropriate emotion (Mehrabian and Epstein 1972; Spiro 1993;
Omdahl 1995; Eisenberg 2000; Harris 2003; Baron-Cohen and Wheelwright
2004). Empathy is a set of cognitive and affective skills we use to make sense
of and navigate the social world (Davis 1983). The cognitive component of
empathy, also referred to as theory of mind (Wellman 1992), mindreading
(Whiten 1991; Baron-Cohen 1995), or taking the intentional stance (Dennett
1987), involves setting aside one’s own current perspective, attributing mental
states to the other person, and then making sense of and predicting that person’s
behavior, given his or her experience. Mental states include emotions, cogni-
tive states (such as beliefs), volitional states (such as intentions and desires),
perceptual states (such as seeing or hearing), and attentional states (such as
what the person is interested in). The affective component entails having an
emotional response to the mental state of others. To be an empathic observer,
your feeling must be appropriate to that of the person observed, for instance
feeling compassion in response to another’s distress.
Good empathizers also have good “people intuition” (sometimes known as
folk psychology or common sense psychology). People intuition is the set of
assumptions we make about the relationships between people’s behavior, men-
tal states, and situation (Wellman 1992). It is the basis for our social judgments
about others, including the production and comprehension of pretence (Leslie
1987; Pratt and Bryant 1990), understanding that seeing-leads-to-knowing
(Pratt and Bryant 1990), making the appearance-reality distinction, and un-
derstanding false belief (Wimmer and Perner 1983). When we empathize, we
respond in ways that acknowledge feelings of others and we are sensitive to
others’ different beliefs and perspectives. In addition, empathizing allows us to
share perceptual space with others, which is crucial for social learning, joint
action, and joint attention (Baron-Cohen 1995). To make sense of a social sit-
uation, most people will naturally follow others’ gaze direction. When people
focus on nonsocial stimuli (e.g., background objects), as is often the case in
autism, they may miss the gist of the social interaction (Klin et al. 2002, 2003).
Despite their interest in making friends, many individuals with autism re-
port having difficulties empathizing in a spontaneous way during real-time
social interaction and lacking people intuition. These difficulties vary with the
severity of the condition, and include difficulty reading other people’s non-
verbal cues and mental states (Joseph and Tager-Flusberg 1997; Frith 2003),
atypical gaze processing (Volkmar and Mayes 1991; Klin et al. 2002; Pelphrey
et al. 2005), restricted emotional expression (Hill et al. 2004), difficulties
gauging the interests of others in conversation (Fletcher et al. 1995; Volkmar
and Klin 2000), and frequently launching into monologues about narrowly de-
fined and often highly technical interests, such as railway timetables or maps (Klin
and Volkmar 1995).
Over the past 10 years, researchers in affective computing (Picard 1997) have
begun to develop technologies that advance our understanding of or approach
to affective neuroscience and autism. Affective computing has contributed to
these fields in at least four ways: (i) designing novel sensors and machine learning
algorithms that analyze multimodal channels of affective information, such as
facial expressions, gaze, tone of voice, gestures, and physiology; (ii) creating
new techniques to infer a person’s affective or cognitive state (e.g., confu-
sion, frustration, stress, interest, and boredom); (iii) developing machines that
respond affectively and adaptively to a person’s state; and (iv) inventing per-
sonal technologies for improving awareness of affective states and their selective
communication to others.
While much of the work in affective computing has been motivated by the
goal of giving future robots and computational agents socioemotional skills,
its researchers have also recognized that they face similar challenges to those
who try to help people with autism improve such skills. Computers, like most
people with autism, do not naturally have the ability to interpret socioaffective
cues, such as tone of voice or facial expression. Similarly, computers do not
naturally have common sense about people and the way they operate. When
people or machines fail to perceive, understand, and act upon socioemotional
cues, they are hindered in their ability to decide when to approach someone,
when to interrupt, or when to wind down an interaction, reducing their ability
to interact with others. A large part of natural learning involves reading and
responding to socioemotional cues, so this deficit also interferes with the ability
to learn from others. The field of affective computing aims to change the nature
of technology so that it can sense, respond, and communicate this information.
In so doing, the field has a lot to learn from people with autism, from progress
they have made, and from the friends, families, and staff who work with these
individuals. We should point out that we are not using autism as a metaphor,
unlike the postautistic economics network (Post-Autistic Economics Network
2000) or Wegner’s (1997) description of autistic algorithms. Our use of autism
is restricted to the clinical definition.

A SYSTEMATIC APPROACH TO EMPATHY


So what do you do if, as in the cases of both autism and technology,
empathizing is not something you naturally apply to the social world? You
systemize. Systemizing is the drive to analyze and build systems and is one of the most powerful mechanisms for understanding systems and predicting change (Baron-Cohen 2002). Systemizing involves sensing, pattern recognition, learning, inference, generalization, and prediction.
Persons diagnosed with ASC are extreme systemizers, showing intact or
superior systemizing abilities, such as excellent attention to detail, islets of
ability in topics like prime numbers, calendrical calculation, or classification
of artifacts or natural kinds (Shah and Frith 1983; Jolliffe and Baron-Cohen
1997; Baron-Cohen et al. 2002; Baron-Cohen 2006). Many people with ASC
attempt to systemize empathy, analyzing conversations and interactions as they
unfold and for hours after they are over (Blackburn et al. 2000; Mixing Memory
Blog 2005). For many, this is a tiring and draining exercise that makes it difficult
to react in real time. As one person with an ASC put it:
I study people almost to the point of obsession. I find some people’s ac-
tions/motivations etc. extremely intriguing. Some people puzzle me. Often
after I’ve had a conversation with someone I cannot sleep at night because
I am analyzing the conversation. I rerun the whole thing, look at what went
wrong and what didn’t, work out what might have actually been meant by
that, think about more accurate answers, etc. I also plan conversations ahead
of time if I know I am going to have to talk to someone. In fact, conversa-
tions/social interactions all seem like a strategy game to me. You have to plan
your moves in advance, work out all the possible ways the opponent might re-
spond, and try and work out different courses of action for each of these. The
only problem is, often in real time and life, the other person makes a move you
haven’t accounted for, resulting in the end of any conversation. Thus, while I
spend vast amounts of time analyzing social situations, the practical side of
things is still highly stressful and very hard to do successfully (Blackburn et
al. 2000).
These first-hand accounts from people with autism stress that coping strate-
gies are indeed needed for autism and suggest that systematic approaches to
teaching empathy might be helpful. For example, recent interventions in autism
providing computer-based methods of teaching emotion recognition have led
to an improvement in recognizing emotion (Golan et al. 2006). These accounts
also illuminate the complexity of the social world and the challenges inherent
in crafting a real-time intelligent response to high-speed, complex, and
unpredictable information. There are no computational technologies that can
perform feats of real-time socioemotional interaction. Today’s most powerful
robots perform much worse than people with autism at these challenges. Not
only do they have difficulty understanding natural language, but they miss
most of the socioaffective cues that can be used to decode the message. Robots
also miss most of the facial and gestural cues, and cannot infer how these
interact with what is said.
People with ASC who can systemize information about social situations can
help researchers who are trying to build affective robots and agents, and future
socioemotional technologies. For example, individuals with ASC tend to have a
literal interpretation of what people say to them (Baron-Cohen 1988; Attwood
1998). Jonathan Bishop has developed a personal digital assistant (PDA) to
help people with autism interpret frequently used idioms (Bishop 2003). For
example, the system may explain that a comment, such as “Nice weather
today,” is not a statement of fact, which would be the literal interpretation,
but an invitation to engage in casual conversation. The same technology can
improve natural language processing in machines, helping them know how to
better interpret what people say to them. These technologies need systematic
ways to represent and handle social interactions, and people with autism have
a unique ability to show researchers how to do this.
Systemizing empathy is an enormously challenging endeavor. For a start,
the social world is a highly complex system of enormous variance (Baron-
Cohen 2006). To date, there is no “code-book” available that maps a person’s
observed nonverbal cues to internal state and behavior. People with the same
mental state may express these using different nonverbal cues, with varying
intensities and durations. And people may use expressions that reflect men-
tal states that are different from their true feelings or thoughts. Furthermore,
when placed in the same situations, people may react differently. Empathizing
is a highly uncertain process. We are never 100% sure of a person’s mental
state; instead, we infer mental states from observable behavior and contextual
cues with varying degrees of certainty (Baron-Cohen 2003), and our aver-
age performance is probably far from 100%. For example, when shown face
videos, a panel of 18 people was only 54% accurate at labeling six categories
of mental states, such as agreement, thinking, and confusion (el Kaliouby
2005).
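To make this graded, uncertain inference concrete, the sketch below performs a single Bayesian update of a belief over mental-state categories given one observed cue. The category names approximate those studied in el Kaliouby (2005); all probability values are invented for illustration, not learned from data.

```python
# A minimal sketch of probabilistic mental-state inference. All numbers
# here are illustrative assumptions, not parameters of any real system.

STATES = ["agreeing", "concentrating", "disagreeing",
          "interested", "thinking", "unsure"]

# Start maximally uncertain: a uniform prior over the six categories.
prior = {s: 1.0 / len(STATES) for s in STATES}

# Assumed detector output: P(observe a head nod | mental state).
p_nod_given_state = {"agreeing": 0.80, "concentrating": 0.30,
                     "disagreeing": 0.05, "interested": 0.40,
                     "thinking": 0.20, "unsure": 0.10}

def bayes_update(belief, likelihood):
    """Posterior is proportional to prior times likelihood."""
    unnormalized = {s: belief[s] * likelihood[s] for s in belief}
    total = sum(unnormalized.values())
    return {s: p / total for s, p in unnormalized.items()}

posterior = bayes_update(prior, p_nod_given_state)
best = max(posterior, key=posterior.get)
# Even the best guess stays well below certainty, mirroring the graded,
# fallible judgments people themselves make when mindreading.
print(best, round(posterior[best], 2))  # -> agreeing 0.43
```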
Systemizing empathy is also challenging because affect is hard to measure
(Picard et al. 2004). There are no continuous sensor systems that can reli-
ably measure affective state. Without reliable sensors, how can we quantify
exactly what constitutes normal eye contact? There is a need to develop sensors,
interfaces, signal processing, pattern recognition, and reasoning algorithms for continuous
tracking of a person’s affective interactions. These technologies will be key in
assessing an individual’s specific areas of strengths and weaknesses, as well
as measuring the efficacy of various interventions.
Affective computing over the past 10 years has been developing sensing and
recognition technologies that, together with insights from people with ASC,
may eventually facilitate systemizing the social world. In the next section we
summarize several of the recent innovations that enable technology to sense
affective states, and that have an obvious potential application to provide people
with ASC with a direct “print out” of others’ mental states. One of the biggest
obstacles to empathy and mindreading that people with ASC report is that they
cannot easily detect and read another’s mental states—that they are largely
unobservable. Affective computing highlights how such internal states are not
wholly unobservable, that there are indicators that can make mental states more
transparent or magnified, and that if these can be detected by technology, the
human observer can make use of them.
AFFECT SENSING

The ability to sense a person’s affective-cognitive state is the first step in
mindreading. Despite many advances in brain imaging, there is not yet any
technology that can read your innermost thoughts and feelings and communi-
cate this to another. However, there are a growing number of portable sensors
that can capture various physical manifestations of affect. These novel sen-
sors are like perceptual mechanisms. Examples include tiny video camcorders
to record facial expression, head gesture and posture changes, microphones
to record vocal inflection changes, skin-surface sensing of muscle tension,
heart-rate variability, skin conductivity, blood-glucose levels, and other bodily
changes. Although they started off as bulky, affective “wearables” are now
embedded in jewelry or woven into clothing (Picard and Healey 1997). Affec-
tive wearables differ from existing medical devices that measure similar
signals in that the wearer is in control: the wearer can take the device off,
turn it off, or leave it behind, choosing when and where to gather information.
The analogy is with a hearing aid or a pair of spectacles.
We have developed several kinds of systems for communicating affective
information at the MIT Media Laboratory. FIGURE 1 shows portable forms of
affective sensors, from left to right: expression glasses, blood-volume pulse
earring, pressure-sensitive mouse, galvactivator, chest-worn heart monitor, and
skin-conductance shoe. Here are some brief highlights about some of the wear-
able or portable devices:

(i) Expression glasses discriminate facial expressions of interest or surprise
from those of confusion or dissatisfaction, allowing students to commu-
nicate feedback anonymously to the teacher in real time, without having
to shift attention from trying to understand the teacher (Scheirer et al.
1999).
(ii) The Galvactivator is a skin-conductance sensing glove that converts the
level of skin conductance to the brightness of a glowing LED (Picard and
Scheirer 2001); a minimal sketch of this mapping appears after this list.
Skin conductance increases with many kinds of autonomic arousal, especially
ramping up with novelty, significance, and stress. Hirstein’s team (Hirstein
et al. 2001) highlighted differences in skin conductance patterns among many
people with autism. This glove can help wearers reflect on their personal
response patterns.

FIGURE 1. Portable forms of affective sensors.


(iii) StartleCam is a wearable camera system that saves video based on a
physiological response, such as your skin conductance arousal response,
tagging the data with information about whether or not it was exciting to
you (Healey and Picard 1998).
(iv) Affective state can also be inferred from the way we interact with and
manipulate objects. An increase in physical pressure applied to a pressure-
sensitive mouse has been shown to be associated with frustration, caused
by poor usability in a computer interface (Qi et al. 2001; Qi and Picard
2002; Dennerlein et al. 2003).
(v) Location, proximity, audio and motion sensors signal socioaffective dis-
plays, such as dominance, excitement, and nonaggression, across popu-
lations (Pentland 2006).
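As a concrete illustration of the simplest of these devices, here is a minimal sketch of the Galvactivator’s conductance-to-brightness mapping described in item (ii) above. The baseline and ceiling calibration values are assumptions for illustration only.

```python
# Sketch of a conductance-to-brightness mapping in the spirit of the
# Galvactivator: normalize a skin-conductance reading against a personal
# baseline and ceiling, giving an LED brightness between 0 and 1.

def conductance_to_brightness(microsiemens, baseline, ceiling):
    """Map a conductance reading onto a 0.0-1.0 LED brightness."""
    if ceiling <= baseline:
        return 0.0  # degenerate calibration: keep the LED off
    level = (microsiemens - baseline) / (ceiling - baseline)
    return max(0.0, min(1.0, level))  # clamp to the displayable range

# Illustrative calibration: baseline 2.0 uS, arousal peaks near 10.0 uS.
for reading in (2.0, 4.0, 7.5, 12.0):
    brightness = conductance_to_brightness(reading, 2.0, 10.0)
    print(f"{reading:4.1f} uS -> brightness {brightness:.2f}")
```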
More recent wearables that are made by industry include:
(i) The SenseWear Pro2 armband measures changes in energy expended,
energy balance and weight loss using a heat-flux sensor, skin tempera-
ture, galvanic skin response, an accelerometer, and an electrocardiogram
(ECG) sensor attached to the upper arm. Even though the armband is not
specifically being marketed as an affective technology, it could be modi-
fied to detect and communicate affect variables to the wearer and to others,
for example, monitoring and analyzing patterns of stress, frustration, and
productivity.
(ii) EmSense Corporation is developing small, wearable sensors including
dry ECGs that create a model of a user’s emotions.
(iii) Fraunhofer has developed a research prototype glove that senses heartbeat,
breathing rate, blood pressure, and skin temperature and conductance, as well
as an ECG shirt using conductive yarn and flexible electronics.
(iv) Goodwin’s team used a wireless heart rate monitor (LifeShirt, Vivo-
metrics, Inc., Ventura, CA) to monitor the cardiac responses of low-
functioning persons with autism under repeated conditions of environ-
mental stressors (Goodwin et al. 2006).
The above technologies are all at varying degrees of development and avail-
ability. But we can also speculate about other possible technologies that could
become available, such as swallowable pills or implantable sensors, that an-
alyze bodily fluids for hormones and neurotransmitter levels. For example,
levels of dopamine influence voluntary movement and emotional arousal, producing
effects such as increased heart rate and blood pressure; serotonin
affects sleep and temperature. Drawing on research that links food, affect, and
cognition to circadian rhythm (Wurtman and Danbrot 1988), affect-sensing
swallowable pills would measure hormone and neurotransmitter levels and
then send the measurements wirelessly to on-body portable devices. While
neurotransmitter levels are currently not easy to sense without drawing saliva
or blood or using other invasive procedures, and the data is not made available
wirelessly, new implantable and swallowable sensors are already in progress,
exploiting nanoscale technology.
As with any wearable system, however, there are design issues. Wearable sys-
tems are certainly becoming smaller, but there are still issues with the number
of wires flowing in and out of the sensors and with sensors slipping off. Battery
power is a challenging issue too. Finally, sensors-on-the-go need to be robust
to noise that arises from activity unrelated to the signal being measured. Heart
rate, for example, can increase significantly with physical exertion or with
sneezing, as well as with anger. Addressing these challenges is a prerequisite
for the adoption of these wearable systems in everyday applications.
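One plausible mitigation for such motion artifacts, sketched below under assumed thresholds, is to gate affective inferences on an accelerometer reading so that physical exertion is not mistaken for emotion. The function and threshold values are illustrative, not taken from a deployed system.

```python
# Sketch of motion-artifact gating for a mobile heart-rate sensor:
# suppress arousal inferences while movement is high, since an elevated
# heart rate is then explained by exertion rather than affect.

def gated_arousal(heart_rate_bpm, resting_bpm, accel_magnitude_g,
                  motion_threshold_g=1.2):
    """Return a normalized arousal score, or None when motion makes
    the heart-rate signal uninformative about emotion."""
    if accel_magnitude_g > motion_threshold_g:
        return None  # likely physical exertion, not affect
    return max(0.0, (heart_rate_bpm - resting_bpm) / resting_bpm)

print(gated_arousal(95, 65, 0.1))   # still body, elevated rate: ~0.46
print(gated_arousal(130, 65, 2.0))  # running: reading suppressed (None)
```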

AFFECT RECOGNITION, LEARNING, AND GENERALIZATION

In her doctoral dissertation, Rana el Kaliouby (2005) developed a computational
model of mindreading as a framework for machine perception and
mental state recognition. This framework combines bottom-up vision-based
processing of the face (e.g., a head nod or smile) with top-down predictions
of mental state models (e.g., interest and confusion) to interpret the mean-
ing underlying head and facial signals over time. The framework comprises
a multilevel probabilistic architecture that mimics the hierarchical way in
which people perceive facial and other human behavior (Zacks et al. 2001)
and handles the uncertainty inherent in mindreading. The output probabilities
represent a rich modality analogous to the information humans receive in ev-
eryday interaction through mindreading. FIGURE 2 shows the real-time output
of the mindreading software showing different granularity of head gesture and
facial analysis. The horizontal bars show the probability of various head ges-
tures and facial expressions. The bottom line graphs show the probabilities of
the mental states; the radial chart summarizes an interaction, showing the most
likely mental states over time.
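The following toy forward filter conveys the flavor of this two-level architecture: per-frame display detections (bottom-up) are combined with a simple temporal model of mental states (top-down). It is a sketch in the spirit of the framework, not the published model; all transition and observation probabilities are invented.

```python
# Two-level toy: display detectors emit probabilities each time slice,
# and a temporal model turns them into smoothed mental-state beliefs.

STATES = ("interest", "confusion")
STAY = 0.8  # assumed probability of remaining in the same mental state

# Assumed P(display shown | mental state) for two display detectors.
OBS = {"head_nod":    {"interest": 0.6, "confusion": 0.1},
       "brow_furrow": {"interest": 0.1, "confusion": 0.7}}

def other(state):
    """The one other state in this two-state toy model."""
    return STATES[1] if state == STATES[0] else STATES[0]

def step(belief, display_probs):
    """One forward-filter step: predict with the transition model, then
    weight each state by how well it explains the detected displays,
    mixing in each detector's own confidence."""
    predicted = {s: STAY * belief[s] + (1 - STAY) * belief[other(s)]
                 for s in STATES}
    for s in STATES:
        for display, p_detected in display_probs.items():
            p_display = OBS[display][s]
            predicted[s] *= (p_detected * p_display
                             + (1 - p_detected) * (1 - p_display))
    total = sum(predicted.values())
    return {s: v / total for s, v in predicted.items()}

belief = {"interest": 0.5, "confusion": 0.5}
for frame in [{"head_nod": 0.9, "brow_furrow": 0.1},
              {"head_nod": 0.2, "brow_furrow": 0.8}]:
    belief = step(belief, frame)
    print({s: round(p, 2) for s, p in belief.items()})
```

The output at each step is a probability distribution rather than a hard label, analogous to the graded inferences shown by the horizontal bars and line graphs in FIGURE 2.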
This model also allows multiple asynchronous sensors to be combined.
One view of autism, the “weak central coherence” theory, emphasizes the
importance of sensory integration. The theory contends that people with ASC
process information at the local (rather than the Gestalt) level, often failing to
integrate multiple sources of sensory information (Happé 1996; Frith 2003).
If the primary deficit in autism is indeed one of integrating information at
a global level, it is easy to imagine how theory of mind or empathy would
suffer. This is because different affective modalities often complement each
other, or substitute for each other when only partial input is available, or may
contradict one another as in deception. Thus, compared with unimodal systems
that assume a one-to-one mapping between an affective state and a modality,
multimodal systems yield a more faithful representation of the relationship
between mental states and external behavior. Our MIT group continues to
develop novel approaches to combine multiple modalities, such as face and
posture, to infer affective states, such as the level of engagement (interest versus
boredom), of learners (Kapoor et al. 2004, 2005).

FIGURE 2. Real-time output of the mindreading software.
Another challenge is generalization. Generalization is the capacity to apply
knowledge from one context to new contexts. In autism, it is uncertain whether,
with existing interventions, individuals are able to successfully transfer the
knowledge they acquire. Computers also have problems generalizing from the
examples they were trained on, to the analysis of new unseen information.
The field of machine learning is perpetually trying to improve the ability
of computers to generalize. With the advent of more robots and agents that
will interact with people, there is increased interest in enabling machines to
learn better by learning from people in natural colearning situations, not just
from people who “program” the computer or robot (Breazeal 2002). But such
learning again requires socioemotional skills, such as the ability to see if the
person teaching you is shaking their head and frowning.
As part of what we call the socioemotional intelligence prosthesis, we are
exploring new kinds of systems that learn with people through natural interac-
tion (el Kaliouby and Robinson 2005; el Kaliouby et al. 2006). The intelligent
system we aim to build is a colearner with the person with ASC in trying to
learn how to recognize and respond to socioemotional cues (Picard et al. 2004).
One possibility for such a system is to exploit this common learning goal and
perhaps play games with the individual, to assist him or her with continuously
learning to generalize, occasionally bringing in non-ASC experts for corrective
feedback and validation. The non-ASC person(s) could be present physically,
or remotely connecting in through the technology, to help with the learning
process. Social or emotion tagging, a situation where the parents and/or caregivers
of a child with autism accompany the child and “tag” events with social
labels, is a promising approach, albeit expensive and impractical in person.
Through the use of a head-mounted wearable camera/microphone, parents and
caregivers could instead “eyejack” the child’s visual field and tag the world
remotely. This remote form is both practical and cost-effective; it allows the
child to be more independent, while
continuing to enable parents and caregivers to share experiences with the child
and help with learning.
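The sketch below illustrates what a remote tagging message might look like in such a system. The message fields and the assumption of a wireless link back to the child’s device are ours, for illustration only.

```python
# Hypothetical message format for remote social tagging: a caregiver
# watching the child's shared video stream sends timestamped social
# labels back to the child's device.

import json
import time

def make_tag(event, label, author):
    """Package one social annotation for transmission to the wearer."""
    return json.dumps({
        "timestamp": time.time(),  # when the tagged moment occurred
        "event": event,            # what happened in the child's view
        "label": label,            # the social/emotional label supplied
        "author": author,          # which caregiver supplied the tag
    })

tag = make_tag("classmate waved", "friendly greeting", "parent")
print(tag)  # would be sent over a wireless link to the child's device
```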

TECHNOLOGIES THAT ENHANCE EMPATHIZING

Many persons with ASC prefer to communicate with and through computers
because they are predictable and provide some control over the otherwise chaotic
social world (Moore et al. 2000). How can we harness their interest in technol-
ogy to systemize the social world? For young children and those at the lower
end of the autism spectrum, sociable robotics and dolls are a good approach
to developing social interaction skills. The use of robots allows for a simplified,
predictable, and reliable environment where the complexity of interaction can
be controlled and gradually increased. It is also more realistic and engaging
than interacting with a screen. The Affective Social Quest project is one
of the early projects at MIT Media Lab to develop assistive technologies for
autism using physical input devices, namely four dolls (stuffed dwarfs), which
appeared to be happy, angry, sad, or surprised (Blocher and Picard 2002). The
system would play short digital videos that embody one of the four emotions,
and then encourage the child to choose the dwarf that went with the appropri-
ate emotion. When the child picked up the stuffed toy, the system identified
its infrared signal and responded. Use of the dolls as physical input devices
also encouraged development of joint attention and turn-taking skills, because
typically another person was present during the session. Other robot platforms
have been used for autism intervention, encouraging social behavior, such as
turn-taking and shared attention (Dautenhahn et al. 2002; Scassellati 2005).
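The trial structure of such doll-based interventions can be sketched as a simple loop. The function names and the stubbed infrared-reading interface below are hypothetical stand-ins for the real system’s hardware.

```python
# Schematic of one trial in a doll-based emotion-matching game: play a
# short clip embodying an emotion, wait for the child to pick up a doll
# (identified by its infrared tag), and respond.

EMOTIONS = ("happy", "angry", "sad", "surprised")

def run_trial(target_emotion, play_video, read_ir_tag, give_feedback):
    """One trial: show a clip, then check which dwarf was chosen."""
    play_video(target_emotion)
    chosen = read_ir_tag()  # blocks until a doll is lifted
    correct = (chosen == target_emotion)
    give_feedback(correct, target_emotion)
    return correct

# Stub hardware for demonstration only.
run_trial("happy",
          play_video=lambda e: print("playing", e, "clip"),
          read_ir_tag=lambda: "happy",
          give_feedback=lambda ok, e: print("correct!" if ok
                                            else f"try the {e} dwarf"))
```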
Robotics may also be useful for individuals at the higher end of the autism
spectrum, who would need help with the subtle, real-time social interactions.
One can imagine a variation of LEGO—already known to be helpful as an
intervention in autism (LeGoff 2004)—that combines rules and mechanics to
allow for social explorations. Robotics could also be used by groups of
children for improvisation and directing play, encouraging turn-taking between
children.
FIGURE 3. The self-cam chest-mounted video camera.

Affect sensing and affect recognition are technologies that are readily applicable
to autism interventions. Affect sensing and recognition technologies can
help increase self-awareness, and provide novel ways for self-monitoring. One
of the first problems we encountered when having a person wear a camera with
software to interpret the facial expressions of a conversational partner was that
the person with ASC might not even look at the face of the other person. Thus,
the wearable camera might point at the floor or at a shirt pocket, or a nearby
object instead of at the face that needs to be read. One possible solution is
via a device, such as the eye contact sensing glasses (Vertegaal et al. 2001),
wearable glasses that recognize when a user is in eye contact with another
person. These glasses can be used to measure the magnitude and dynamics
of eye contact in people with ASC. These patterns can then be compared to
eye contact in people without ASC. The glasses can also be used as an intervention to
encourage people with ASC to pay more attention to the face.
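The two measures mentioned, the magnitude and the dynamics of eye contact, can be computed from a per-frame boolean "in eye contact" signal such as the glasses could produce. The sketch below assumes a fixed frame duration.

```python
# Compute eye-contact metrics from a per-frame boolean signal:
# "magnitude" is the fraction of time spent in eye contact, and the
# durations of contact episodes capture its dynamics.

def eye_contact_metrics(samples, frame_seconds):
    """samples: sequence of booleans (or 0/1), one per video frame."""
    episodes, run = [], 0
    for in_contact in samples:
        if in_contact:
            run += 1
        elif run:
            episodes.append(run * frame_seconds)
            run = 0
    if run:  # close an episode still open at the end of the recording
        episodes.append(run * frame_seconds)
    magnitude = sum(samples) / len(samples)
    mean_episode = sum(episodes) / len(episodes) if episodes else 0.0
    return magnitude, mean_episode

# Ten frames at 0.5 s each: two episodes of eye contact.
print(eye_contact_metrics([1, 1, 0, 0, 1, 1, 1, 0, 0, 0], 0.5))
```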
In some cases there are privacy concerns with wearing a camera that records
those around you. Out of such concerns, Alea Teeters in our lab developed
the self-cam shown in FIGURE 3 (Teeters et al. 2006). Self-Cam is a small,
lightweight video camera that is worn over the chest and points at one’s face.
The camera is connected to a small portable handtop that is belt-mounted on
the hip and has a built-in microphone for recording audio. Facial expression
analysis software on the handtop identifies head and facial gestures (e.g., head
nod and smile) and mental states (e.g., agreement, confusion and interest).
In the real-time mode, the camera tracks and analyses the mental state of
its wearer in real time, and communicates these mental state inferences to
the wearer visually or via audio clips and/or tactile vibration. Self-Cam only
records the face of the wearer, which solves privacy problems related to filming
other people in the environment without their consent. If more than one person
is wearing the Self-Cam, this information can be exchanged between different
wearers. Thus, Self-Cam is a fun way for people to explore social situations that
are relevant to them (e.g., the faces of family and friends) without accidentally
recording data from people who do not want to be recorded. It also avoids
the problems mentioned above of having to make sure the camera sees a face,
because Self-Cam is almost always pointed directly at the wearer’s face.
In a forthcoming collaboration between MIT Media Lab and the Groden
Center—a school and intervention center for autism based in Providence,
Rhode Island—we will evaluate the scientific and clinical significance of using
Self-Cam to improve the recognition of emotions from faces in young adults
with AS and high-functioning autism. In the study, the person with ASC as well
as the interaction partner (a teacher) will each wear a Self-Cam, so that neither
participant feels singled out. Wearers will review the videos recorded during
the sessions and interact with the computerized facial analysis software at their
own pace. This will enable the wearer to associate emotion and mental state
labels, and to review specific social situations in slow motion (because people
with ASC often report that nonverbal cues, such as facial expression, are too
subtle and quick to discern in real time). Parents, teachers, and clinicians may
play the video back for the wearer and provide feedback (e.g., help identify
facial expressions and pair them with emotion labels) and reinforcement. With
such a system we can also explore the question of whether looking at one’s
own facial movements (while of course knowing what one is personally feel-
ing) will enhance interest in looking at faces in general. If people with ASC
can start to associate their own facial expressions with their own feelings, this
might enhance their natural interest in other people’s faces.

MOVING UP THE AUTISM SPECTRUM

According to the E-S theory of sex differences, there are at least three types
of brain, derived from two orthogonal dimensions—empathizing and system-
izing (Baron-Cohen 2002), diagrammed in FIGURE 4 (numbers are standard
deviations from the mean). The first is characterized by systemizing being
stronger than empathizing, a profile more common in males. The second type
has the profile of systemizing and empathizing being balanced. The third
involves empathizing being stronger than systemizing, a profile more common in females.

FIGURE 4. The main brain types.

Autism appears to correspond to an extreme of the male brain,
with systemizing being intact or above average, alongside empathizing being
impaired (Baron-Cohen 2006). One of the interesting aspects of this theory is
that the brain types are continuous, blending seamlessly with normality. That
is, we are all situated somewhere on the same continuum, and one’s position on
the continuum reflects a different cognitive style and inclination toward sys-
temizing or empathizing. An important implication of this dimensional model
is that the line between ability and disability is blurred. This view of autism as
a different kind of mind is shared among an increasing number of individuals
with autism and families (DANDA 2003), and is dubbed the neurodiversity
model of autism.1

1 http://www.neurodiversity.com.
One’s empathizing skills, while in part genetically predisposed (Skuse 1997)
and in part influenced by prenatal testosterone (Chapman et al. in press), are
not fixed. Empathizing may vary as a function of the person’s early experience
(Bowlby 1982), current affective state, and the surrounding context. Among the
factors that may improve or impede our ability to empathize are stress and anxiety.
We have explored several approaches to measuring stress using physiological
sensors, heart rate, pedometer, accelerometer, context beacons, and location
(Picard and Du 2002; Healey and Picard 2005); other physical symptoms
include blood pressure, muscle tension, and sleep problems. Affective state
may also affect the ability to empathize. For instance, when angry, one’s current
emotional state might cloud the ability to see another person’s perspective.
Similarly, a person who is preoccupied may fail to notice the nonverbal cues
of others, misreading their mental state.
The technologies that improve empathizing in autism should in theory also
contribute to improvements in these skills in the general population. As Mal-
colm Gladwell shows in Blink, a person’s knowledge base of people intuition
can be broadened, affecting one’s ability to make accurate snap judgments
(Gladwell 2005). Technology may augment people’s capacity to empathize and
improve their people intuition (whether or not they are diagnosed with autism)
in at least three ways: increased self-awareness, improved communication with
others, and better social learning. For instance, a wearable system that contin-
uously measures stress or anxiety signals can help the wearer regulate arousal,
raising self-awareness and encouraging people to switch perspectives under
conditions of high arousal. Another application is a personal anger manage-
ment wearable system that would detect states, such as anger, and attempt to
calm the wearer, perhaps even through empathizing verbally with the wearer
(Klein et al. 2002). Technologies that sense various aspects of the person’s
affective and physiological state can be used for self-monitoring. Making this
knowledge available in a simplified and easy-to-visualize manner is a good
motivational factor for changing habits. (It has been shown, for instance, that
daily self-weighing is a strong motivational factor for losing weight.)
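A minimal sketch of such self-monitoring: track a slowly adapting personal baseline of an arousal signal (for example, skin conductance) and flag moments that climb well above it. The smoothing constant and alert ratio are illustrative choices, not validated parameters.

```python
# Toy arousal self-monitor: a slow exponentially weighted moving average
# serves as the wearer's personal baseline, and readings well above it
# are surfaced as moments worth reflecting on.

def monitor(readings, alpha=0.05, alert_ratio=1.5):
    baseline = readings[0]
    for value in readings[1:]:
        baseline = (1 - alpha) * baseline + alpha * value  # slow EWMA
        if value > alert_ratio * baseline:
            yield value, baseline  # moment worth flagging to the wearer

stream = [2.1, 2.2, 2.0, 2.3, 2.2, 4.8, 5.1, 2.4]  # uS, illustrative
for value, baseline in monitor(stream):
    print(f"arousal spike: {value} uS vs baseline {baseline:.2f} uS")
```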
This information about oneself could also be selectively communicated to
others to enhance group communication. An example is the Communicator
system (Rubin et al. 2003) that uses a combination of nano/info technologies
that allow individuals to carry with them electronically stored information
about themselves, such as interests, background, and affective state, that can
be broadcast as needed in group situations. Participants would have the
ability to define or restrict the kinds of information about themselves that they
would be willing to share with other members of the group.
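The selective-disclosure idea can be sketched as a per-audience filter over a stored profile. The field names and audience groups below are invented for illustration; they are not the Communicator system’s actual data model.

```python
# Toy selective disclosure: only policy-approved profile fields are
# broadcast to a given audience group.

PROFILE = {"name": "A. Wearer", "interests": ["trains", "maps"],
           "background": "software", "affective_state": "calm"}

POLICY = {  # which fields each audience may receive
    "colleagues": {"name", "interests", "background"},
    "close_friends": {"name", "interests", "affective_state"},
    "strangers": {"name"},
}

def broadcast(profile, policy, audience):
    """Return the subset of the profile permitted for this audience."""
    allowed = policy.get(audience, set())
    return {k: v for k, v in profile.items() if k in allowed}

print(broadcast(PROFILE, POLICY, "close_friends"))
print(broadcast(PROFILE, POLICY, "strangers"))
```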

ETHICAL CONSIDERATIONS

Along with the potential benefits, we recognize that there are important
ethical considerations that arise with the development of these technologies.
An exhaustive discussion of these ethical considerations is beyond the scope
of this article; instead, we confine ourselves to a few examples that highlight the
importance of being sensitive to the needs of the end-users of this technology,
be they autism researchers and practitioners, or individuals diagnosed with
autism and their families.
Besides the privacy issues of sensing and broadcasting affective state in-
formation (Reynolds and Picard 2004), one issue to consider is whether in-
dividuals with autism need treatment or technology “fixes” at all. We agree
that ASC involve a different cognitive style, allowing many individuals with
autism to focus deeply on a given subject, which can lead to original thought.
We thus prefer to design technologies that do not try to “fix” people, but rather
that can be used by individuals to augment or further develop their natural
abilities. If these new technologies hold the promise of improving empathy,
this should only be undertaken with the individual’s consent, where it is pos-
sible to obtain it, or with their parent’s consent in the case of a young child.
Unlike medical interventions where there is a risk of unwanted side effects,
affective-computing-based interventions may have highly specific effects (on
empathy) while leaving other domains (e.g., systemizing) unaffected. We adopt
a user-centered design and development approach to ensure that individuals
with autism and their caregivers are involved in the development phases of
intervention technologies that they need the most.
Another ethical consideration is whether exposing affective state informa-
tion creates opportunities for others to manipulate one’s behavior and thoughts
using this information (see Reynolds (2005) for examples). Even in situations
where the use of technology is honest, there are still potential concerns. If
an individual with autism wears an assistive system that senses the affective
state of others, then this would raise the expectations of interaction partners,
increasing (rather than decreasing) the social pressures on the person with
autism to respond to these cues in real time. Such a system might be more
burdensome than helpful. It is essential that researchers address these con-
siderations and explore the potential opportunities brought by a convergence
in autism research and affective computing with open-mindedness about the
possible successes or failures of such an approach.

CONCLUSION

This article highlights several opportunities for convergence between affective
computing and autism, and presents some progress in that direction. The
Department of Health and Human Services has called for new approaches that
improve real-world functioning of individuals with autism, throughout their
school-age years and beyond (Department of Health and Human Services
2004). Cure Autism Now’s Innovative Technology for Autism Initiative,
intended to create a merger of technology with other fields, is yielding an
interdisciplinary approach to the challenge of utilizing technology to improve
the lives of people with autism (Cure Autism Now 2006). Industry funding,
too, is on the rise. For instance, Motorola funded the “mood phone,” a phone
designed to interpret the mood of the person on the other end of the line,
which is meant to help people with AS who are unable to recognize emotional cues
in the speech of others.
In summary, this article presents affect sensing and recognition as core tech-
nologies, and describes their application as assistive and learning devices for
individuals with autism. The opportunities for benefit are two-way: helping peo-
ple with autism, and helping technologies to be smarter about socioemotional
interaction. People with autism, especially those who have developed solutions
to systemize and understand social interaction, can help technologists with
their efforts to build systems that do exactly that. The opportunities are rich
for two-way collaboration in technology-enhanced human interaction.

ACKNOWLEDGMENT

This work is supported by the National Science Foundation (NSF) SGER
Award IIS-0555411 and by the MIT Media Laboratory Things That Think con-
sortium. Rana el Kaliouby’s doctoral research was supported by the Computer
Laboratory, Cambridge University. SBC was supported by the Shirley Foun-
dation during the Mindreading DVD project. We are grateful to Alea Teeters,
Matthew Goodwin, Ofer Golan, and Peter Robinson for their contribution and
valuable discussion of these ideas.

REFERENCES

AMERICAN PSYCHIATRIC ASSOCIATION. 1994. The Diagnostic and Statistical Manual of
Mental Disorders, IV. Washington, D.C.: American Psychiatric Association.
ASPERGER, H. 1991. Autistic psychopathy in childhood. Pp. 37–92 in U. Frith (ed.),
Autism and Asperger Syndrome. Cambridge, England: Cambridge University
Press.
ATTWOOD, T. 1998. Asperger’s Syndrome: A Guide for Parents and Professionals.
Philadelphia: Jessica Kingsley Publishers.
BARON-COHEN, S. 1988. Social and pragmatic deficits in autism: cognitive or affective?
Journal of Autism and Developmental Disorders 18(3), 379–402.
BARON-COHEN, S. 1995. Mindblindness. Cambridge, MA: MIT Press.
BARON-COHEN, S. 2002. The extreme male brain theory of autism. Trends in Cognitive
Science 6, 248–254.
BARON-COHEN, S. 2003. The Essential Difference: The Truth about the Male and
Female Brain. New York: Basic Books.
BARON-COHEN, S. 2004. Autism: research into causes and intervention. Paediatric
Rehabilitation 7(2), 73–78.
BARON-COHEN, S. 2006. The hyper-systemizing, assortative mating theory of autism.
Progress in Neuro-Psychopharmacology and Biological Psychiatry 30, 865–
872.
BARON-COHEN, S., and S. WHEELWRIGHT. 2004. The Empathy Quotient (EQ). An in-
vestigation of adults with Asperger Syndrome or High Functioning Autism, and
normal sex differences. Journal of Autism and Developmental Disorders 34,
163–175.
BARON-COHEN, S., S. WHEELWRIGHT, J. LAWSON, R. GRIFFIN, and J. HILL. 2002. The
exact mind: empathising and systemising in autism spectrum conditions. Pp.
491–508 in U. Goswami (ed.), Handbook of Cognitive Development. Oxford:
Blackwell.
BISHOP, J. 2003. The Internet for educating individuals with social impairments. Journal
of Computer Assisted Learning 19, 546–556.
BLACKBURN, J., K. GOTTSCHEWSKI, E. GEORGE, and L. NIKI. 2000. A discussion
about theory of mind: from an autistic perspective. Proceedings of Autism
Europe’s 6th International Congress, Glasgow, May 19–21. Available online at
http://www.autistics.org/library/AE2000-ToM.html.
BLOCHER, K., and R.W. PICARD 2002. Affective social quest: emotion recognition ther-
apy for autistic children. Chapter 16 in K. Dautenhahn, A. Bond, L. Canamero,
and B. Edmonds (eds.), Socially Intelligent Agents—Creating Relationships with
Computers and Robots. The Netherlands: Kluwer Academic Publishers.
BOWLBY, J. 1982. Attachment. New York: Basic Books.
BREAZEAL, C. 2002. Designing Sociable Robots. Cambridge, MA: MIT Press.
CHAPMAN, E., S. BARON-COHEN, B. AUYEUNG, R. KNICKMEYER, K. TAYLOR, and
G. HACKETT. 2006. Foetal testosterone and empathy: evidence from the Empathy
Quotient (EQ) and the ‘Reading the Mind in the Eyes’ Test. Social Neuroscience,
forthcoming.
CURE AUTISM NOW. 2006. Innovative technology for autism initiative. Retrieved
September 1, 2006, from http://www.cureautismnow.org/.
DANDA. 2003. The Developmental Adult Neuro-Diversity Association. Retrieved
September 1, 2006, from http://www.danda.org.uk/pages/about.htm.
DAUTENHAHN, K., A. BOND, L. CANAMERO, and B. EDMONDS, eds. 2002. Socially Intel-
ligent Agents—Creating Relationships with Computers and Robots. The Nether-
lands: Kluwer Academic Publishers.
DAVIS, M.H. 1983. Measuring individual differences in empathy: evidence for a mul-
tidimensional approach. Journal of Personality and Social Psychology 44, 113–
126.
DENNERLEIN, J., T. BECKER, P. JOHNSON, C. REYNOLDS, and R. PICARD 2003.
Frustrating computer users increases exposure to physical factors. Proceed-
ings of the XVth Triennial Congress of the International Ergonomics As-
sociation (IEA 2003), Seoul, Korea, August 24–29. Available online at
http://affect.media.mit.edu/pdfs/03.dennerlein-etal.pdf.
DENNETT, D.C. 1987. The Intentional Stance. Cambridge, MA: MIT Press.
DEPARTMENT OF HEALTH AND HUMAN SERVICES. 2004. Congressional Appropriations
Committee Report on the State of Autism Research. Retrieved August 29, 2006,
from http://www.nimh.nih.gov/autismiacc/congapprcommrep.pdf.
EISENBERG, N. 2000. Empathy and sympathy. Pp. 677–692 in M. Lewis, and J. Haviland-
Jones (eds.), Handbook of Emotions. New York: Guildford Press.
FLETCHER, P.C., F. HAPPE, U. FRITH, S.C. BAKER, R.J. DOLAN, R.S. FRACKOWIAK,
and C.D. FRITH. 1995. Other minds in the brain: a functional imaging
study of “Theory of Mind” in story comprehension. Cognition 57(2), 109–
128.
FRITH, U., ed. 1991. Autism and Asperger Syndrome. Cambridge, England: Cambridge
University Press.
FRITH, U. 2003. Autism: Explaining the Enigma, 2nd ed. Oxford: Blackwell.
GLADWELL, M. 2005. Blink: The Power of Thinking without Thinking. New York: Little,
Brown.
GOLAN, O., S. BARON-COHEN, J.J. HILL and Y. GOLAN. 2006. Reading the Mind in
Films—testing recognition of complex emotions and mental states in adults with
and without autism spectrum conditions. Social Neuroscience 1(2), 111–123.
GOODWIN, M.S., J. GRODEN, W.F. VELICER, L.P. LIPSITT, M.G. BARON, S.G. HOFMANN,
and G. GRODEN. 2006. Cardiovascular arousal in individuals with autism. Focus
on Autism and Other Developmental Disabilities 21(2), 100–123.
HAPPÉ, F. 1996. Studying weak central coherence at low levels: children with autism
do not succumb to visual illusions: a research note. Journal of Child Psychology
and Psychiatry 37, 873–877.
HARRIS, J.C. 2003. Social neuroscience, empathy, brain integration, and neurodevel-
opmental disorders. Physiology and Behavior 79, 525–531.
HEALEY, J., and R.W. PICARD 1998. StartleCam: a cybernetic wearable camera. Pp.
42–49 in Proceedings of the International Symposium on Wearable Computers,
Pittsburgh, October 19–20.
HEALEY, J., and R.W. PICARD. 2005. Detecting stress during real-world driving tasks
using physiological sensors. IEEE Transactions on Intelligent Transportation
Systems 6, 156–166.
HILL, E.L., S. BERTHOZ, and U. FRITH. 2004. Brief report: cognitive processing of own
emotions in individuals with autistic spectrum disorder and in their relatives.
Journal of Autism and Developmental Disorders 34, 229–235.
HIRSTEIN, W., P. IVERSEN, and V.S. RAMACHANDRAN. 2001. Autonomic responses of
autistic children to people and objects. Proceedings of the Royal Society 268,
1883–1888.
JOLLIFFE, T., and S. BARON-COHEN 1997. Are people with autism and Asperger syn-
drome faster than normal on the Embedded Figures Test? Journal of Child Psy-
chology and Psychiatry 38(5), 527–534.
JOSEPH, R., and H. TAGER-FLUSBERG 1997. An investigation of attention and affect in
children with autism and Down syndrome. Journal of Autism and Developmental
Disorders 27(4), 385–396.
EL KALIOUBY, R. 2005. Mind-reading machines: automated inference of complex men-
tal states. Doctoral dissertation, Computer Laboratory, University of Cambridge.
EL KALIOUBY, R., and P. ROBINSON. 2005. The emotional hearing aid: an assistive
tool for children with Asperger Syndrome. Universal Access in the Information
Society 4(2), 121–134.
EL KALIOUBY, R., A. TEETERS, and R.W. PICARD. 2006. An exploratory social-emotional
prosthetic for Autism Spectrum Disorders. Body Sensor Networks, MIT Me-
dia Lab. Available online at http://affect.media.mit.edu/pdfs/06.kaliouby-teeters-
picard-bsn.pdf.
KANNER, L. 1943. Autistic disturbance of affective contact. Nervous Child 2, 217–250.
KAPOOR, A., H. AHN, and R.W. PICARD. 2005. Mixture of gaussian processes for com-
bining multiple modalities. Pp. 86–96 in N.C. Oza, R. Polikar, J. Kittler, and F.
Roli (eds.), Multiple Classifier Systems: 6th International Workshop, MCS 2005,
Seaside, CA. Berlin: Springer.
KAPOOR, A., R.W. PICARD, and Y. IVANOV. 2004. Probabilistic combination of
multiple modalities to detect interest. International Conference on Pat-
tern Recognition, Cambridge, U.K., August 23–26. Available online at
http://affect.media.mit.edu/pdfs/04.kapoor-picard-ivanov.pdf.
KLEIN, J., Y. MOON, and R.W. PICARD. 2002. This computer responds to user frustration:
theory, design, results, and implications. Interacting with Computers 14, 119–
140.
KLIN, A., and F.R. VOLKMAR 1995. Asperger’s Syndrome: guidelines for assessment
and diagnosis. Learning Disabilities Association of America. Available online at
http://www.aspennj.org/guide.html.
KLIN, A., W. JONES, R. SCHULTZ, and F. VOLKMAR. 2003. The enactive mind, or from
actions to cognition: lessons from autism. Philosophical Transactions of the
Royal Society B(358), 345–360.
KLIN, A., W. JONES, R. SCHULTZ, F. VOLKMAR, and D. COHEN. 2002. Visual fixation
patterns during viewing of naturalistic social situations as predictors of social
competence in individuals with autism. Archives of General Psychiatry 59, 809–
816.
LECOUTEUR, A., M. RUTTER, and C. LORD. 1989. Autism diagnostic interview: a stan-
dardized investigator-based instrument. Journal of Autism and Developmental
Disorders 19(3), 363–387.
LEGOFF, D.B. 2004. Use of LEGO© as a therapeutic medium for improving social
competence. Journal of Autism and Developmental Disorders 34(5), 557–571.
LESLIE, A.M. 1987. Pretense and representation: the origins of “theory of mind.” Psy-
chological Review 94(4), 412–426.
LORD, C., M. RUTTER, and A. LE COUTEUR. 1994. Autism diagnostic interview—
revised: a revised version of a diagnostic interview for caregivers of individuals
with possible pervasive developmental disorders. Journal of Autism and Devel-
opmental Disorders 24(5), 659–685.
LORD, C., M.L. RUTTER, S. GOODE, J. HEEMSBERGEN, H. JORDAN, L. MAWHOOD, and E.
SCHOPLER. 1989. Autism diagnostic observation schedule: a standardized obser-
vation of communicative and social behavior. Journal of Autism and Develop-
mental Disorders 19(2), 185–212.
LORD, C., S. RISI, L. LAMBRECHT, E.H. COOK, Jr., B.L. LEVENTHAL, P.C. DILAVORE, A.
PICKLES, and M. RUTTER. 2000. The Autism Diagnostic Observation Schedule—
generic: a standard measure of social and communication deficits associated with
the spectrum of autism. Journal of Autism and Developmental Disorders 30(3),
205–223.
MEHRABIAN, A., and N. EPSTEIN 1972. A measure of emotional empathy. Journal of
Personality 40, 525–543.
MIXING MEMORY BLOG. 2005. Autism and theory of mind. Retrieved September 1,
2006, from http://mixingmemory.blogspot.com/2005/08/autism-and-theory-of-
mind.html.
MOORE, D., P. MCGRATH, and J. THORPE. 2000. Computer-aided learning for people with
autism—a framework for research and development. Innovations in Education
and Training International 37(3), 218–228.
OMDAHL, B.L. 1995. Cognitive Appraisal, Emotion, and Empathy. Mahwah, NJ:
Lawrence Erlbaum Associates.
PELPHREY, K.A., J.P. MORRIS, and G. MCCARTHY. 2005. Neural basis of eye gaze
processing deficits in autism. Brain 128(5), 1038–1048.
PENTLAND, A. 2006. Are we one? On the nature of human intelligence. Fifth Interna-
tional Conference on Development and Learning, Bloomington, IN, May 31–June
3. Available online at http://web.media.mit.edu/~sandy/Are-We-One-2-13-06.pdf.
PICARD, R. 1997. Affective Computing. Cambridge, MA: MIT Press.
PICARD, R., and J. SCHEIRER 2001. The Galvactivator: a glove that senses and com-
municates skin conductivity. Pp. 1538–1542 in Proceedings of the International
Conference on Human-Computer Interaction, New Orleans, August.
PICARD, R.W., and C. DU. 2002. Monitoring stress and heart health with a phone
and wearable computer. Motorola Offspring Journal 1. Available online at
http://affect.media.mit.edu/pdfs/02.picard-du.pdf.
PICARD, R.W., and J. HEALEY 1997. Affective wearables. Personal Technologies 1(4),
231–240.
PICARD, R.W., S. PAPERT, W. BENDER, B. BLUMBERG, C. BREAZEAL, D. CAVALLO, T.
MACHOVER, M. RESNICK, D. ROY, and C. STROHECKER. 2004. Affective learning—
a Manifesto. BT Technical Journal 22(4), 253–269.
POST-AUTISTIC ECONOMICS NETWORK. 2000. Post-autistic economics review. Retrieved
September 6, 2006, from http://www.paecon.net/.
PRATT, C., and P. BRYANT 1990. Young children understand that looking leads to
knowing (so long as they are looking into a single barrel). Child Development
61(4), 973–982.
QI, Y., and R.W. PICARD 2002. Context-Sensitive Bayesian Classifiers and Applications
to Mouse Pressure Pattern Classification. Proceedings of the 16th International
Conference on Pattern Recognition (ICPR’02), Quebec City, Canada.
QI, Y., C. REYNOLDS, and R.W. PICARD. 2001. The Bayes point machine for computer-
user frustration detection via pressure mouse. Proceedings of the 2001 Workshop
on Perceptive User Interfaces, Orlando, FL, November 15–16.
REYNOLDS, C.J. 2005. Adversarial Uses of Affective Computing and Ethical Implications.
Doctoral dissertation, Media Arts and Sciences, Massachusetts Institute of
Technology, Cambridge, MA. Available online at
http://alumni.media.mit.edu/~carsonr/phd_thesis/index.html.
REYNOLDS, C., and R.W. PICARD 2004. Affective sensors, privacy and ethical con-
tracts. Pp. 1103–1106 in Proceedings of the Conference on Human Factors in
Computing Systems, April 24–29, Vienna, Austria. New York: ACM.
RUBIN, P., M. HIRSCHBEIN, T. MASCIANGIOLI, T. MILLER, C. MURRAY, R.L. NORWOOD,
and J. SARGENT. 2003. The Communicator: enhancement of group communica-
tion, efficiency, and creativity. Pp. 302–307 in M.C. Roco, and W.S. Bainbridge
(eds.), Converging Technologies for Improving Human Performance: Nanotech-
nology, Biotechnology, Information Technology And Cognitive Science. Dor-
drecht, Netherlands: Kluwer.
SCASSELLATI, B. 2005. How social robots will help us to diagnose, treat, and
understand autism. 12th International Symposium of Robotics Research
(ISRR), October 12-15, San Francisco, CA. Available online at http://cs-
www.cs.yale.edu/homes/scaz/papers/Scassellati-ISRR-05-final.pdf.
SCHEIRER, J., R. FERNANDEZ, and R.W. PICARD. 1999. Expression glasses: a wearable
device for facial expression recognition. Pp. 262–263 in Proceedings of the Con-
ference on Human Factors in Computing Systems, May 15–20, Pittsburgh, PA.
New York: ACM.
SHAH, A., and U. FRITH. 1983. An islet of ability in autistic children: a research note.
Journal of Child Psychology and Psychiatry 24(4), 613–20.
SKUSE, D.H. 1997. Genetic factors in the etiology of child psychiatric disorders. Current
Opinion in Pediatrics 9(4), 354–360.
SPIRO, H. 1993. Empathy: an introduction. Pp. 1–6 in H. Spiro, M. McCrea, E. Peschel,
and D. St. James (eds.), Empathy and the Practice of Medicine. New Haven,
Connecticut: Yale University Press.
TEETERS, A., R. EL KALIOUBY, and R.W. PICARD. 2006. Self-Cam: feedback
from what would be your social partner. Proceedings of the 33rd In-
ternational Conference on Computer Graphics and Interactive Techniques
(SIGGRAPH), July 30–August 3, 2006, Boston, MA. Available online at
http://affect.media.mit.edu/pdfs/06.teeters-kaliouby-picard-siggraph.pdf.
VERTEGAAL, R., R. SLAGTER, G. VAN DER VEER, and A. NIJHOLT. 2001. Eye gaze pat-
terns in conversations: there is more to conversational agents than meets the eyes.
Pp. 301–308 in Proceedings of the SIGCHI Conference on Human Factors in
Computing Systems. New York: ACM.
VOLKMAR, F., and A. KLIN 2000. Asperger’s disorder and higher functioning autism:
same or different? International Review of Research in Mental Retardation 23,
83–110.
VOLKMAR, F.R., and L. MAYES. 1991. Gaze behaviour in autism. Development and
Psychopathology 2, 61–69.
WEGNER, P. 1997. Why interaction is more powerful than algorithms. Communications
of the ACM 40(5), 80–91.
WELLMAN, H.M. 1992. The Child’s Theory of Mind. Cambridge, MA: MIT Press.
WHITEN, A., ed. 1991. Natural Theories of Mind: Evolution, Development, and Simu-
lation of Everyday Mindreading. Cambridge, MA: B. Blackwell.
WIMMER, H., and J. PERNER. 1983. Beliefs about beliefs: representation and constrain-
ing function of wrong beliefs in young children’s understanding of deception.
Cognition 13(1), 103–28.
WURTMAN, J.J., and M. DANBROT. 1988. Managing Your Mind and Mood Through
Food. New York: Perennial Library.
ZACKS, J.M., B. TVERSKY, and G. IYER. 2001. Perceiving, remembering, and commu-
nicating structure in events. Journal of Experimental Psychology 130(1), 29–58.
