Address for correspondence: Rosalind W. Picard, Sc.D., FIEEE, MIT Media Laboratory, E15-448,
20 Ames Street, Cambridge, MA 02142-1308. Voice: 617-253-0611; fax: 617-253-5922.
e-mail: picard@media.mit.edu
doi: 10.1196/annals.1382.016
EL KALIOUBY et al.: AFFECTIVE COMPUTING AND AUTISM 229
social interaction and lacking people intuition. These difficulties vary with the
severity of the condition, and include difficulty reading other people's nonverbal
cues and mental states (Joseph and Tager-Flusberg 1997; Frith 2003),
atypical gaze processing (Volkmar and Mayes 1991; Klin et al. 2002; Pelphrey
et al. 2005), restricted emotional expression (Hill et al. 2004), difficulties
gauging the interests of others in conversation (Fletcher et al. 1995; Volkmar
and Klin 2000), and frequently launching into monologues about narrowly de-
fined and often highly technical interests, such as railway tables or maps (Klin
and Volkmar 1995).
Over the past 10 years, researchers in affective computing (Picard 1997) have
begun to develop technologies that advance our understanding of or approach
to affective neuroscience and autism. Affective computing has contributed to
these fields in at least four ways: (i) designing novel sensors and machine learning
algorithms that analyze multimodal channels of affective information, such as
facial expressions, gaze, tone of voice, gestures, and physiology; (ii) creating
new techniques to infer a person’s affective or cognitive state (e.g., confu-
sion, frustration, stress, interest, and boredom); (iii) developing machines that
respond affectively and adaptively to a person's state; and (iv) inventing personal
technologies for improving awareness of affective states and their selective
communication to others.
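As a purely illustrative sketch of point (ii), inferring a cognitive state such as engagement from several affective channels can be framed as combining per-channel evidence. The channel names, weights, and threshold below are invented for this example and are not drawn from any system described in this article:

```python
# Minimal sketch of multimodal affect inference: combine hypothetical
# per-channel evidence (each in [0, 1]) into one state estimate.
# Channel names, weights, and the 0.5 threshold are illustrative only.

CHANNEL_WEIGHTS = {"face": 0.4, "gaze": 0.35, "posture": 0.25}

def infer_engagement(scores):
    """scores: dict mapping channel -> evidence that the learner is engaged.
    Missing channels default to a neutral 0.5."""
    s = sum(w * scores.get(c, 0.5) for c, w in CHANNEL_WEIGHTS.items())
    return "interest" if s >= 0.5 else "boredom"

print(infer_engagement({"face": 0.9, "gaze": 0.8, "posture": 0.7}))  # interest
print(infer_engagement({"face": 0.2, "gaze": 0.1, "posture": 0.3}))  # boredom
```

Real systems (e.g., Kapoor et al. 2004, 2005) learn such combinations probabilistically rather than using fixed weights.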
While much of the work in affective computing has been motivated by the
goal of giving future robots and computational agents socioemotional skills,
its researchers have also recognized that they face similar challenges to those
who try to help people with autism improve such skills. Computers, like most
people with autism, do not naturally have the ability to interpret socioaffective
cues, such as tone of voice or facial expression. Similarly, computers do not
naturally have common sense about people and the way they operate. When
people or machines fail to perceive, understand, and act on socioemotional
cues, they struggle to decide when to approach someone, when to interrupt,
or when to wind down an interaction, which limits their ability
to interact with others. A large part of natural learning involves reading and
responding to socioemotional cues, so this deficit also interferes with the ability
to learn from others. The field of affective computing aims to change the nature
of technology so that it can sense, respond to, and communicate this information.
In so doing, the field has a lot to learn from people with autism, from progress
they have made, and from the friends, families, and staff who work with these
individuals. We should point out that we are not using autism as a metaphor,
unlike the postautistic economics network (Post-Autistic Economics Network
2000) or Wegner’s (1997) description of autistic algorithms. Our use of autism
is restricted to the clinical definition.
AFFECT SENSING
posture, to infer affective states, such as the level of engagement (interest versus
boredom), of learners (Kapoor et al. 2004, 2005).
Another challenge is generalization. Generalization is the capacity to apply
knowledge from one context to new contexts. In autism, it is uncertain whether,
with existing interventions, individuals can successfully transfer the
knowledge they acquire. Computers also have problems generalizing from the
examples they were trained on to new, unseen information.
The field of machine learning is perpetually trying to improve the ability
of computers to generalize. With the advent of more robots and agents that
will interact with people, there is increased interest in enabling machines to
learn better by learning from people in natural colearning situations, not just
from people who “program” the computer or robot (Breazeal 2002). But such
learning again requires socioemotional skills, such as the ability to see if the
person teaching you is shaking their head and frowning.
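The notion of generalization discussed above can be made concrete with a toy example: a learner is fit on labeled examples and then judged only on inputs it never saw. The nearest-centroid classifier and the synthetic "expression feature" data here are invented for illustration:

```python
# Toy illustration of machine-learning generalization: fit a
# nearest-centroid classifier on training points, then evaluate it
# on unseen points. Data, labels, and features are synthetic.

def centroid(points):
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def fit(train):
    by_label = {}
    for x, y in train:
        by_label.setdefault(y, []).append(x)
    return {y: centroid(xs) for y, xs in by_label.items()}

def predict(model, x):
    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return min(model, key=lambda y: dist2(model[y], x))

train = [((0.1, 0.2), "frown"), ((0.0, 0.3), "frown"),
         ((0.9, 0.8), "smile"), ((1.0, 0.7), "smile")]
model = fit(train)

# Generalization is measured only on points the learner never saw:
print(predict(model, (0.05, 0.25)))  # frown
print(predict(model, (0.95, 0.9)))   # smile
```

Whether the learner succeeds on the unseen points, rather than on its training data, is what the field tries to improve.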
As part of what we call the socioemotional intelligence prosthesis, we are
exploring new kinds of systems that learn with people through natural interac-
tion (el Kaliouby and Robinson 2005; el Kaliouby et al. 2006). The intelligent
system we aim to build is a colearner with the person with ASC, jointly trying to
learn how to recognize and respond to socioemotional cues (Picard et al. 2004).
One possibility for such a system is to exploit this common learning goal and
perhaps play games with the individual, to assist him or her with continuously
learning to generalize, occasionally bringing in non-ASC experts for corrective
feedback and validation. The non-ASC person(s) could be present physically,
or remotely connecting in through the technology, to help with the learning
process. Social or emotion tagging, in which the parents and/or caregivers
of a child with autism accompany the child and "tag" events with social
labels, is a promising approach, albeit an expensive and impractical one. Through the
use of a head-mounted wearable camera/microphone, parents and caregivers
could “eyejack” the child’s visual field and tag the world remotely. This is both
practical and cost-effective; it allows the child to be more independent, while
continuing to enable parents and caregivers to share experiences with the child
and help with learning.
Many persons with ASC prefer to communicate with and through computers
because computers are predictable and impose some control on the otherwise chaotic
social world (Moore et al. 2000). How can we harness this interest in technology
to systemize the social world? For young children and those at the lower
end of the autism spectrum, sociable robots and dolls are a good approach
to helping develop social interaction skills. The use of robots allows for a simplified,
predictable, and reliable environment where the complexity of interaction can
be controlled and gradually increased. It is also more realistic and engaging
than interacting with a screen. The Affective Social Quotient project is one
of the early projects at MIT Media Lab to develop assistive technologies for
autism using physical input devices, namely four dolls (stuffed dwarfs), which
appeared to be happy, angry, sad, or surprised (Blocher and Picard 2002). The
system would play short digital videos that embody one of the four emotions,
and then encourage the child to choose the dwarf that went with the appropri-
ate emotion. When the child picked up the stuffed toy, the system identified
its infrared signal and responded. Use of the dolls as physical input devices
also encouraged development of joint attention and turn-taking skills, because
typically another person was present during the session. Other robot platforms
have been used for autism intervention, encouraging social behavior, such as
turn-taking and shared attention (Dautenhahn et al. 2002; Scassellati 2005).
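The doll-matching interaction described above can be sketched in a few lines. The IR codes and feedback strings below are invented for illustration; the original Affective Social Quotient hardware and software details differ:

```python
# Sketch of the doll-matching loop in the spirit of the Affective
# Social Quotient project: the system shows a video of an emotion,
# the child picks up a doll, and the doll is identified by an
# infrared code. IR codes and feedback wording are invented here.

DOLL_BY_IR_CODE = {0x01: "happy", 0x02: "angry", 0x03: "sad", 0x04: "surprised"}

def respond(target_emotion, ir_code):
    """Return feedback after a doll pickup is detected via its IR code."""
    picked = DOLL_BY_IR_CODE.get(ir_code)
    if picked is None:
        return "no doll detected"
    if picked == target_emotion:
        return f"yes, the {picked} dwarf matches the video!"
    return f"that is the {picked} dwarf; try again"

print(respond("sad", 0x03))  # correct match
print(respond("sad", 0x01))  # mismatch, prompt retry
```

The pedagogical point is the contingent, immediate feedback tied to a physical object, not the specific code.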
Robotics may also be useful for individuals at the higher end of the autism
spectrum, who need help with subtle, real-time social interactions.
One can imagine a variation of LEGO—already known to be helpful as an
intervention in autism (LeGoff 2004)—that combines rules and mechanics to
allow for social explorations. Robotics could also be used by groups of children
for improvising and directing play, encouraging turn-taking among children.
Affect sensing and affect recognition are readily applicable
to autism interventions. These technologies can
238 ANNALS NEW YORK ACADEMY OF SCIENCES
help increase self-awareness and provide novel ways for self-monitoring. One
of the first problems we encountered when having a person wear a camera with
software to interpret the facial expressions of a conversational partner was that
the person with ASC might not even look at the face of the other person. Thus,
the wearable camera might point at the floor or at a shirt pocket, or a nearby
object instead of at the face that needs to be read. One possible solution is
via a device, such as the eye contact sensing glasses (Vertegaal et al. 2001),
wearable glasses that recognize when a user is in eye contact with another
person. These glasses can be used to measure the magnitude and dynamics
of eye contact in people with ASC, and these patterns can then be compared to
eye contact in people without ASC. The glasses could also serve as an intervention
to encourage people with ASC to pay more attention to the face.
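As a sketch of what "magnitude and dynamics" of eye contact might mean computationally, suppose the glasses emit one boolean sample per frame. The summary statistics below (overall fraction and per-episode durations) are our illustrative choice, not the published method of Vertegaal et al.:

```python
# Hypothetical post-processing of an eye-contact sensor stream:
# one boolean sample per frame. We summarize the magnitude
# (fraction of frames in eye contact) and dynamics (length of
# each contiguous eye-contact episode, in frames).

def summarize(samples):
    fraction = sum(samples) / len(samples)
    episodes, run = [], 0
    for s in samples:
        if s:
            run += 1
        elif run:
            episodes.append(run)
            run = 0
    if run:                      # close a trailing episode
        episodes.append(run)
    return {"fraction": fraction, "episodes": episodes}

stats = summarize([0, 1, 1, 1, 0, 0, 1, 1, 0, 0])
print(stats)  # {'fraction': 0.5, 'episodes': [3, 2]}
```

Such summaries would make it straightforward to compare eye-contact patterns between groups or across sessions.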
In some cases there are privacy concerns with wearing a camera that records
those around you. Out of such concerns, Alea Teeters in our lab developed
the Self-Cam, shown in FIGURE 3 (Teeters et al. 2006). Self-Cam is a small,
lightweight video camera that is worn over the chest and points at one’s face.
According to the E-S theory of sex differences, there are at least three types
of brain, derived from two orthogonal dimensions, empathizing and systemizing
(Baron-Cohen 2002), diagrammed in FIGURE 4 (numbers are standard
deviations from the mean). The first type is characterized by systemizing being
stronger than empathizing, a profile more common in males. The second type
has systemizing and empathizing balanced. The third type is characterized by
empathizing being stronger than systemizing, a profile more common in females.
Similarly, a person who is preoccupied may fail to notice the nonverbal cues
of others, misreading their mental state.
The technologies that improve empathizing in autism should in theory also
contribute to improvements in these skills in the general population. As Mal-
colm Gladwell shows in Blink, a person’s knowledge base of people intuition
can be broadened, affecting one’s ability to make accurate snap judgments
(Gladwell 2003). Technology may augment people’s capacity to empathize and
improve their people intuition (whether or not they are diagnosed with autism)
in at least three ways: increased self-awareness, improved communication with
others, and better social learning. For instance, a wearable system that contin-
uously measures stress or anxiety signals can help the wearer regulate arousal,
raising self-awareness and encouraging people to switch perspectives under
conditions of high arousal. Another application is a personal anger manage-
ment wearable system that would detect states, such as anger, and attempt to
calm the wearer, perhaps even through empathizing verbally with the wearer
(Klein et al. 2002). Technologies that sense various aspects of the person’s
affective and physiological state can be used for self-monitoring. Making this
knowledge available in a simplified, easy-to-visualize form can be a strong
motivator for changing habits. (It has been shown, for instance, that daily
self-weighing is a strong motivator for losing weight.)
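One simple way such a wearable could flag rising arousal is to compare each new sample against a running baseline of recent ones. The window size, threshold ratio, and synthetic skin-conductance values below are arbitrary illustrative choices, not validated parameters:

```python
# Illustrative arousal alert: flag samples that rise well above a
# running baseline of a (synthetic) skin-conductance signal.
# window and ratio are arbitrary choices for this sketch.

def alerts(signal, window=3, ratio=1.5):
    """Return one boolean per sample after the warm-up window:
    True when the sample exceeds ratio * mean of the previous window."""
    out = []
    for i in range(window, len(signal)):
        baseline = sum(signal[i - window:i]) / window
        out.append(signal[i] > ratio * baseline)
    return out

sc = [2.0, 2.1, 2.0, 2.1, 5.0, 2.2]  # synthetic readings
print(alerts(sc))  # [False, True, False]
```

A deployed system would of course use validated physiological models; the point here is only that a self-monitoring cue can be computed continuously from a simple running statistic.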
This information about oneself could also be selectively communicated to
others to enhance group communication. An example is the Communicator
system (Rubin et al. 2003), which uses a combination of nano- and information
technologies to let individuals carry electronically stored information
about themselves, such as interests, background, and affective state, that can
be broadcast as needed in group situations. Participants would have the
ability to define or restrict the kinds of information about themselves that they
would be willing to share with other members of the group.
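The selective-disclosure idea can be sketched as a per-field sharing policy that filters a profile before it is broadcast. The field names and policy labels below are invented for this example and are not from the Communicator system itself:

```python
# Sketch of selective disclosure: each profile field carries a
# sharing policy, and only fields whose policy permits the current
# audience are broadcast. Field names and policies are illustrative.

PROFILE = {
    "interests": ("robotics, chess", "public"),
    "background": ("engineer", "group"),
    "affective_state": ("stressed", "private"),
}

# Which policies each audience is allowed to see.
ALLOWED = {"public": {"public"}, "group": {"public", "group"}}

def broadcast(profile, audience):
    """Return only the fields this audience is permitted to receive."""
    return {k: v for k, (v, policy) in profile.items()
            if policy in ALLOWED.get(audience, set())}

print(broadcast(PROFILE, "group"))
# {'interests': 'robotics, chess', 'background': 'engineer'}
```

Note that the "private" affective state never leaves the device; the participant, not the group, defines each field's policy.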
ETHICAL CONSIDERATIONS
Along with the potential benefits, we recognize that there are important
ethical considerations that arise with the development of these technologies.
An exhaustive discussion of these ethical considerations is beyond the scope
of this article; instead, we offer a few examples that highlight the
importance of being sensitive to the needs of the end-users of this technology,
be they autism researchers and practitioners, or individuals diagnosed with
autism and their families.
Besides the privacy issues of sensing and broadcasting affective state in-
formation (Reynolds and Picard 2004), one issue to consider is whether in-
dividuals with autism need treatment or technology “fixes” at all. We agree
that ASC involve a different cognitive style, allowing many individuals with
autism to focus deeply on a given subject, which can lead to original thought.
We thus prefer to design technologies that do not try to “fix” people, but rather
that can be used by individuals to augment or further develop their natural
abilities. If these new technologies hold the promise of improving empathy,
this should only be undertaken with the individual’s consent, where it is pos-
sible to obtain it, or with their parent’s consent in the case of a young child.
Unlike medical interventions, where there is a risk of unwanted side effects,
affective computing-based interventions may have highly specific effects (on
empathy) while leaving other domains (e.g., systemizing) unaffected. We adopt
a user-centered design and development approach to ensure that individuals
with autism and their caregivers are involved in developing the intervention
technologies that they need most.
Another ethical consideration is whether exposing affective state informa-
tion creates opportunities for others to manipulate one’s behavior and thoughts
using this information (see Reynolds (2005) for examples). Even in situations
where the use of technology is honest, there are still potential concerns. If
an individual with autism wears an assistive system that senses the affective
state of others, this could raise the expectations of interaction partners,
increasing (rather than decreasing) the social pressures on the person with
autism to respond to these cues in real time. Such a system might be more
burdensome than helpful. It is essential that researchers address these con-
siderations and explore the potential opportunities brought by a convergence
in autism research and affective computing with open-mindedness about the
possible successes or failures of such an approach.
CONCLUSION
ACKNOWLEDGMENT
REFERENCES
HAPPÉ, F. 1996. Studying weak central coherence at low levels: children with autism
do not succumb to visual illusions: a research note. Journal of Child Psychology
and Psychiatry 37, 873–877.
HARRIS, J.C. 2003. Social neuroscience, empathy, brain integration, and neurodevel-
opmental disorders. Physiology and Behavior 79, 525–531.
HEALEY, J., and R.W. PICARD. 1998. StartleCam: a cybernetic wearable camera. Pp.
42–49 in Proceedings of the International Symposium on Wearable Computers,
Pittsburgh, October 19–20.
HEALEY, J., and R.W. PICARD. 2005. Detecting stress during real-world driving tasks
using physiological sensors. IEEE Transactions on Intelligent Transportation
Systems 6, 156–166.
HILL, E.L., S. BERTHOZ, and U. FRITH. 2004. Brief report: cognitive processing of own
emotions in individuals with autistic spectrum disorder and in their relatives.
Journal of Autism and Developmental Disorders 34, 229–235.
HIRSTEIN, W., P. IVERSEN, and V.S. RAMACHANDRAN. 2001. Autonomic responses of
autistic children to people and objects. Proceedings of the Royal Society of London B 268,
1883–1888.
JOLLIFFE, T., and S. BARON-COHEN. 1997. Are people with autism and Asperger syn-
drome faster than normal on the Embedded Figures Test? Journal of Child Psy-
chology and Psychiatry 38(5), 527–534.
JOSEPH, R., and H. TAGER-FLUSBERG. 1997. An investigation of attention and affect in
children with autism and Down syndrome. Journal of Autism and Developmental
Disorders 27(4), 385–396.
EL KALIOUBY, R. 2005. Mind-reading machines: automated inference of complex mental
states. PhD thesis, Computer Laboratory, University of Cambridge.
EL KALIOUBY, R., and P. ROBINSON. 2005. The emotional hearing aid: an assistive
tool for children with Asperger Syndrome. Universal Access in the Information
Society 4(2), 121–134.
EL KALIOUBY, R., A. TEETERS, and R.W. PICARD. 2006. An exploratory social-emotional
prosthetic for Autism Spectrum Disorders. Body Sensor Networks, MIT Me-
dia Lab. Available online at http://affect.media.mit.edu/pdfs/06.kaliouby-teeters-
picard-bsn.pdf.
KANNER, L. 1943. Autistic disturbances of affective contact. Nervous Child 2, 217–250.
KAPOOR, A., H. AHN, and R.W. PICARD. 2005. Mixture of Gaussian processes for com-
bining multiple modalities. Pp. 86–96 in N.C. Oza, R. Polikar, J. Kittler, and F.
Roli (eds.), Multiple Classifier Systems: 6th International Workshop, MCS 2005,
Seaside, CA. Berlin: Springer.
KAPOOR, A., R.W. PICARD, and Y. IVANOV. 2004. Probabilistic combination of
multiple modalities to detect interest. International Conference on Pat-
tern Recognition, Cambridge, U.K., August 23–26. Available online at
http://affect.media.mit.edu/pdfs/04.kapoor-picard-ivanov.pdf.
KLEIN, J., Y. MOON, and R.W. PICARD. 2002. This computer responds to user frustration:
theory, design, results, and implications. Interacting with Computers 14, 119–
140.
KLIN, A., and F.R. VOLKMAR. 1995. Asperger's Syndrome: guidelines for assessment
and diagnosis. Learning Disabilities Association of America. Available online at
http://www.aspennj.org/guide.html.
KLIN, A., W. JONES, R. SCHULTZ, and F. VOLKMAR. 2003. The enactive mind, or from
actions to cognition: lessons from autism. Philosophical Transactions of the
Royal Society B 358, 345–360.
KLIN, A., W. JONES, R. SCHULTZ, F. VOLKMAR, and D. COHEN. 2002. Visual fixation
patterns during viewing of naturalistic social situations as predictors of social
competence in individuals with autism. Archives of General Psychiatry 59, 809–
816.
LE COUTEUR, A., M. RUTTER, and C. LORD. 1989. Autism diagnostic interview: a stan-
dardized investigator-based instrument. Journal of Autism and Developmental
Disorders 19(3), 363–387.
LEGOFF, D.B. 2004. Use of LEGO© as a therapeutic medium for improving social
competence. Journal of Autism and Developmental Disorders 34(5), 557–571.
LESLIE, A.M. 1987. Pretense and representation: the origins of “theory of mind.” Psy-
chological Review 94(4), 412–426.
LORD, C., M. RUTTER, and A. LE COUTEUR. 1994. Autism diagnostic interview—
revised: a revised version of a diagnostic interview for caregivers of individuals
with possible pervasive developmental disorders. Journal of Autism and Devel-
opmental Disorders 24(5), 659–685.
LORD, C., M.L. RUTTER, S. GOODE, J. HEEMSBERGEN, H. JORDAN, L. MAWHOOD, and E.
SCHOPLER. 1989. Autism diagnostic observation schedule: a standardized obser-
vation of communicative and social behavior. Journal of Autism and Develop-
mental Disorders 19(2), 185–212.
LORD, C., S. RISI, L. LAMBRECHT, E.H. COOK, Jr., B.L. LEVENTHAL, P.C. DILAVORE, A.
PICKLES, and M. RUTTER. 2000. The Autism Diagnostic Observation Schedule—
generic: a standard measure of social and communication deficits associated with
the spectrum of autism. Journal of Autism and Developmental Disorders 30(3),
205–223.
MEHRABIAN, A., and N. EPSTEIN. 1972. A measure of emotional empathy. Journal of
Personality 40, 525–543.
MIXING MEMORY BLOG. 2005. Autism and theory of mind. Retrieved September 1,
2006, from http://mixingmemory.blogspot.com/2005/08/autism-and-theory-of-
mind.html.
MOORE, D., P. MCGRATH, and J. THORPE. 2000. Computer-aided learning for people with
autism—a framework for research and development. Innovations in Education
and Training International 37(3), 218–228.
OMDAHL, B.L. 1995. Cognitive Appraisal, Emotion, and Empathy. Mahwah, NJ:
Lawrence Erlbaum Associates.
PELPHREY, K.A., J.P. MORRIS, and G. MCCARTHY. 2005. Neural basis of eye gaze
processing deficits in autism. Brain 128(5), 1038–1048.
PENTLAND, A. 2006. Are we one? On the nature of human intelligence. Fifth Interna-
tional Conference on Development and Learning, Bloomington, IN, May 31–June
3. Available online at http://web.media.mit.edu/~sandy/Are-We-One-2-13-06.pdf.
PICARD, R. 1997. Affective Computing. Cambridge, MA: MIT Press.
PICARD, R., and J. SCHEIRER. 2001. The Galvactivator: a glove that senses and com-
municates skin conductivity. Pp. 1538–1542 in Proceedings of the International
Conference on Human-Computer Interaction, New Orleans, August.
PICARD, R.W., and C. DU. 2002. Monitoring stress and heart health with a phone
and wearable computer. Motorola Offspring Journal 1. Available online at
http://affect.media.mit.edu/pdfs/02.picard-du.pdf.
PICARD, R.W., and J. HEALEY. 1997. Affective wearables. Personal Technologies 1(4),
231–240.
PICARD, R.W., S. PAPERT, W. BENDER, B. BLUMBERG, C. BREAZEAL, D. CAVALLO, T.
MACHOVER, M. RESNICK, D. ROY, and C. STROHECKER. 2004. Affective learning—
a Manifesto. BT Technical Journal 22(4), 253–269.
VOLKMAR, F.R., and L. MAYES. 1991. Gaze behaviour in autism. Development and
Psychopathology 2, 61–69.
WEGNER, P. 1997. Why interaction is more powerful than algorithms. Communications
of the ACM 40(5), 80–91.
WELLMAN, H.M. 1992. The Child’s Theory of Mind. Cambridge, MA: MIT Press.
WHITEN, A., ed. 1991. Natural Theories of Mind: Evolution, Development, and Simu-
lation of Everyday Mindreading. Cambridge, MA: B. Blackwell.
WIMMER, H., and J. PERNER. 1983. Beliefs about beliefs: representation and constrain-
ing function of wrong beliefs in young children’s understanding of deception.
Cognition 13(1), 103–128.
WURTMAN, J.J., and M. DANBROT. 1988. Managing Your Mind and Mood Through
Food. New York: Perennial Library.
ZACKS, J.M., B. TVERSKY, and G. IYER. 2001. Perceiving, remembering, and commu-
nicating structure in events. Journal of Experimental Psychology: General 130(1), 29–58.