The Human Body as Touchscreen Replacement


by JAKOB NIELSEN on July 22, 2013

Topics: Human Computer Interaction, Trip Reports

Summary: When you touch your own body, you feel exactly what you touch: better feedback than any external device. And you never forget to bring your body.
Many new user interfaces go beyond a flat computer screen as the locus of the user
experience. For example, Microsoft Kinect makes the entire room into the interaction
space, with the user's body movements becoming the system commands. Augmented
reality systems (such as some Google Glass applications) project their output onto
real-world objects, for example using temperature readouts to color-code airplane engine
parts so that they're easier for repair technicians to see.
At the recent CHI 2013 research conference in Paris, I was particularly impressed with
two ideas to use the human body itself as an integrated component of the user
interface. The user's own body is unique relative to all other devices in one key aspect:
you can feel when your body is being touched.
This sense of being touched (or not) provides important feedback that continuously
informs users about what is happening. Because feedback is one of the oldest and
most important usability heuristics, enhancing it is usually good.

The Hand as Input Device


Sean Gustafson, Bernhard Rabe, and Patrick Baudisch from the Hasso Plattner
Institute in Germany designed a so-called imaginary interface situated within the
palm of the user's hand. This UI is "imaginary" in the sense that there's nothing actually
there beyond the naked hand. The photo below shows how an imaginary "mobile
phone" could be fitted onto the user's left hand. As each point is touched, a specific
mobile function would be activated and announced by a computerized voice.

Touching specific spots on your own hand enters the commands.
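
For a sense of how such a system might be wired together, here is a minimal sketch in Python of the kind of dispatch logic involved: a touched palm region is looked up and the assigned function is announced by voice. The region names, the PALM_LAYOUT mapping, the on_palm_touch handler, and the speak helper are all illustrative assumptions, not part of Gustafson's prototype.

```python
# Illustrative sketch (not Gustafson's implementation): map touched palm
# regions to phone functions and announce them with a computerized voice.
PALM_LAYOUT = {
    "index_finger_tip": "open phone app",
    "index_finger_base": "open messages",
    "middle_finger_tip": "open calendar",
    "palm_center": "go to home screen",
}

def speak(text: str) -> None:
    # Stand-in for a text-to-speech announcement through an earbud.
    print(f"[voice] {text}")

def on_palm_touch(region: str) -> None:
    """Announce and trigger the function assigned to the touched region."""
    command = PALM_LAYOUT.get(region)
    if command is None:
        speak("no function assigned here")
        return
    speak(command)
    # ...the actual phone function would be invoked here...

# Example: a (hypothetical) motion tracker reports which region was touched.
on_palm_touch("index_finger_base")  # -> [voice] open messages
```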

For this system to actually work, there must be some way for users to hear the
computer, such as using an earbud. More important, there must be a way for the
computer to know what part of the hand the user is touching. In Gustafson's research
prototype, this was done with an optical motion tracker that seemed too clunky for
real-world applications. But that's okay; we can easily imagine advances in gesture
recognition that would be more portable and less intrusive.

Although you won't be able to buy a hand-phone anytime soon, it's definitely
interesting to consider the potential of interfaces where users touch themselves instead
of a screen.
Gustafson and colleagues performed several interesting experiments to determine how
well people can use their self-touch UI. Under normal use, people were about equally
fast selecting functions from a regular touchscreen phone and from the palm-based
system. However, blindfolded users were almost twice as fast when touching
themselves as when touching the glass surface of the phone.
Obviously, we don't want to blindfold users, though information about nonsighted use is
interesting for accessibility reasons and for conditions in which people can't look at the
phone.
The most interesting aspect of the finding about blindfolded use is that there is
something special about touching a hand rather than a phone that makes
users depend less on their sight. To tease out why, the researchers tested several
additional conditions:
- A phone that provided tactile feedback when touched, rather than a stiff pane of glass. Using the tactile phone, users were 17% faster, though the difference wasn't statistically significant given the study's sample size.
- Having users wear a finger cover to remove the finger's sense of touch. This made no appreciable difference.
- Having users touch a fake hand, rather than their own, to remove the palm's sense of touch. This condition slowed users down by 30%.
Combining these findings, it's clear that the key benefit of using the hand as a
"touchscreen" is that you can feel when and where you're being touched. Indeed,
that's a unique benefit of using your own body as an input device, a benefit that
external devices can't replicate.

The Ear as Input Device


Usually, we use our ears to listen. In the terminology of human-computer interaction,
this means that the ears are used to consume output from the computer.

But the ear's surface can also be used for input: to communicate commands from
the user to the computer.

Roman Lissermann, Jochen Huber, Aristotelis Hadjakos, and Max Mühlhäuser from
the Technical University of Darmstadt (also in Germany) presented a research
prototype called "EarPut" to do just that. Among other benefits, your ear is always in
the same place; touching your ear is also slightly less obtrusive than touching your
hand.
Possible interactions include (a hypothetical mapping to player commands is sketched after this list):
- Touching part of the ear surface, with either single- or multi-touch.
- Tugging an earlobe. This interaction is particularly suited for on-off commands, such as muting a music player.
- Sliding a finger up or down along the ear arc. This might work well for adjusting volume up or down.
- Covering the ear, certainly a natural gesture for "mute."
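
As a concrete illustration of how such gestures might drive an application, here is a minimal Python sketch that maps EarPut-style gestures to media-player commands. The gesture names and the MediaPlayer class are assumptions made for illustration; they are not taken from the EarPut prototype.

```python
# Illustrative sketch (not the EarPut implementation): map recognized ear
# gestures to simple media-player commands such as mute and volume.
class MediaPlayer:
    def __init__(self) -> None:
        self.volume = 5      # scale of 0..10
        self.muted = False

    def toggle_mute(self) -> None:
        self.muted = not self.muted

    def change_volume(self, delta: int) -> None:
        self.volume = max(0, min(10, self.volume + delta))

def handle_ear_gesture(player: MediaPlayer, gesture: str) -> None:
    """Dispatch one recognized ear gesture to a player command."""
    if gesture == "earlobe_tug":      # well suited to on-off commands
        player.toggle_mute()
    elif gesture == "slide_up":       # sliding along the ear arc
        player.change_volume(+1)
    elif gesture == "slide_down":
        player.change_volume(-1)
    elif gesture == "cover_ear":      # a natural gesture for "mute"
        player.muted = True

player = MediaPlayer()
handle_ear_gesture(player, "slide_up")     # volume: 5 -> 6
handle_ear_gesture(player, "earlobe_tug")  # muted: False -> True
```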
To measure how precisely people can touch their own ears in a simple single-touch
interaction, Lissermann and colleagues instrumented 27 users' ears. When they
divided the ear arc into only 2 regions, participants achieved 99% accuracy. When a
3rd region was introduced, however, accuracy dropped substantially: people were still
highly accurate when touching the top or bottom of their ear, but only 63% accurate
when touching the middle.
Although 63% accuracy sounds good (after all, it's better than half), it's
unacceptable for most user interface commands. Just think about using the ear to
activate the 3 most common email commands: reply to sender, reply to all, and
forward. Would you want to send your message to the wrong recipients 1/3 of the
time?
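
As a back-of-the-envelope illustration (a calculation made here for emphasis, not data reported by the researchers): at 63% accuracy for the middle region, roughly a third of the touches aimed at the command placed there would trigger a neighboring command instead.

```python
# Back-of-the-envelope illustration, not data from the EarPut study:
# at 63% accuracy, how many of 100 touches aimed at the middle region
# would land on a neighboring command instead?
accuracy = 0.63
attempts = 100
misfires = attempts * (1 - accuracy)
print(f"{misfires:.0f} of {attempts} touches would hit the wrong command")  # ~37
```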
As this research shows, ear-driven input is best for situations with an extremely limited
number of commands. It might also be useful for applications in which accidentally
executing a neighboring command is not a big deal; even when dividing the ear arc
into 6 regions, users still achieved fairly high accuracy.

EarPut prototype from the Technical University of Darmstadt.


As the above image shows, in this early research, the prototype hardware is somewhat
reminiscent of the Borg from Star Trek, and most people wouldn't want to wear it on
their ears unless they were paid study participants. But it's easy to imagine smaller,
lighter, and more elegant hardware in the future.

Ubiquitous User Interfaces


In addition to offering nearly fail-proof feedback, using body parts as input devices
has another distinct advantage: the device is literally always with you, because
your body is you.
Yes, people often carry their mobile phones, but they'll never be without their hands or
their ears. Thus they'll never be without system functions that have been assigned to
their hands or ears.


Of course, this statement is true only if users are within range of a sensor that lets the
computer know when they're touching a designated body part. So, maybe you do have
to carry around a small device attached to your ear; or maybe, in the future,
body-based interaction could be mediated through nanobots that you swallow once
and for all. Another option is to saturate the environment with surveillance cameras,
though (currently, at least) many people would oppose this for privacy reasons.
Although these technical obstacles remain to be solved, it's reasonable to expect that
user interfaces might be at least partly body-based in 20 or 30 years.

Learn More

Training Courses
- UX Basic Training
- Interaction Design: 3-Day Course
- User Interface Principles Every Designer Must Know

Articles
- Usability 101: Introduction to Usability
- Mental Models
- 10 Usability Heuristics for User Interface Design
- Form Design Quick Fix: Group Form Elements Effectively Using White Space
- Interaction Cost: Definition