Creating Moral Buffers in Weapon Control Interface Design

M.L. CUMMINGS

0278-0079/04/$20.00 ©2004 IEEE
FALL 2004
ed actions. Thomas Sheridan, a noted researcher in the field of supervisory behavior and automation, has
expressed concern that operators
interfacing with technology could
have the tendency to trust technology
without question and abandon responsibility for their own actions [17].
This paper will examine the ethical and social issues surrounding the design of human-computer interfaces for the control of highly autonomous weapons systems. With improvements in global positioning satellite communications and control algorithms, it is currently possible to launch weapons such as medium- and short-range missiles and redirect them during flight, in a matter of minutes, to emerging targets. These
systems are highly autonomous in that they do not need constant human supervision and manual control to operate; indeed, they require human intervention only when a change in a goal state is needed. In addition, in the not-so-distant future, the United States military envisions that it will be
able to deploy swarms of flying
robots for reconnaissance and prosecution of potential threats. In the
swarming vision [19], a group of
highly autonomous unmanned aerial vehicles (UAVs) will be able to
communicate amongst themselves
to determine the best course of
action with cursory input from
human agents. In the swarm concept, human decision makers are
removed even further from the decision-making loop.
The implementation of these current and future smart weapons systems means not only that battlefield commanders will have more flexibility and options, but also that a more abstract layer of human cognitive control will be needed where none previously existed. In
place of manually controlling or constantly monitoring a weapon state,
commanders will now be asked to
make near-instantaneous decisions
about networks of smart weapons
that can easily generate more information than a human can process. As
the human is further removed from
direct manual control of a system,
the decision-making process becomes more abstract and often difficult for the human to grasp. For example, UAVs are currently flown by a pilot, and this direct manual control keeps the pilot in tune with the system's intentional states because the human is generating them. However, as autonomous control improves, it will be possible for a
human to monitor perhaps two or
three UAVs simultaneously (indeed
this is a current goal of the U.S. military). In this case, the pilot will intervene only in the case of system failure or a need to redirect the weapons in response to changing battlefield conditions. Because the pilot is further removed from the control scheme, it is more difficult for the pilot to determine quickly how to intervene when an unanticipated event occurs. This difficulty in understanding intentional states will only grow with swarming autonomous vehicles that make decisions intra-swarm and include the human only for extremely high-level decisions, such as final approval for weapons release.
Highly autonomous weapons provide the military with an undeniable tactical advantage. However, developing technologies of this sort also have the potential to create moral buffers that allow humans to act without adequately considering the consequences. I argue that when highly autonomous weapons are directed through a human-computer interface that provides a virtual, user-friendly world, moral buffers are more easily created as a consequence of both psychological distancing and compartmentalization. Indeed, this sense of both physical and emotional distancing can be exacerbated by any automated system that interposes a division between a user and his or her actions, which could become more
[Figure: Resistance to killing versus physical distance from the target, decreasing from high at sexual range and hand-to-hand combat range, through knife, bayonet, close range (pistol/rifle), hand-grenade, and long range (sniper, anti-armor, missiles), to low at maximum range (bomber, artillery).]
Designer Awareness
Engineers should be aware of the potential to create a moral buffer when designing weapons that require a very quick human decision, and should be very careful when adding elements that make a computer interface seem more like a form of entertainment than an interface whose use can cost lives. In addition, if computers are seen as moral agents (i.e., "I was only following the recommendations of the automation"), the temptation may exist to succumb to automation bias and use highly autonomous weapons systems in a speedy and reckless manner, rather than with the forethought that older, less user-friendly systems required.
In his evaluation of the ethics of
computer systems design, Brey [1]
Author Information
The author is with the Massachusetts Institute of Technology, 77 Massachusetts Avenue, Room 33-305, Cambridge, MA 02139; email: MissyC@mit.edu. An earlier version
(Continued on p. 41.)
Creating Moral Buffers in Weapon Control Interface Design (Continued from page 33)
References