
euRobotics

The European Robotics Coordination Action


Grant Agreement Number: 248552
01.01.2010 – 31.12.2012
Instrument: Coordination and Support Action (CSA)



D3.2.1 Ethical Legal and Societal
issues in robotics

Authors
Christophe Leroux (CEA LIST)
Roberto Labruto (ALENIA AERMACCHI)




Lead contractor for this deliverable: CEA LIST
Due date of deliverable: December 31, 2012
Actual submission date: December 31, 2012
Dissemination level: Public
Revision: 1.0

Executive summary
This document contributes some elements to understand and address the Ethical, Legal and Societal (ELS) issues in robotics.
This document, "Ethical, Legal and Societal issues in robotics", together with the annex document "Suggestion for a green paper on legal issues in robotics", constitutes deliverable D3.2.1 of the euRobotics project.
The report represents the result of the effort undertaken in the project euRobotics [14] on ELS issues
hindering the development of robotics in Europe. It includes sets of suggestions and a roadmap on
ethical, legal and societal issues in robotics. The document does not explore all ELS issues. It is
meant to stimulate and organize a debate on this topic.
This document can also be taken as a guide book for robotics people to learn the basics of ethical, legal and societal issues in robotics, as well as for philosophers, lawyers, and people interested in societal matters as a reference to matters that concern robotics and its development in Europe.
One major contribution of this document is that it brings together the results of work already done in the domain in the past. Some specific additional effort was undertaken on legal issues in robotics, which resulted in a suggestion for a green paper on legal issues in robotics, available in a separate document.
This deliverable focuses on analysing the issues specific to robotics. It analyses Ethical, Legal and Societal issues separately. We propose roadmaps and actions addressing specific issues and targeting specific communities: jurists, politicians, experts in social sciences, and robotics stakeholders. We also chose a top-down approach, starting from the concepts of Ethical, Legal and Societal issues to investigate the problems instead of studying issues one case study after the other.
In its conclusions, the report recommends stating clearly which values are referred to when analysing ethical issues. It recommends relying on the values presented in the Charter of Fundamental Rights of the European Union when examining ethical issues, because this charter is founded on indivisible, universal values.
For legal issues, we propose further investigations on IPR, labour law and non-contractual liability. After explaining the concept of electronic personhood, we also suggest further investigation to study how this concept could be implemented. Besides these domain-dependent suggestions, we also propose more general tracks, such as harmonizing European legislation and regulations, in order to facilitate the emergence of robotics in Europe.
We also support the idea of keeping the top-down approach when analysing ELS issues in order to address the widest spectrum of robotics applications.
In its conclusion, the report recommends analysing the possibility of resituating ethical, legal and societal issues in a larger technological context than robotics, in order to provide solutions with greater impact and to adopt and rely on already existing solutions. Making a link between robotics and other technological domains would help avoid considering robotics as a unique, distinctive and strange technology, and would make people trust robots and robotics.
The report finally recommends working on providing guidelines to help robotics developers and experts analyse ELS issues.

Content
1. Introduction ....................................................................................................................................... 5
1.1. What is the purpose of this document? ................................................................................... 5
1.2. Plan of the document ............................................................................................................... 5
2. Background and approach ............................................................................................................... 6
2.1. Some definitions ...................................................................................................................... 6
2.1.1. About Ethics ..................................................................................................................... 6
2.1.2. About societal issues ....................................................................................................... 6
2.1.3. About legal issues............................................................................................................ 7
2.2. Context and past work in the domain ...................................................................................... 7
2.3. Analytical approach to ELS issues in robotics ......................................................................... 8
2.3.1. Top down approach ......................................................................................................... 8
2.3.2. Disjoint analysis of Ethical, Legal and Societal issues .................................................... 8
2.3.3. Issues specific to robotics ................................................................................................ 9
2.4. Limits of this document ............................................................................................................ 9
3. Methodology ................................................................................................................................... 10
4. Ethical issues in robotics ................................................................................................................ 12
4.1. Focus ..................................................................................................................................... 12
4.2. Diversity ................................................................................................................................. 12
4.3. Which Values? ....................................................................................................................... 12
4.4. Ethical issues specific to robotics .......................................................................................... 13
4.4.1. Assistive robotics for elderly or disabled people ........................................................... 13
4.4.2. Security robotics ............................................................................................................ 14
4.4.3. Toy robotics ................................................................................................................... 14
4.4.4. Sexual robotics .............................................................................................................. 14
4.4.5. Human extension, exoskeleton ..................................................................................... 15
4.5. Conclusion on ethical issues ................................................................................................. 15
5. Societal issues ............................................................................................................................... 16
5.1. Background ............................................................................................................................ 16
5.2. Approach ............................................................................................................................... 16
5.2.1. Challenges identified during the workshops .................................................................. 16
5.2.2. Blind surveys ................................................................................................................. 18
5.3. Conclusions ........................................................................................................................... 24
6. Legal issues ................................................................................................................................... 25
7. Conclusions, priorities and suggestions for further proceedings ................................................... 26
8. Appendix A – Communication ........................................................................................................ 27
9. Appendix B – Bibliography .............................................................................................................. 28
10. Appendix D – Glossary .............................................................................................................. 30
11. Appendix E – Experts and specialists that took part in the green paper elaboration ................ 35

12. Appendix F – List of events and meetings organized on Legal issues in robotics .................... 37
13. Appendix G – Legal issues in robotics ...................................................................................... 38


1. Introduction
1.1. What is the purpose of this document?
Who has never faced fears of overpowering robots, questions about the negative impact of robotics on employment, or other worries about the consequences of introducing robots into society? Moreover, robotics experts, developers, industrialists and researchers can be puzzled when trying to promote their ideas, their products or their research. Providing answers to these questions is not obvious when we look back at the effort made on these topics. It needs deep investigation mixing the points of view of experts from the social sciences and robotics. Since robotics is an evolving and very lively science, this effort must be sustained over time.
This document includes sets of suggestions and a roadmap on ethical, legal and societal issues in
robotics. It describes the effort undertaken in the project euRobotics (euRobotics coordination action, 2012) on ethical, legal and societal issues hindering the development of robotics in Europe. The
document does not explore all ethical, legal and societal issues in robotics.
euRobotics [14] is a coordination action supported by the European Commission¹. The general
objective of this coordination action is to act and find ways to favour the development of European
robotics. One of the tasks to reach this objective is to identify obstacles hindering the development of robotics, with a specific focus on service robotics, and to propose actions facilitating the development of robotics activity in Europe in terms of research, development, innovation, market or usage. This document represents one part of the road-mapping effort conducted in euRobotics on Ethical, Legal and Societal (ELS) issues hindering the development of robotics in Europe. The study focused on issues specific to robotics. We nevertheless try to emphasize the connections between ELS issues in robotics and ELS issues in other engineering domains, in order to provide recommendations with greater impact. We limited our study to the European societal frame, although we also surveyed the effort made in the domain outside Europe.
This document can also be taken as a guide book for robotics people to get to know the basics of ELS issues in robotics, as well as for lawyers as a reference to matters that concern robotics and its development in Europe.
This document, "Ethical, Legal and Societal issues in robotics", together with the annex document "Suggestion for a green paper on legal issues in robotics", constitutes deliverable D3.2.1 of the euRobotics project.
1.2. Plan of the document
Chapter 1 (current chapter) presents the context
Chapter 2 frames the problem
Chapter 3 describes the methodology chosen to analyse ELS issues
Chapter 4 presents the effort made on Ethical issues
Chapter 5 analyses societal issues in robotics
Chapter 6 presents the investigation of legal issues.
Chapter 7 concludes the report.
The appendices contain a glossary, the list of people involved in the elaboration of the document, the
publications made, the bibliography, the meetings organized and a visual presentation of the roadmap.
In the following sections, we describe the methodology adopted to identify ELS issues in robotics and to propose solutions to overcome these issues.

¹ Grant agreement number 248552.

2. Background and approach
In this chapter, we start in section 2.1 by providing a few definitions related to the concepts of ethics, law and society. These definitions are meant to frame the problem and explain the analytical approach chosen in our study and presented in this report on ELS issues in robotics.
2.1. Some definitions
2.1.1. About Ethics
Ethics
The science of good and evil [33].
It is the role of ethics to determine what is good or bad [18], [30].
The goal of ethics is to indicate how human beings should behave, act and be towards other people and towards what surrounds them [7].
Areas of ethics
Philosophers divide ethics into domains whose bounds are not always perfectly clear [13], [6].
Meta-ethics: the study of concepts, judgements and moral reasoning.
Normative ethics: concerns the elaboration of norms prescribing what is right or wrong, what must be done or what must not.
Applied ethics: the application of the two domains above to specific problems (feminism, environment, biology, professional ethics etc.).
Descriptive ethics: sometimes added as a separate area; the study of people's beliefs about morality.
In other words one defines:
Descriptive ethics: What do people think is right?
Normative (prescriptive) ethics: How should people act?
Applied ethics: How do we take moral knowledge and put it into practice?
Meta-ethics: What does 'right' even mean?
Normative ethics
The central debate of normative ethics opposes:
Virtue ethics: in which moral evaluation focuses on the inherent character of a person rather than on specific actions.
Deontology: in which moral evaluation bears on actions according to imperative norms and duties.
Consequentialism: in which moral evaluation bears on actions and their contribution to improving the state of the world. Consequentialism includes utilitarianism, which holds that an action is right if it leads to the happiness of the greatest number of people.
Differences between ethics and morals
Some authors make a distinction between ethics and morals, with varying success [13]. "Ethics" comes from the Greek and "morals" from the Latin; both have the same meaning. We will make no difference between the two terms in this document.
2.1.2. About societal issues
Societal
Pertaining to society
of or relating to society (Merriam-Webster dictionary)
Social
Of or relating to human society and its modes of organization (Merriam-Webster, The Free
dictionary)

Societal responsibility
The concept indicating the responsibility of an entity (economic agent, group, community) with respect to the social, health and environmental consequences of its activity, and specifically towards its stakeholders [8].
Societal responsibility relies on two principles:
1. Willingness to assume the impact of one's activities and decisions on the environment and society
2. Reporting using credible and transparent indicators
ISO 26000 [8] provides guidance on how businesses and organizations can operate in a
responsible way for society.
2.1.3. About legal issues
Legal
According to law, not in violation of law or anything (Dictionary of law)
Most of the definitions regarding law issues are provided in the proposal for a green paper document.
2.2. Context and past work in the domain
Investigating Ethical, Legal and Societal issues in robotics necessitates interaction between various domains and disciplines which are traditionally disconnected in teaching, expertise or practice, such as engineering, philosophy, social sciences, law, and possibly history and religion. This makes the dialogue between experts in different domains complex. We experienced this several times during this action. One typical example is the difference of interpretation between scientists and lawyers about the concept of autonomy. For the former, an autonomous robot is a machine that makes decisions according to sensor information in a more or less deterministic way. For lawyers, and probably for most of the public, an autonomous robot makes its own decisions according to its (or his) consciousness; the robot is almost a post-human entity.
Since the birth of the concepts of robots and robotics, it has been clear that a machine capable of undertaking actions in close contact and with direct involvement of humans must be constrained by a precise, previously established set of ethical, legal and societal rules. One of the most famous attempts to define the rules that a robot has to obey is Asimov's formulation of the three laws of robotics [12], which state (in decreasing priority in case of mutual conflict): that a robot is not allowed to do anything that would harm a human being; that a robot should always obey a human; and that a robot should protect itself. Unfortunately, even if these rules can be a good starting point (which is debatable, as Asimov's own robot stories were an exploration of the potential unintended consequences of such basic rules), they are too abstract, too inaccurate and too reliant on sensitive concepts to be used in practice or to be implemented in actual devices. For this reason, in recent years, as the technology has become more and more developed, a debate has arisen concerning how the guidelines relating to ELS issues should be practically implemented.
In 2004, during the First International Symposium on Roboethics, Gianmarco Veruggio [1] coined a
new word, Roboethics, meaning a human-centred ethics applied to robotics, able to guide the
design, construction and use of the robots and their interactions with humans. This subject covers
many disciplines such as robotics, computer science, artificial intelligence, philosophy, ethics, biology,
physiology, cognitive science, neurosciences, law, sociology, psychology and industrial design. Some
of the topics treated by Roboethics are: the societal and cultural variations in robotics acceptance,
privacy, the impact of revolutionary technological changes on employment, the ethical implications of
machines that kill or assist in killing, sexual activity and robotics (i.e. robot partners), the professional
and social responsibility, personal and corporate accountability and liability for harm, the human-
machine integration in a shared environment, and the application of precautionary principles.
An important European project related to ELS issues is the FP6 project Ethicbots (which ran from 2005 to 2007) [17], with the objectives of identifying techno-ethical case studies on the basis of a state-of-the-art survey in human-machine integration; identifying and analysing techno-ethical issues concerning these forms of human-machine integration by reference to case-study analysis; establishing a techno-ethically aware community of researchers by promoting workshops, dissemination and training activities; and generating inputs to the EU for techno-ethical monitoring, i.e. providing recommendations for amendments to EU ethical regulations and stimulating the European Group on Ethics in Science and New Technologies (EGE) and other ethical councils.
One of the results obtained within the Ethicbots [17] project is the Triaging Methodological Approach. Taking into account that there is a huge number of technologies, projects and systems in the field of robotics that need an ethical analysis and assessment, it is necessary to prioritize them, discriminating between those that comprise techno-ethical problems that raise concern now and those whose techno-ethical problems may be significant only in a distant future. To do that, the triage methodology is based on the selection criteria of imminence (useful for identifying technologies and projects in the field of robotics that most urgently call for an ethical assessment), novelty (useful for discriminating among technologies and projects that are not only in need of an ethical assessment but also have the potential to cause changes in the life of individuals and social groups) and potential social pervasiveness (useful for discriminating among technologies and projects that have the potential to cause changes in the life of larger communities or even human society as a whole).
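As an illustration only, the sketch below shows how such a triage could be operationalised as a simple scoring and ranking exercise; the 1-5 scales, the equal weighting and the example projects are assumptions made here, not part of the Ethicbots methodology.

    from dataclasses import dataclass

    @dataclass
    class Project:
        name: str
        imminence: int       # 1 (distant future) .. 5 (already deployed)
        novelty: int         # 1 (incremental) .. 5 (changes individual lives)
        pervasiveness: int   # 1 (niche) .. 5 (society-wide impact)

        def priority(self):
            # Equal weighting is an assumption; any monotone combination would do.
            return self.imminence + self.novelty + self.pervasiveness

    projects = [
        Project("Assistive robot for elderly care", 4, 4, 4),
        Project("Autonomous surveillance drone", 3, 3, 5),
        Project("Household companion robot", 1, 5, 3),
    ]

    # Assess the highest-priority cases first.
    for p in sorted(projects, key=lambda p: p.priority(), reverse=True):
        print(f"{p.priority():2d}  {p.name}")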
In parallel to the above initiatives on ELS issues, several projects like LIREC [9], SERA [11], COMPANIONABLE [4], CARE-O-BOT [23], KOMPA from ROBOSOFT [10], SAM and AVISO [28] focus a large part of their effort on the social acceptance of robots, bringing together different worlds such as ethology, social science, design and computer science.
2.3. Analytical approach to ELS issues in robotics
The current report is a follow-up to the previous studies undertaken on ethical issues in robotics. However, it proposes a different approach, adopting what we could call an analytical approach to ELS issues in robotics instead of the empirical, case-study-based approach of previous work. This analytical approach consists:
in adopting a systematic top-down methodology,
in making a disjoint analysis of Ethical issues, of Legal issues and of Societal issues in robotics,
in restricting the analysis to issues specific to robotics.
The consequence is that the report proposes roadmaps and actions addressing specific issues and targeting specific communities: jurists, politicians, experts in social sciences, and robotics stakeholders. Some specific efforts were undertaken on legal issues in robotics, which resulted in a suggestion for a green paper on legal issues in robotics, extracted from the deliverable.
2.3.1. Top down approach
The top-down approach proposed in this document consists in analysing ELS issues starting from the concepts of ethics, law and society, rather than starting from robotics case studies, to understand ELS issues in robotics. It consists in considering Ethics, laws and regulations, and current society as defining constraints to be respected when researching, developing, producing and using robots.
The advantage of the top-down approach is that it identifies issues with a larger impact than those highlighted when starting from specific case studies. For example, privacy issues are common to all robotics applications dealing with the management of personal data (including the usage of cameras). Therefore we need a single framework for dealing with privacy issues in robotics. This has much more impact than a bottom-up analysis considering assistive robotics for people suffering from Alzheimer's disease on the one hand and the protection of personal data for drones in police applications on the other.
A top-down analysis also allows easier linkage between different technological domains. In the above example on personal data protection, it is easy to see that data protection issues in robotics are no different from data protection issues in other technological domains. Therefore, if data protection regulation or legislation needs to evolve, there is no need to emphasize or even mention robotics.
2.3.2. Disjoint analysis of Ethical, Legal and Societal issues
A disjoint analysis of ethical, legal and societal (or social) issues in robotics is interesting and necessary since the issues refer to different domains of expertise. Answers to these issues imply different analyses, with different methods and different groups of experts.

Ethical issues will find answers in principles, duties (e.g. the Hippocratic Oath), and possibly suggestions for regulation or for new laws (as in bioethics).
Legal issues will involve experts in different domains of law, politicians and parliaments, and will result in directives, regulations and laws.
Societal issues necessitate the participation of all stakeholders in a domain; solutions may be found in actions from the stakeholders. For example, the claim that robotics is a job killer is contradicted by the recent Martech study [22] for the IFR².
Examining Ethical, Legal and Societal issues separately focuses the analysis. It leads to a clear comprehension and clarification of the questions raised, and drives the organization of recommendations targeting the domains of expertise concerned.
2.3.3. Issues specific to robotics
Analysing Ethical, Legal and Societal issues specific to robotics aims at understanding whether the issues raised are problems occurring because there is a robotic agent, or whether these issues are more general and could be raised for any other technical solution. Following this methodology does not mean we did not pay attention to general ELS issues. It means that we always try to determine whether ELS issues disappear or not when we replace the word "robot" by "device", "robotics" by "technology", and "robotics expert" by "technology expert". In a way, we project the domain of robotics onto a wider domain that includes robotics.
An advantage of this approach is that it may lead to general, all-purpose solutions if it can be shown that the issue raised concerns other technologies besides robotics. A first consequence is that it helps produce solutions with a large impact, since they cover different technologies. Another consequence is that it leads us to think about robotics as an ordinary technology, not much different from other technologies in terms of ELS impact or issues. For example: is the usage of a robot feeder an ethical issue? Some could say it is an affront to human dignity. However, if we replace "robot" by "mechanical device" or "animal", the problem stays the same.
An additional advantage is that it allows a better comprehension of the issues. In the above example of mechanical feeding, we may be facing a societal issue rather than an ethical one. In fact, in recent investigations [29], people expressed their satisfaction with using the mechanical feeder My Spoon precisely because it preserved their dignity by avoiding exposing their dependency. Hence the issue raised by the usage of a mechanical feeder is much more a societal issue than an ethical one: how can our society deal with the growing number of dependent elderly people? Political decisions are expected.
2.4. Limits of this document
This report deals with short- or mid-term visions of robotics. We excluded case studies related to futuristic visions of robotics, such as post-humans. We also excluded military robotics from our study.
In this document we focus on applied ethics. We concentrate on ethical issues in robotics (technology)
and not on the way ethical issues should be taken into account (normative ethics) nor on the study of
the concepts (meta-ethics).
In English the distinction between social and societal is not always made. In this document we will use
the term societal. Societal issues will refer to the impact on society (e.g. employment, environment,
sustainable development) of robotics in all the aspects of its activity (research, industry, usage).

² International Federation of Robotics.

3. Methodology
The methodology followed in the production of this report started with an analysis of the state of the art, followed by the creation of a task force, and concluded with the production of the report constituting the current deliverable.
State-of-the-art analysis
A survey of the European initiatives (e.g. Roboethics, Ethicbots [17], the EURON Roboethics roadmap [37], ELS issues in the SRA [2]), documents and methodologies (e.g. the triaging methodology) developed so far regarding the connection between robotics and ELS issues.
Creation of a task force of ELS issues experts
The number of people in this task force was limited in order to be efficient and reactive in the production and control of the document. Since ELS issues involve several disciplines, this task force included sociologists, lawyers, physicians, rehabilitation technologists, insurance experts and experts from robotics. First contacts with potential experts were made and a first meeting was held.
The roles of this task force of external experts were:
To read the documents produced, suggest modifications and provide comments
To provide advice on the methodology
To provide advice on the actions to take and the choices to make
To evaluate and rank potential case studies
To contribute to the debates organised together with the euRobotics members during meetings
such as the Annual European Robotics Forum meetings
To select case studies and assist in the identification of critical issues
To debate about the case studies within the task force and with the community
To assist in the production of the final report with the proposed roadmap of corrective actions
The list of experts involved is illustrated in the figure below.

Figure 1: Expert network
Production of the final report
The final report does not claim to be an exhaustive presentation of ELS issues in robotics; that goal is too vast and unreachable within the three-year duration of the project. The objective of this report is rather to select some representative, current and particularly limiting matters and to propose some concrete and reasonably accessible solutions and a roadmap. The complete list is presented in the following section.

Selection of case studies
Since the duration of the project was limited, the ELS studies initially focused on some case studies chosen with experts. The analysis of the case studies aimed at circumscribing ELS issues at an early stage of the project, in order to orient discussions and to select the task force as well as possible. The case studies selected at the beginning of the project were the following:
Domestic service robotics
Personal assistive robots
Professional service and security robotics
Robot for co-working
The use cases above were meant to help focus the effort of elaborating an action plan to overcome ELS issues, with the idea of generalizing the analysis towards more paradigmatic solutions and of proposing a roadmap that could be generalized to a wider set of use cases.
This approach was abandoned for several reasons:
Experts could not agree on the selection of representative case studies.
The case-study approach appeared to be counterproductive for defining a general roadmap useful for robotics as a whole.
Whatever the case study, it appeared that the concepts and issues raised could only represent some narrow and arbitrary matters, leaving some questions unanswered and unsatisfying for a majority of robotics experts (autonomous transport, security robotics, etc.).

Figure 2: methodology

4. Ethical issues in robotics
4.1. Focus
This deliverable deals with applied or professional ethics, not with normative ethics. That is, we focused our interest on right or wrong and just or unjust practices regarding robotics, and not on discussing the norms to follow to respect good practice in robotics. We are interested in discussing which ethical issues exist in robotics and which ethical issues are specific to robotics.
4.2. Diversity
Every moral theory (virtue ethics, deontology or consequentialism) promotes values [13]. These values have changed over time and can be diverse according to culture. Ethics refers to duties we have regarding ourselves or others which are not dictated by laws [34].
Aristotle considers ethics from a very broad perspective, originating from the observation that every man desires a happy life. Moral virtues are states of character lying at the mean between the extremes of excess and deficiency.
For Epicurus, our goal should be to reach a good life and happiness. The right mean consists in avoiding violence and staying with people who hold the same ideas as ours. As demanded by our health and our good relationships with our relatives, we should act with moderation when in search of pleasure, and nothing that harms nobody should be considered forbidden.
Stoicism compared philosophy to the human body:
Logic is the skeleton; it concerns thinking well
Physics is the flesh; it deals with ordering well
Ethics is the soul; it deals with living well
For Stoicism, ethics, goodness and justice consist in being in one's own place, adjusted to the order of the cosmos. A just life is a life in harmony with nature in the sense of the cosmic order [26, 20].
For the ancient Greeks, ethics targeted a particular, singular excellence (self-concern); for Christians, behaviours and lifestyles are supposed to be followed by everybody.
Sartre [35] stated in the 1950s that nothing in traditional morality can indicate what we have to do; there is no sign in the world; moral principles are always too abstract to indicate the way to take; everybody is forced to invent their own morality, their own law. Contemporary ethical thinking shows how the questions of good and justice tend to be formulated in a new way in a period when we cannot refer to unchanging and transcendental moral values [34]. According to Clotilde Leguil, the goal of ethics is to define what is good, starting from a reflection on the effects of our acts. Establishing references to define what is good and what is just is subject to many philosophical arguments which will not be debated in this document. People interested in a more profound grounding can refer, for example, to authors like Emmanuel Levinas (ethics of religious transcendence), Hans Jonas [27] (ethics of technological civilization, forbidding actions that would put future generations in danger [34]) or John Rawls [31] (theory of justice, according to whom an action can be considered good if it increases the greatest happiness for the greatest number of people [34]).
4.3. Which Values?
Thinking about what is good or just leads one to think about the values used as references for ethical analysis. As we have seen above, these values differ a lot according to the cultural context we are in. In our analysis, we propose to rely on the values presented in the Charter of Fundamental Rights of the European Union, which result from the constitutional traditions and international obligations common to the Member States, the Treaty on European Union, the Community Treaties, the European Convention for the Protection of Human Rights and Fundamental Freedoms, the Social Charters adopted by the Community and by the Council of Europe, and the case-law of the Court of Justice of the European Communities and of the European Court of Human Rights.
The Charter constitutes a set of universal values of human dignity, freedom, equality and solidarity; it is based on the principles of democracy and the rule of law, which are the result of the European Union's spiritual and moral heritage. It has no binding force.

The Charter contains seven titles. The first six titles deal with substantive rights: dignity, freedoms,
equality, solidarity, citizens' rights and justice. The last title deals with the interpretation and application
of the Charter.
The first title, dignity, guarantees the right to life and prohibits torture, slavery, the death
penalty, eugenic practices and production of human clones.
The second title covers liberty, personal integrity, privacy, protection of personal data,
marriage, thought, expression, assembly, education, work, property and asylum.
The third title covers equality before the law, prohibition of all discrimination including on
basis of disability, age and sexual orientation, cultural, religious and linguistic diversity, the
rights of children and the elderly.
The fourth title covers social and workers' rights including the right to fair working
conditions, protection against unjustified dismissal, and access to health care, social and
housing assistance.
The fifth title covers the rights of EU citizens such as the right to vote in elections to the
European Parliament and to move freely within the EU. It also includes several administrative
rights such as a right to good administration, to access documents and to petition the
European Parliament.
The sixth title covers justice issues such as the right to an effective remedy, a fair trial, to the
presumption of innocence, the principle of legality, non-retrospectivity and double jeopardy.
4.4. Ethical issues specific to robotics
Ethical issues in robotics have been extensively analysed in the past; many elements on the subject can be found in the literature [36, 17, 37] and we will not recall them here. In this report, our interest is in case studies representing ethical issues specific to robotics. It is actually difficult to find examples of ethical issues really specific to robotics. Most of the case studies described also concern sectors other than robotics.
The paragraphs below make a brief and incomplete tour of some common applications in robotics presenting some typical ethical issues.
4.4.1. Assistive robotics for elderly or disabled people
Assistive robots are meant to compensate for deficiencies of elderly or handicapped people (projects COMPANIONABLE [4], FLORENCE [5], CARE-O-BOT [23], KOMPA from ROBOSOFT [10], SAM and AVISO [28]).
In terms of privacy
Assistive robots carry different types of sensors to operate in the environment, avoid obstacles, navigate and perform actions. These sensors include video or time-of-flight cameras. They can also be sensors allowing evaluation of a person's activity based on personal health data. Assistive robots also have the capacity to store information acquired from sensors and to communicate through wireless connections to servers. Combining these capacities leads one to wonder about the privacy of people entering the field of action of the robot's sensors. Privacy issues in assistive robotics are further emphasized since these assistive robots are most of the time mobile and moving in the environment.
These capacities make assistive robots potentially very intrusive into the private life of patients, of caregivers, and of people entering the field of perception of the machine. We frequently hear fears of these capacities expressed, perceived as threats to people. These fears and threats are amplified by the image conveyed by popular movies, series or novels and by fancies about robots.
Resituating these privacy issues, it is clear that they are not specific to robotics. These issues are raised in the same way when using cameras for fall detection of elderly people or for activity monitoring in institutions. The issues occur because of the usage of sensors that can be used to record information about people.
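One possible mitigation, sketched below purely for illustration, is data minimisation: process camera frames on board and keep only abstract, non-identifying events rather than raw images. The event names and the fall-detection stub are hypothetical and are not taken from any project cited in this report.

    from datetime import datetime, timezone

    def detect_fall(frame):
        # Placeholder for an on-board perception routine; details are out of scope here.
        return frame.get("person_on_floor", False)

    def process_frame(frame, event_log):
        # Keep only a derived, non-identifying event; the raw image is never stored or sent.
        if detect_fall(frame):
            event_log.append({
                "event": "possible_fall",
                "time": datetime.now(timezone.utc).isoformat(),
            })

    log = []
    process_frame({"person_on_floor": True}, log)
    print(log)   # e.g. [{'event': 'possible_fall', 'time': '...'}]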
In terms of dignity
It is sometimes argued that the usage of assistive robotics may endanger human dignity since the machines could replace human assistance. Here again, replacing "assistive robot" by "technical assistance" demonstrates that robots are not the agent raising this ethical matter. It was also shown in recent studies of the LIREC project [29] that, for a large majority of people, the My Spoon robot on the contrary contributes to giving them back their dignity, providing opportunities to be independent and to preserve privacy when eating. The same observations were also made after the experiments conducted in France in rehabilitation centres in the AVISO project [28, 32] with a large set of severely handicapped people. Patients consistently see the robotic assistant SAM as a means to gain some privacy and independence.
4.4.2. Security robotics
This kind of robot has to operate in scenarios involving different kinds of agents, including human beings, sensors and robotic platforms. In particular, they assist human workers in their activities as (or nearly as) a real co-worker, with the added advantages that they do not get tired, do not know carelessness or fear, and can be used (instead of human beings) for activities and tasks that are hazardous, boring and exhausting. Security robots can assist in the protection of a home, a site (e.g. an industrial site), an infrastructure or a country's borders. They can operate alone or in groups (as independent entities or as a swarm); they have to accomplish many heterogeneous tasks such as, first of all, gathering information in order to build situation awareness, and then making decisions in terms of coherent and adequate actions.
In terms of privacy
Security robots, during their missions, of course have to face privacy issues strictly related to the large amount of data collected, in which recognizable people may be involved. However, these issues are comparable to those affecting existing security systems with data storage capabilities.
Recently, bodies of the European Parliament have discussed the issue of automatic security systems and have underlined the necessity of further action and regulation on privacy-related issues in relation to machine vision systems as the technology develops (e.g. automated algorithmic surveillance such as facial recognition or intelligent scene monitoring).

4.4.3. Toy robotics
Toy robots have already been on the market for several years. They can contribute to the promotion and development of robotics, helping to convey a good image of robotics and making the public know more about robots.
In terms of privacy
Toy robots raise the same matters as assistive robots when they possess sensors recording children. As shown above, this is not specific to robotics, since this issue arises with all sensors, even if not installed on a robot.
In terms of equality
Toy robots raise the question of equity: most of the machines are expensive and purchasing them is not possible for all children. There is a risk of segregation in the population between children whose parents are able to buy robots and children whose parents cannot afford these costly machines. Access to knowledge about and control of robots at an early age will differ according to the purchasing power of parents. This will certainly have consequences for education and professional orientation. Toy robots could create barriers in the population in the same way that computers and the Internet generated the digital divide. However, while it is important to ensure that robotics spreads to the widest part of the population, this equality issue raised by toy robots is not specific to robotics. There have always been expensive toys that a large part of the population could not afford to buy for their children: toy cars, computers, electronic games.
4.4.4. Sexual robotics
Creators of sex robots apply technological advances in artificial intelligence to produce and market
sexual partners for consumers [24].
In terms of dignity
Sinziana Gutiu, at the 2012 We Robot conference, explored the dignity issue linked to the usage of sex robots in the context of relationships between men and women. The paper argues that sex robots will foster antisocial behaviour in users and promote the idea that women are ever-consenting beings, leading to diminished consent in male-female sexual interaction. She observes that the negative gender stereotypes that women face are reproduced in the mere existence of female robots, comparing the comments made on a male robot like Geminoid and a female robot like Repliee Q2. Sinziana Gutiu points out that the harm caused by sex robots goes beyond pornography: sex robot interactions are complete physical and emotional encounter experiences, and whereas the harm in pornography is based on the type of content in the material, sex robot harm is triggered by its very use. Sinziana Gutiu does not highlight differences between robots and artificial or technical devices which would take the appearance of a human without being a robot. Her observations could, however, well be applied to a woman-like artefact. Thus we can also observe that in this case the ethical issue addresses many more domains than robotics.
4.4.5. Human extension, exoskeleton
Ethical issues with exoskeletons or robotic-based human extensions appear in the following respects:
In terms of equality
Issues arise when people want to use the extra capacities provided by the machine in a competition. Is it ethical to let people with exoskeletons compete with able-bodied people? The recent example of Oscar Pistorius, using passive devices while running at the Olympics with able-bodied athletes, leads us to think that this issue is not specific to robotics.
In terms of dignity
People suffering from Body Integrity Identity Disorder (BIID) [25] (that is, people wishing to replace a healthy part of their body because they are convinced it is a source of illness, disability or suffering (self-amputees)) are convinced that they would not feel pain if the limb were replaced by an electronic device that could possibly extend their capacities. In the same vein, trans-humanism (abbreviated H+ or h+) is an international intellectual and cultural movement that affirms the possibility and desirability of fundamentally transforming the human condition by developing and making widely available technologies to eliminate aging and to greatly enhance human intellectual, physical, and psychological capacities [16]. Abolitionists propose "paradise engineering", i.e. the use of technologies like psycho-pharmaceuticals and genetic engineering to eliminate even the possibility of painful sensations and emotions. Extropianism (advocating a proactive approach to human evolution), immortalism (a moral ideology based upon the belief that technological immortality is possible and desirable), postgenderism (a social philosophy which seeks the voluntary elimination of gender in the human species) and singularitarianism (a moral ideology based upon the belief that a technological singularity is possible) are other examples of trans-humanism involving the use of technology for human enhancement.
These practices or pathologies are not far away from current practices in body modification like surgical augmentation, aesthetic surgery, tattooing and piercing. Hence these issues are not specific to robotics simply because robotic devices are used here or there.
4.5. Conclusion on ethical issues
The difficulty of finding examples of ethical issues specific to robotics does not mean there are no ethical issues in robotics. It simply means that, after all, robotics is not so different from any other science. Many issues seen as specific to robotics can, and in some cases should, be analysed in the more general perspective of the relationship between humans and science.
Focusing on ethical issues as soon as we envisage using a robot for an application contributes to conveying a wrong image of robots. It conveys the idea that there are specific ethical issues in robotics that would not arise if we were to use another kind of device.
A consequence is that when studying an ethical issue in robotics, one recommendation, in order to avoid the stigmatization of robotics, is to analyse whether the issue also exists with some other technology. In that case, it is advantageous to investigate the solutions chosen to answer these issues from a perspective addressing technologies in general.
The analysis in this report leads us to recommend defining clearly which values are referred to when analysing ethical issues. The report recommends relying on the values presented in the Charter of Fundamental Rights of the European Union when examining ethical issues, because this charter is founded on indivisible, universal values.
We also recommend, where possible, resituating the ethical issues in a larger technological context, providing as a result solutions with more impact on the issues raised.
We also recommend providing guidelines to help robotics developers and experts analyse ethical issues.

5. Societal issues
5.1. Background
Societal implications and constraints are frequently underestimated by the scientific and industrial robotics communities. This is probably largely due to people involved in robotics activities lacking knowledge of how to address these issues. On the other hand, among the public many people are uncertain about robots and do not see the need for, or the real benefit of, having robots beside us at home, in the streets, or at work. To a large extent, the public often has a wrong image of robotics, mostly coming from science fiction books or movies (I, Robot, Blade Runner, Star Wars, Forbidden Planet, etc.). These fictions and novels convey the image that robots are capable of many more things than they can actually do. This can be a handicap when the public faces real robots, which do not act as expected. Some people think robots are capable of overpowering mankind (like HAL in 2001: A Space Odyssey). The robotics community needs to make efforts to better communicate to the public the reality of robots and what they can be useful for. If this effort is not made, it can even, in the worst case, lead to a rejection of the technology and to a movement that seriously hinders developments in robotics. It must be avoided that robotics makes the same mistakes, and hence hits the same barriers, as genetically modified food did.
5.2. Approach
To circumscribe societal issues, two actions were conducted:
A dialogue about societal issues with the robotics community. The elements were essentially collected during the workshops of San Sebastian and Västerås organized during the European Robotics Forums (section 5.2.1)
A survey of public opinion on some typical robotics matters (section 5.2.2)
5.2.1. Challenges identified during the workshops
The workshops organized by the project made it possible to identify several societal issues in robotics and the challenges to be addressed in order to overcome them.
Challenge 1: Awareness of the impact of ELS issues on research in robotics
The observation
There is a general lack of interest among students, particularly at PhD level, in ELS issues. Students do not appreciate the impact of ELS issues on robotics or on research and development.
This was mentioned by several participants of the ELS issues workshop that took place during the EURON / EUROP Annual Meeting 2010 [3].
The reason
This seems to come mainly from a lack of information on the potential impact of ELS issues.
A solution
Provide an outline of ELS issues in robotics. Describe some actual ELS constraints on robotics research and applications, the key actors, etc.
How? It could be undertaken through special courses during Master or PhD studies, and the information should also be easily accessible (website). The way this information should be provided was not discussed because of the limited time of the workshop. Proposals will be made at other symposia and this question may be sent to a mailing list.
Challenge 2: Provide trust for domestic service robotics
The observation
Lack of trust is the main reason why people would not want, would not need or would reject a robot in
a domestic application.
The reasons

Lack of trust can arise for various reasons, which were discussed within the working group. These
reasons each represent several technical challenges:
Privacy: how can people ensure that a robot preserves necessary privacy?
Safety: Physical interaction between humans and robots represents a major challenge if the robot
is also capable of undertaking significant physical tasks. We also need better methods for
eliminating safety related failures and for ensuring no erratic behaviour
Robustness: how can we make sure that the robot behaves in an appropriate way whatever the
circumstances? For example simple changes in lighting conditions could currently cause
significantly divergent behaviour.
Security: certify the robot does not bump into you, does not damage the furniture, and does not
damage itself
Data protection: how can one make sure that important data will not be destroyed?
Usefulness
Some solutions
Solutions to overcome these challenges are vast and various. They depend largely on the technology
chosen and cannot be detailed in this paper.
Some psychological considerations might also affect the level of trust granted to a robot. Mike Walters said that, according to his organisation's experience [19], people are more confident with big robots than with smaller ones; they tend to consider small robots like children and bigger ones like adults. Humanoid robots are perceived as more expert than non-zoomorphic robots.
Challenge 3: Engender trust for cooperative robots
The observation
People do not trust working beside a powerful and very fast machine.
The reasons
The problem is different from that of domestic robots, since trust concerns fewer technical challenges.
Some solutions
Safety: reduce the number of failures, ensure no erratic behaviour
Robustness: how can we make sure that the robot behaves the same way whatever the circumstances (e.g. changes in lighting conditions)?
Security: certify the robot does not bump into you, does not damage nearby equipment, and does
not damage itself
Challenge 4: Make the certification of robotic devices and machines simpler and quicker
The observation
In Europe, certification must be repeated in every country, dissipating time and effort.
The reasons
Laws and regulations differ among the European countries, and this lack of homogeneity among certification organisations in Europe implies a multiplicity of rules that hinders the development of service robotics. Compared to the US, where legislation across the states is common and lighter, European legislation appears quite burdensome, obstructing innovation. Japan, the US and Korea all appear to have much lighter regulations than Europe. Legislation should be a tool helping innovation and should define clear rules to follow. It should anticipate or closely follow the emergence of new technologies and not just wait for the technologies to appear before declaring them illegal.
Some solutions
Increase the responsibility of the people programming the robots. Having qualified people give the approval could be a solution for some applications (co-working). It would not apply to all applications though (for instance there are moves to reduce the IT burden of automation in the food industry to increase take-up). A comparison is made with people operating cranes.
Foster the certification of equipment, including software. Use program proof or other methods to certify software behaviour in all circumstances (a minimal illustration follows below).
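A minimal sketch of this last idea is given below, under invented assumptions (the speed limit, the toy controller and the sampling range are not taken from any standard or from this report): state the safety property explicitly and check it over many inputs. Real certification would rely on formal proof or standardised test suites rather than this kind of sampling.

    MAX_SPEED_NEAR_HUMAN_M_S = 0.25   # assumed safety limit, for illustration only

    def commanded_speed(distance_to_human_m):
        # Toy controller: nominal speed 1 m/s, clamped whenever a human is within 1 m.
        nominal = 1.0
        if distance_to_human_m < 1.0:
            return min(nominal, MAX_SPEED_NEAR_HUMAN_M_S)
        return nominal

    def check_safety_property():
        # Property: whenever a human is within 1 m, speed never exceeds the limit.
        for mm in range(0, 1000):
            d = mm / 1000.0
            assert commanded_speed(d) <= MAX_SPEED_NEAR_HUMAN_M_S, f"violated at d={d}"
        print("safety property holds on all sampled inputs")

    check_safety_property()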


5.2.2. Blind surveys
During the ELS issues workshop that took place during the EURON / EUROP Annual Meeting 2010 [3], in order to open the discussion and ask for the audience's opinion, a questionnaire was prepared to let people freely express their views on this topic. The questionnaires, which covered challenges,
hindering gaps, and constraints of each product vision of the five sectors of the EUROP Strategic
Research Agenda, as well as some general issues, were distributed at the beginning of the workshop
and collected at the end.
The majority of the people who answered were working in a research, industrial or academic
organization; the professional fields represented were mainly service and security, aeronautics and
industry in general. Some of the questions of the first section of the questionnaire were:
Do your reference customers or users have mandatory demands regarding ELS impacts?
Do you consider respect for ELS issues a major enabler for your applications or products?
Do you consider respect for ELS issues an obstacle for your applications or products?
In your opinion, which specific technologies or components would be most impacted by ELS issues? Try to provide examples of constraints that limit the applicability of those technologies.
Most of the respondents deal with ELS issues during their working activities and think ELS issues are
enablers rather than obstacles to their own applications or products.
Highly cognitive technologies, especially when they interact directly with human beings, are the most
affected by ELS aspects.
All the service and security applications identified at that time (surgical robotics, assistive robotics
for the elderly and handicapped, aerial surveillance, professional cooperative robotics, disaster
management, inspection of people and goods) are considered to need an ELS assessment,
especially assistive robotics for the elderly and handicapped and aerial surveillance.
Many useful elements were gathered in the second section of the questionnaire, where the
participants were asked to identify the challenges, hindering gaps and constraints for all five
robotics segments (Industrial, Professional Service, Domestic Service, Security and Space). Particular
highlights emerging from this section were safety, social acceptance, trust, privacy, data integrity and
security protection, legal impact and responsibilities, and the actions that a robot is allowed to undertake
without human supervision.
In the last section of the questionnaire, some general questions were proposed:
Is ethics applied to robotics a problem for the individual scientist / engineer, the end user or concerned third parties?
Is it a social problem to be addressed at institutional level?
How far can we go in embodying ethics in a robot?
Which type of ethics is the correct one for robotics?
How contradictory are, on the one hand, the need to implement ethics in robots and, on the other hand, the development of robot autonomy?
Who is responsible for actions carried out by human-robot hybrid teams?
Should one enforce a human-in-the-control-loop requirement without exception?
Is it necessary to assess ELS implications also for non-ready technologies, i.e. those at low Technology Readiness Levels (TRL, the maturity level of a technology)?
How does the methodological approach to ELS issues change in relation to TRL?
Analysing the results obtained, it is possible to note that ethics applied to robotics needs to be
addressed up to the institutional level. Concerning the relation between autonomy and ethics,
participants see no contradiction, keeping in mind that safety and certification cannot be left out of
consideration and that responsibility for the behaviour of autonomous machines belongs to the machines'
designers / programmers. With regard to the Technology Readiness Levels and ELS, it emerges
that, in general, the ELS implications have greater immediacy and impact for the higher TRL values,
i.e. higher maturation levels.
During the euRobotics Forum 2011 that took place in Västerås, Sweden, another questionnaire was
distributed (an online version was also provided). The information collected, which was more practical,
can be summarized in the following way:
The majority of industry customers still disregards the ELS impact of products and technologies.
ELS issues are seen as an OBSTACLE to the introduction of robotic devices in the market.
It is necessary to assess ELS implications (like, for instance, the EMS issues) as soon as possible, also
for non-ready (i.e. low Technology Readiness Level, TRL) technologies, in order to take these issues
into account early and avoid problems that will be harder to solve when the product or technology is
nearly ready to be commercialized.
What is missing now is a public debate and agreement on how future users (i.e. the general
public) want autonomous robotic products to be designed, considering their impact on human-human
social co-existence and not only on the robot-human individual relation; this debate should start at a very
early stage.
If technology, independently of TRL, is used in real applications where ELS issues are relevant, ELS
issues should be taken into account with appropriate methodologies.
With regard to Asimov's famous rules and their possible relevance for robots working beside humans,
the participants generally think they can be seen as a starting point and an inspiration for drafting
future legal codes, but also that they are very difficult to apply and that exceptions could be dangerous;
they are ill-posed and incomplete, and it is easy to find paradoxical situations with no viable way out;
they would need to be adapted to current circumstances.
Currently, a relevant percentage of participants own a robot at home.
Almost all respondents are willing to accept a robot at home.
Functions that a robot at home can have (average rating):
General cleaning: 4.65
Tidying the place: 4.53
Doing the cooking: 2.81
Security & guarding: 3.53
Helping you to perform physical exercise: 2.53
Helping you to perform cognitive exercise: 2.06
Helping to look after the children: 2.60
Helping to look after a dependant (senile, Alzheimer, quadriplegic, handicapped) person: 3.65
Sex: 1.86
Animal-like pet: 1.93
Human-like companion: 2.00
Other suggestions include gardening, robots increasing and facilitating the interaction between
humans, and robots acting as 'social catalysers' that do not consume human attention (as ICT frequently
does today: mobile phones, Internet, etc.) but become socially 'transparent' media towards other people;
helping carry loads (goods in a supermarket), cleaning up, and doing dumb and dull work.
Another point is the price that people are willing to pay: in general, the participants would accept paying
up to thousands of euros if the robot is able to clean and/or cook / do the gardening automatically as a
person does, and even more if the robot can also patrol the home. Of course, adequate levels of safety,
trust and reliability are requested as well.
Unlike robots at home, only a strict majority of people would accept robots at their workplace, and only
to accomplish dangerous and tedious tasks, basic physical jobs, food delivery and tidying the place.
In general, working robots should mainly be used in the following scenarios: surveillance, search
and rescue, fire fighting, and intervention in hazardous environments.
Most people would like to see greater use of robots in society, but think that robots will be pervasive
in society only in 10 or 20+ years.
The main fears (and challenges) listed by the participants can be summarized in the following way:
Safety
Trust & Reliability
Security (fears of robot software viruses)
Loss of human responsibility, loss of diversity, loss of multi-dimensional thought
Risk of making people lazier
Concerning the appearance a robot should have, interestingly, people say it is not so important and/or
that robots should appear as machines rather than as humanoids or pets.
Most people would have no problem being helped and assisted by a robot should they become
physically or mentally impaired, so dignity issues do not seem to arise (on the contrary, a robot assistant
might even be accepted more easily).
Other general comments taken from the audience can be summarized in the following bullets:
A solid legal regulation, clearly defining the responsibilities, is absolutely necessary (e.g. behaviour and priorities in unforeseen situations, what happens regarding injuries caused by the robot to humans)
It is necessary to define the right design principles based on a common and responsible vision of what the robotic society should be for humans
The ethical issue regarding the education of young people in the use of robotics
Personal Robots & Robot Companions: "Insincere" emotional attachment; replacement of human social contact by robots
Personal Robots: Replacement of nursing aids and carers
Personal Robots: Necessity to present the robot as being complementary to human assistance
Robot Companions: Fake image of the reality of human contact, increasing isolation
Autonomous vehicles: Safety; certification of vehicles; insurance; black box to record data
Security: Can a robot wound / hurt a person (even a criminal)?
Security: Behaviour and priorities in unforeseen situations
Security: Data protection for surveillance data; alert or intervene decisions; dual use of technology in systems with offensive capability
5.3. Conclusions
Societal challenges are probably the most hindering obstacles to the development of robotics amongst
the ELS issues. Societal issues in robotics are scarcely addressed, and the opinions heard about
robotics in the public carry many wrong ideas (robotics cuts jobs) and false images (robots will
overpower humans). Efficient actions should be undertaken to counter the wrong image of robotics
held by the public. The Martech study for the IFR about job creation [22] is one emblematic example of
the actions to be undertaken. Regular communication to the public, for example through the European
Robotics Week organized by euRobotics, about what robotics is, what the capacities of a robot are and
the services it can bring, is another example. Regular contacts between sociologists and technicians
should also be encouraged, so that the social science community and the robotics community learn to
know each other and carry the same message. These efforts need to be made throughout all European
countries to take into account the diverse perceptions of robotics. The impact of robotics on society and
the issues raised should also be resituated in a general technological context in order to avoid
considering robotics as a weird technology, and rather to see it as an evolution of current technologies.
Regular surveys could also be a way to understand the vision, the expectations and the evolution of
public opinion about robotics.
6. Legal issues
Very little information is available on legal issues in service robotics for the moment. Besides
Asimov's laws, it seems that interest in this subject has only started recently. Currently the legal
impact of robotics is primarily a matter of certification. The safety of the KUKA Robocoaster, for example, is
guaranteed by certification from the German technical inspectorate TÜV in accordance with EN 13814
/ DIN 4112 (a standard for the safety of fairground and amusement park machinery and structures).
The lack of a legal framework for robotics can dramatically increase the time to market. There is a huge
field of investigation on legal issues in robotics.
The part of this report on legal issues is annexed to this document in the suggestion for a green
paper. Two publications were made on this subject during the project [15, 21]. Concluding elements
are recalled below.
7. Conclusions, priorities and suggestions for further
proceedings
In this document, we presented the results of the work undertaken in the project euRobotics (euRobotics
coordination action, 2012) on ethical, legal and societal issues hindering the development of robotics
in Europe. This document can be taken as a guidebook for the robotics community to learn the basics of
ethical, legal and societal issues in robotics, as well as for lawyers as a reference to matters that
concern robotics and its development in Europe.
The specific efforts on legal issues in robotics resulted in a suggestion for a green paper on legal
issues in robotics, extracted from this deliverable to constitute a separate document.
This document had a special focus on ELS issues specific to robotics.
In the conclusions, we recommend relying on values presented in the Fundamental Charter of
Human Rights when examining Ethical issues. We also recommend clearly specifying the ethical
issues addressed when analysing ethical issues on specific case studies.
We also support the idea of keeping the top down approach when analysing ELS issues in order to
address the widest spectrum of robotics applications; that is to start from the ELS concepts as a
framework to analyse issues.
We also recommend analysing the possibility of resituating ethical, legal and societal issues in a
larger technological context than robotics in order to provide, as a result, solutions with more impact,
or to rely on already existing solutions. Making a link between robotics and other technological domains
would help avoid considering robotics as a unique, distinctive and strange technology, and thereby
make people trust robots and robotics.
For future work, we recommend providing comprehensible guidelines and a general methodology to
help robotics developers and experts analyse ELS issues and keep the ELS issues in robotics alive.
For legal issues, we propose in the green paper suggestion some further investigations on IPR, labour
law and non-contractual liability, and on the concept of electronic personhood. Besides these specific
and domain-dependent suggestions, we also propose more general tracks such as harmonizing
European legislation and regulations in order to facilitate the emergence of robotics in Europe.
8. Appendix A Communication
Bischoff, R., Pegmann, G., Leroux, C., Labruto, R., et al. (2010). euRobotics - Shaping the future of
European robotics. ISR/ROBOTIK. San Francisco.
Boscarato, C., Caroleo, F., Labruto, R., Leroux, C., & Santosuosso, A. (2012). Robots, market and
civil liability: a European perspective. 21st IEEE International Symposium on Robot and
Human Interactive Communication. Paris.
Günther, J.-P., Muench, F., Beck, S. B., Loeffler, S., Leroux, C., & Labruto, R. (2012). Issues of
Privacy and Electronic Personhood in Robotics. 21st IEEE International Symposium on Robot
and Human Interactive Communication. Paris.
9. Appendix B - Bibliography
[1] First international symposium on roboethics. San Remo, jan 2004.
[2] Strategic research agenda for robotics (sra 2009). http://www.robotics-
platform.eu/cms/index.php?idcat=26, jul 2009.
[3] Europ/euron annual meeting, san sebastian. http://www.euron-europ-2010.eu/en/home, mar
2010.
[4] Companionable project. http://www.companionable.net/, dec 2012.
[5] The florence project. http://www.florence-project.eu/, dec 2012.
[6] http://en.wikipedia.org/wiki/ethics, dec 2012.
[7] http://fr.wikipedia.org/wiki/Éthique, dec 2012.
[8] Iso 26000 guidance standard, dec 2012.
[9] Lirec project. http://lirec.eu/project, dec 2012.
[10] Robosoft web site. http://www.robosoft.com/eng/, dec 2012.
[11] Sera project. http://project-sera.eu/, dec 2012.
[12] Isaac Asimov. I, Robot. 1950.
[13] Jean-Cassien Billier. Introduction à l'éthique. Presses Universitaires de France, 2010.
[14] Rainer Bischoff, Geoff Pegmann, Christophe Leroux, Roberto Labruto, et al. euRobotics -
shaping the future of European robotics. In ISR/ROBOTIK, San Francisco, 2010.
[15] Chiara Boscarato, Franco Caroleo, Roberto Labruto, Christophe Leroux, and Amedeo
Santosuosso. Robots, market and civil liability: a european perspective. In 21st IEEE International
Symposium on Robot and Human Interactive Communication, Paris, September 2012.
[16] Nick Bostrom. A history of transhumanist thought. In Journal of Evolution and Technology,
2005.
[17] Luca Botturi, Rafael Capurro, Edoardo Datteri, Francesco Donnarumma, Mark Gasson,
Satinder Gill, Alessandro Giordani, Cecilia Laschi, Federica Lucivero, Pericle Salvini, Matteo Santoro,
Guglielmo Tamburrini, Kevin Warwick, and Jutta Weber. D5: Techno-ethical case-studies in robotics,
bionics, and related ai agent technologies. Technical report, 2005.
[18] Thomas Cathcart and Daniel Klein. Plato and a Platypus Walk into a Bar - Understanding
Philosophy Through Jokes. Penguin Group, 2007.
[19] Kerstin Dautenhahn, Chrystopher L. Nehaniv, Michael L. Walters, Ben Robins, Hatice Kose-
Bagci, N. Assif Mirza, and Mike Blow. Kaspar - a minimally expressive humanoid robot for human-
robot interaction research. Applied Bionics and Biomechanics, 6(3-4):369-397, 2009.
[20] Luc Ferry and Lucien Jerphagnon. La Tentation du christianisme. 2009.
[21] Jan-Philipp Günther, Florian Muench, Susanne Beatrix Beck, Severin Loeffler, Christophe
Leroux, and Roberto Labruto. Issues of privacy and electronic personhood in robotics. In 21st IEEE
International Symposium on Robot and Human Interactive Communication, Paris, September 2012.
[22] Peter Gorle and Andrew Clive. Positive impact of industrial robots on employment. Technical
report, International Federation of Robotics, 2011.
[23] Birgit Graf, Matthias Hans, and Rolf D. Schraft. Care-O-bot II - development of a next generation
robotic home assistant. Auton. Robots, 16(2):193-205, March 2004.
[24] Sinziana Gutiu. Sex robots and roboticization of consent. In We Robot, apr 2012.
[25] Ian Hacking. L'amputisme : nouveau fétichisme ou nouveau mode de vie ?, 2004.
[26] Lucien Jerphagnon. Histoire de la pensée - D'Homère à Jeanne d'Arc. 2004.
[27] Hans Jonas. The imperative of responsibility: in search of an ethics for the technological age.
1985.
[28] Isabelle Laffont, Nicolas Biard, Gérard Chalubert, Laurent Delahoche, Bruno Marhic,
François C. Boyer, and Christophe Leroux. Evaluation of a graphic interface to control a robotic
grasping arm: A multicenter study. Archives of Physical Medicine and Rehabilitation, 90(10):1740-
1748, 2009.
[29] S Nylander, S Ljungblad, and J. Jimenez Villareal. A complementing approach for identifying
ethical issues in care robotics grounding ethics in practical use. In Ro-Man 2012, Paris, sep 2012.
[30] Ruwen Ogien. L'éthique aujourd'hui - Maximalistes et minimalistes. Folio essais. Gallimard,
2007.
[31] John Rawls. A Theory of Justice. 2005.
[32] A. Remazeilles, C. Leroux, and G. Chalubert. Sam: a robotic butler for handicapped people. In
ieee ro-man, The 17th International Symposium on Robot and Human Interactive Communication,
Munich, Germany, 1-3 August 2008.
[33] Le Robert, editor. Le Nouveau petit Robert.
[34] Jacqueline Russ and Clotilde Leguil. La pensée éthique contemporaine. 3rd edition, May
2008.
[35] Jean-Paul Sartre. Existentialism Is a Humanism. New Haven, 2007.
[36] N Sharkey. The ethical frontiers of robotics. Science, 2008.
[37] Gianmarco Veruggio. EURON II deliverable DR.1.3: EURON research roadmap v4.1. Technical
report, 2008.
10. Appendix D Glossary
In this section we provide some definitions as they are commonly used in the technical community.
None of them is carved in stone, and some of them can easily be the source of endless discussions.
Let's take them as a way to get a feel for concepts shared in the technical community. Some of the
definitions reported below are derived from documents available on the EUROP site, where you can
also find other definitions (http://www.robotics-platform.eu/cms/upload/SRA/Methodology/2009-07-07_Appendix_to_the_SRA_for_robotics_in_Europe_-_Glossary_-_V1.pdf).
Other sources of definitions are ISO 8373:2012 (Robots and robotic devices - Vocabulary) as well
as ISO/DIS 13482 (Robots and robotic devices - Safety requirements for non-industrial robots -
Non-medical personal care robot).
Actuator
An actuator is a device that can produce force or torque that can be used to move parts of a robot.
These can be: pneumatic actuators, electrical motors, etc.
Agent (legally speaking)
An entity able to act, whose actions are considered in the area of legal responsibility.
Area of ethics
Philosophers divide ethics into domains whose boundaries are not always perfectly clear [13], [6]:
Meta-ethics: the study of concepts, judgements and moral reasoning
Normative ethics: concerns the elaboration of norms prescribing what is right or wrong, what must be done and what must not
Applied ethics: the application of the two domains above to specific problems (feminism, environment, biology, professional ethics, etc.)
Descriptive ethics: sometimes added as a separate area, it is the study of people's beliefs about morality
In other words, one defines:
Descriptive ethics: What do people think is right?
Normative (prescriptive) ethics: How should people act?
Applied ethics: How do we take moral knowledge and put it into practice?
Meta-ethics: What does 'right' even mean?
Autonomous robot
An autonomous robot is able to perform a task in a possibly incompletely known environment
without human intervention during the process. In the community, this is often opposed to a
programmed robot. Both autonomous and programmed robots are programmed, in the sense
that they are controlled by a computer that executes instructions to make them act, but a
programmed robot is intended to work in a very well known, possibly structured, environment
such as a production cell in a car factory, performing a repetitive, well specified set of actions. For
this latter kind of robot, a normative corpus is already available, so we will not consider them: they
usually have to work in places where people are not allowed to enter while they are operating. In
general, they are considered as devices, similarly to milling machines and the like. Most
autonomous robots are programmed robots, and this implies a predictable and deterministic
behaviour when facing a given situation described by known values of the inputs. Besides errors in
the program or the intentional presence of random actions, the only sources of unpredictability reside
in the uncertainty of perception and the uncertainty of actuation. For instance, assume our
autonomous robot is supposed to grasp a red ball: if it is not able to distinguish red from orange
because of insufficient lighting, then the final outcome might not be correct; at the same time, if the
ball is slippery, our robot might miss the grasp, resulting again in an incorrect (unexpected) result
(a minimal sketch follows). Finally, a possible source of behaviour that is unpredictable at design
time is the case of learning robots. Sometimes, an autonomous robot is also called a robotic agent,
putting in evidence its ability to select actions.
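As a purely illustrative sketch of the red-ball example above (in Python, with an invented sensor model and thresholds), the fragment below shows how a deterministic program can still produce different outcomes when perception is uncertain:

```python
# Sketch of the glossary's red-ball example: the control program is
# deterministic, yet uncertain perception can still change the outcome.
# Sensor model and thresholds are made up for illustration.

import random

def perceived_hue(true_hue, lighting_noise):
    """Camera model: the measured hue drifts with the lighting."""
    return true_hue + random.uniform(-lighting_noise, lighting_noise)

def decide_grasp(hue):
    """Deterministic rule: grasp only what looks red (hue below 15)."""
    return hue < 15.0

if __name__ == "__main__":
    random.seed(0)
    true_hue = 12.0  # the ball really is red
    for noise in (1.0, 10.0):  # good vs. poor lighting
        decisions = [decide_grasp(perceived_hue(true_hue, noise)) for _ in range(1000)]
        print(f"lighting noise {noise}: grasped {sum(decisions)} times out of 1000")
```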
Behaviour
Robot behaviour is the way of acting of a robot as perceived by an external observer. This can be
described at a low level (e.g., the robot is entering a door) or at a high level (e.g., the robot is
monitoring the health status of its owner). Sometimes it can be described by establishing a
relationship between observed situations and actions (e.g., the robot avoids obstacles that are on
its path to the goal, or the robot goes to its owner when it is asked to). Often the term 'behaviour'
is also used to refer to the behavioural modules that implement behaviour, i.e. a set of programs
or operational knowledge that, when operative, can implement the behaviour. Often, many
behavioural modules are implemented on a robot and each of them is triggered by a specific
situation (see the sketch below).
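A minimal sketch of the behavioural-module idea, with invented situations and actions, is given below; it is an illustration only, not a reference architecture:

```python
# Minimal sketch of the "behavioural modules" idea from the entry above:
# each module declares the situation that triggers it, and the first
# triggered module produces the robot's observable behaviour.
# Situation fields and actions are hypothetical.

BEHAVIOURS = [
    # (name, trigger predicate, action)
    ("avoid_obstacle", lambda s: s["obstacle_ahead"], "turn away"),
    ("go_to_owner",    lambda s: s["owner_called"],   "navigate to owner"),
    ("monitor_health", lambda s: True,                "observe owner status"),
]

def select_behaviour(situation):
    for name, triggered, action in BEHAVIOURS:
        if triggered(situation):
            return name, action
    return "idle", "do nothing"

if __name__ == "__main__":
    print(select_behaviour({"obstacle_ahead": True,  "owner_called": False}))
    print(select_behaviour({"obstacle_ahead": False, "owner_called": True}))
```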
Care robots
Def. CR1: Care robots are able to support people with disabilities and the elderly in improving their way
of living. This includes wheelchairs, feeders, systems to move people, robotic arms to bring or get
objects, but also exoskeletons and robots providing psychological aid.
Def. CR2: Care robots are able to take care of people. This also includes robots able to interact
with people and, in general, able to understand, or at least manage, the essence of taking care, such
as robotic companions and some robotic pets.
Civil law
It is a legal system originating in Western Europe, within the framework of late Roman law. Its most
relevant feature is that core principles are codified into a referable system which serves as the
primary source of law.
Civil liability
Legal liability which is not in criminal law. This liability may be in contract, tort, restitution, or various
other sub-areas.
Cognitive robot
A cognitive robot is an autonomous robot that exploits processes analogous to cognitive
processes. In particular, this term refers to robots able to reason.
Common law
It is a legal system originating in the UK and used in almost all English-speaking countries. The
intellectual framework comes from judge-made decisional law, which gives precedential authority to
prior court decisions on the principle that it is unfair to treat similar facts differently on different
occasions (doctrine of judicial precedent).
Consumer law
An area of law which regulates the relationships between individual consumers and the businesses
that sell them goods and services.
Contractual liability
This kind of civil liability arises when there is any failure to perform an obligation under the contract,
whether or not excused, and includes delayed performance, defective performance and failure to
co-operate in order to give full effect to the contract.
Criminal law
The area of law in which it is decided whether someone has committed a criminal offence and is
criminally liable for it.
Data privacy law
The area of law that covers the protection of the right to privacy with respect to the processing of
personal data.
Ethics
Science of good and evil [33]
It is in the area of ethics to determine what is good or bad [18]
The goal of ethics is to indicate how human beings should behave, act and be towards other people and towards what surrounds them [7]
European Directive
A legislative act of the European Union, which requires Member States to achieve a particular
result without dictating the means of achieving that result. A Directive normally leaves Member
States with a certain amount of leeway as to the exact rules to be adopted.
European Regulation
A legislative act of the European Union which immediately becomes enforceable as law in all
Member States simultaneously
European Recommendation
A legislative act of the European Union without binding force.
Green paper
A green paper released by the European Commission is a discussion document intended to
stimulate debate and launch a process of consultation, at European level, on a particular topic. A
green paper usually presents a range of ideas and is meant to invite interested individuals or
organizations to contribute views and information. It may be followed by a white paper, an official
set of proposals that is used as a vehicle for their development into law.
See also white paper
Labour Law
Laws which address the legal rights of, and restrictions on, working people.
Learning
Process of modification of the robot's knowledge base through interaction with the environment
(including people), which may produce a persistent change in the robot's behaviour. This includes
learning data, such as a map, and learning behaviours, such as mappings from data to actions (see
the sketch below).
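The toy sketch below illustrates learning data (a small occupancy map) whose update through observation persistently changes a later behaviour; the grid size, update rate and threshold are invented for the example:

```python
# Sketch of "learning data, such as a map": the robot's knowledge base
# (here an occupancy grid) is updated from interaction with the
# environment, which persistently changes later behaviour.
# Grid size and observations are invented for the example.

GRID = [[0.5] * 5 for _ in range(5)]  # 0.5 = unknown, toward 1.0 = occupied

def update_cell(x, y, observed_occupied, rate=0.3):
    """Move the stored belief toward the new observation."""
    target = 1.0 if observed_occupied else 0.0
    GRID[y][x] += rate * (target - GRID[y][x])

def is_traversable(x, y, threshold=0.6):
    """Behaviour that depends on what has been learned so far."""
    return GRID[y][x] < threshold

if __name__ == "__main__":
    print("before learning:", is_traversable(2, 2))
    for _ in range(5):                # repeated observations of an obstacle
        update_cell(2, 2, observed_occupied=True)
    print("after learning: ", is_traversable(2, 2))
```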
Legal
According to law; not in violation of law (Dictionary of Law)
Legal Person
An entity possessing legal rights and obligations. Legal personhood is ascribed to entities by the
legislator (based on practicability and social acceptance).
Negligence
A failure to take the required care in a situation in which there is a duty to take care, causing injury
to another person.
Non-contractual liability
This kind of civil liability arises when an agent intentionally or negligently causes damage due to
the violation of a right which is legally protected regardless of the existence of a contract (e.g.
physical integrity).
Normative ethics
The central debate of normative ethics opposes:
Virtue ethics: moral evaluation focuses on the inherent character of a person rather than on specific actions
Deontology: moral evaluation bears on actions according to imperative norms and duties
Consequentialism: moral evaluation bears on actions according to their contribution to improving the state of the world. Consequentialism includes utilitarianism, which holds that an action is right if it leads to the happiness of the greatest number of people
Personal data
Any information relating to an identified or identifiable natural person.
Planning
Planning is the computation and selection of tasks, policies and procedures for goal-directed robot
behaviour. This includes path planning, motion planning, grasp planning, task planning, mission
planning and resource coordination. Planning often requires reasoning. Once a plan is made, its
execution is monitored and possible problems may lead to re-planning (e.g. when an obstacle is
detected on a planned path; see the sketch below). A robot able to represent a goal and act to
achieve it is called a rational (robotic) agent.
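A minimal sketch of planning and re-planning on a toy grid is given below; the world, the breadth-first search and the detected obstacle are illustrative assumptions, not a recommended planner:

```python
# Sketch of path planning with re-planning, matching the entry above:
# a plan is computed, its execution is monitored, and a newly detected
# obstacle triggers re-planning. The grid world is a toy example.

from collections import deque

def plan(start, goal, blocked, size=5):
    """Breadth-first search on a small grid; returns a list of cells."""
    frontier, parents = deque([start]), {start: None}
    while frontier:
        cell = frontier.popleft()
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = parents[cell]
            return list(reversed(path))
        x, y = cell
        for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nxt[0] < size and 0 <= nxt[1] < size
                    and nxt not in blocked and nxt not in parents):
                parents[nxt] = cell
                frontier.append(nxt)
    return None

if __name__ == "__main__":
    blocked = set()
    path = plan((0, 0), (4, 4), blocked)
    print("initial plan:", path)
    blocked.add(path[2])                 # an obstacle is detected on the path
    print("re-planned:  ", plan((0, 0), (4, 4), blocked))
```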
Reasoning
Process of modification of the robot's knowledge base through (logical) manipulation of the
available knowledge. Typically, the available knowledge consists of data, both collected through
sensors and given by people, models (e.g., a model of the world, an ontology, etc.), inferential
knowledge (such as induction or deduction rules), and relationships among pieces of knowledge.
Reasoning elaborates these pieces of knowledge, augmenting the knowledge base with the inferred
elements. It can be considered a way to make explicit knowledge that is implicitly contained in the
knowledge base (see the sketch below). Among reasoning activities, we may consider planning
(see above) and classification.
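The following toy sketch illustrates reasoning as forward chaining over a small knowledge base until a fixed point is reached; the facts and rules are invented for the example:

```python
# Sketch of reasoning as described above: inference rules are applied to
# the knowledge base until no new facts can be derived, making implicit
# knowledge explicit. Facts and rules are invented for the example.

FACTS = {"person_detected", "person_not_moving", "alarm_armed"}

RULES = [
    # (premises, conclusion)
    ({"person_detected", "person_not_moving"}, "person_may_need_help"),
    ({"person_may_need_help", "alarm_armed"},  "send_alert"),
]

def forward_chain(facts, rules):
    """Repeatedly fire rules whose premises hold, until a fixed point."""
    derived = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

if __name__ == "__main__":
    print(sorted(forward_chain(FACTS, RULES)))
```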
Sensor
A sensor is an (electronic) device able to detect or measure a physical or chemical quantity and
convert it into a signal. Robots can use these signals as input to their computational activities, which
will produce commands for the actuators. Sensors are the means to perceive the situation the robot
is facing and to take decisions, i.e., run specific subroutines of the program (see the sketch below).
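As a simple illustration, the sketch below converts a physical quantity into a signal, recovers an estimate from that signal and uses it to select which subroutine runs; the sensor model and threshold are hypothetical:

```python
# Sketch of the sensor entry above: a physical quantity is converted to
# a signal, and the signal selects which subroutine of the program runs.
# The sensor model and threshold are illustrative only.

def read_range_sensor(true_distance_m, offset_v=0.4, volts_per_m=1.2):
    """Convert a distance into the (idealised) voltage the sensor outputs."""
    return offset_v + volts_per_m * true_distance_m

def to_distance(voltage, offset_v=0.4, volts_per_m=1.2):
    """Signal processing: recover an estimated distance from the voltage."""
    return (voltage - offset_v) / volts_per_m

def choose_subroutine(distance_m, safety_margin=0.5):
    return "brake" if distance_m < safety_margin else "keep_moving"

if __name__ == "__main__":
    for true_distance in (1.5, 0.3):
        signal = read_range_sensor(true_distance)
        print(true_distance, "m ->", round(signal, 2), "V ->",
              choose_subroutine(to_distance(signal)))
```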
Social
Of or relating to human society and its modes of organization (Merriam-Webster, The Free
dictionary)
Societal
Pertaining to the society
of or relating to society (Merriam-Webster dictionary)
Societal responsibility
It is the concept indicating the responsibility of an entity (economic agent, group, community) for the
social, sanitary and environmental consequences of its activity, and specifically towards its
stakeholders [8].
Societal responsibility relies on two principles:
1. Willingness to assume the impact of one's activities and decisions on the environment and society
2. Reporting using credible and transparent indicators
ISO 26000 [8] provides guidance on how businesses and organizations can operate in a way that is
responsible towards society.
Tele-operated robot
A tele-operated robot is composed of a set of parts moved by engines operated by people through
specific interfaces. The actions of these robots are completely controlled by people, through
interfaces like a joy-stick, or even a smart phone. In the most complex (but still common) cases,
signals from these interfaces are input to a computational unit that translates them to controls for
the actuators. This category includes surgical robots, traditional electrical wheelchairs, tele-
operated toys, as well as some robots mentioned in advertisements as care-robots, such as
people-movers, and personal elevators. Some issues may arise when, to implement a given
movement (e.g., a cut in a surgical operation), the robot controller has to take decisions that may
affect movements of parts that are not directly tele-controlled (e.g., collisions of external parts of the
robot arm with the patient's body). A minimal sketch follows.
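The minimal sketch below illustrates the translation of operator inputs into actuator commands, together with a decision the controller takes on a part that is not directly tele-operated; the mapping and limits are invented for the example:

```python
# Sketch of a tele-operated robot as defined above: operator inputs pass
# through a computational unit that translates them into actuator
# commands, and the controller also governs parts the operator does not
# directly command (here, a simple elbow limit). Values are illustrative.

def joystick_to_commands(jx, jy, max_speed=0.5):
    """Translate joystick deflections (-1..1) into wheel speeds (m/s)."""
    left = max_speed * (jy + jx)
    right = max_speed * (jy - jx)
    return left, right

def constrain_elbow(requested_angle, low=-90.0, high=90.0):
    """Decision the controller takes on a part not directly tele-operated."""
    return max(low, min(high, requested_angle))

if __name__ == "__main__":
    print("wheels:", joystick_to_commands(0.2, 1.0))
    print("elbow :", constrain_elbow(120.0))
```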
White paper
A white paper is an authoritative report or guide that helps readers understand an issue, solve a
problem, or make a decision. White papers are used in two main spheres: government and B2B
marketing.
White papers published by the European Commission are documents containing proposals for
European Union action in a specific area. They sometimes follow a green paper released to launch
a public consultation process.
See also green paper.
Work Safety Law
Area of the law to protect safety, health, and welfare of workers.
11. Appendix E Experts and specialists that took part in
the green paper elaboration
Name | Organization | Country | Function
Beck Susanne | University of Würzburg | Germany | Assistant Professor of Law
Belder Lucky | University of Utrecht | The Netherlands | Professor of Laws
Bonarini Andrea | Politecnico di Milano | Italy | Professor in robotics
Boscarato Chiara | University of Pavia | Italy | PhD student in laws
Caroleo Franco | University of Pavia | Italy | PhD student in laws
de Bruin Roeland | University of Utrecht | The Netherlands | PhD student in laws
Eck Daniel | University of Würzburg | Germany | PhD student
Edwards Lilian | University of Strathclyde, Glasgow | United Kingdom | Professor of Internet Law
Fitzi Gregor | University of Oldenburg | Germany | Professor in social sciences
Guhl Tim | KUKA | Germany | Project Manager
Günther Jan-Philip | University of Würzburg | Germany | PhD student
Hilgendorf Eric | University of Würzburg, Law Faculty | Germany | Professor of law and legal philosopher, Dean of the Law Faculty
Huebert-Saintot Corinne | CEA LIST, Robotics Lab | France | Head of technology transfer department
Labruto Roberto | Alenia Aermacchi | Italy | Project Manager
Leroux Christophe | CEA LIST, Robotics Lab | France | Manager ICT & Health
Loeffler Severin | University of Würzburg | Germany | PhD student
Mathias Andreas | Philosophy Department, Lingnan University | Hong Kong | Senior Teaching Fellow
Matteucci Matteo | Politecnico di Milano | Italy | Professor in robotics
Moeller Guido | KUKA | Germany | Head of patent and trade mark at KUKA
Muench Florian | University of Würzburg | Germany | PhD student
Pegmann Geoff | RUrobots | United Kingdom | Head of RUrobots
Salvini Pericle | Scuola Superiore Sant'Anna | Italy | Project Manager
Santosuosso Amedeo | University of Pavia | Italy | Professor of Life Sciences and Law at the University of Pavia, judge at the Court of Milan
Sharkey Amanda | University of Sheffield | United Kingdom | Professor of Artificial Intelligence and Robotics
Sharkey Noel | University of Sheffield | United Kingdom | Professor of Artificial Intelligence and Robotics
Schafer Burkhard | Edinburgh University | United Kingdom | Professor of Computational Legal Theory
Van den Berg Bibi | University of Leiden | The Netherlands | Philosopher of technology, Professor of law
Wendel Anne | EUnited | Germany | Administrator
Winfield Alan | University of the West of England, Bristol | United Kingdom | Professor of Electronic Engineering
12. Appendix F List of events and meetings organized on
Legal issues in robotics
Date | Location | Purpose
2010-March-10 | San Sebastian, Spain | European Robotics Forum, Workshop on Ethical, Legal and Societal issues in robotics: introduction of the action to the robotics community
2011-May-01 | Västerås, Sweden | European Robotics Forum, Workshop on Ethical, Legal and Societal issues in robotics: Robots in our lives, how and why?
2011-July-22 | Würzburg, Germany | Agenda for a green paper on a legal framework for robotics in Europe
2011-November-03 | Würzburg, Germany | Privacy, civil and criminal law issues, case studies
2012-January-14 | Pavia, Italy | A top-down approach on legal issues in robotics, plan for a proposal on a green paper
2012-March-05 | Odense, Denmark | European Robotics Forum, Workshop on ELS issues in robotics - focus on legal issues in robotics
2012-September-13 | Fontenay-aux-Roses, France | Meeting on the elaboration of the green paper
13. Appendix G Legal issues in robotics
See the document suggestion for a green paper annexed to this document.