

D4.3 – Ethical and Societal Analysis

Due date of deliverable: 31/01/2015

Actual submission date: 21/03/2015

Organisation name of lead contractor for this deliverable: University of Namur (UoN)

Revision: 1.0

Grant Agreement No: 607567

Project Acronym: IPATCH

Project Title: Intelligent Piracy Avoidance using Threat detection and Countermeasure Heuristics

Funding Scheme: SEC-2013.2.4-2

Start date of project: 01/04/2014

Duration: 36M

Project co-funded by the European Commission within the 7th Framework Programme (2007-2013)

Dissemination Level:

PU Public

RE Restricted to a group specified by the consortium (including the Commission Services)


Table of Contents

Executive Summary
1 Introduction
1.1 The IPATCH Project
1.2 Deliverable Overview
2 Ethical issues
2.1 What is ethics? A brief overview
2.2 Mapping ethical resources for IPATCH
2.2.1 Bioethics: an example of applied ethics
2.2.2 The Barcelona declaration policy proposals (1998)
2.3 Ethics of ICT, technoethics and ethics of surveillance
2.3.1 Technoethics: an inclusive and multidisciplinary ethical approach
2.3.2 European Group on Ethics in Science and New Technologies (EGE)
2.4 A possible tool: Ethical Impact Assessment (EIA)
3 Ethical analysis
3.1 Is an ethical impact assessment of the IPATCH countermeasures possible?
3.2 Ethical impact assessment adapted to IPATCH
4 Conclusion and recommendations
5 References


Document Summary Information

Authors and Contributors

Initials Name Organisation

NG Nathalie Grandjean UoN

Revision History

Revision Date Initials Description

0.1 04/02/2015 NG Initial version

0.2 11/03/2015 NG Final version

1.0 21/03/2015 TC Reviewed for submission

Quality Control

Role Name Date

Peer Review Franck Dumortier 09/03/2015

Work Package Leader Franck Dumortier 09/03/2015

Project Manager Tom Cane 21/03/2015

Security Scrutiny Committee Review

Comments

No sensitive material

Recommended Distribution

Suitable for public dissemination

Date 21/03/2015


Disclaimer

The content of the publication herein is the sole responsibility of the publishers and it does not

necessarily represent the views expressed by the European Commission or its services.

While the information contained in the documents is believed to be accurate, neither the author(s) nor any other participant in the IPATCH consortium makes any warranty of any kind with regard to this material, including, but not limited to, the implied warranties of merchantability and fitness for a particular purpose.

Neither the IPATCH Consortium nor any of its members, their officers, employees or agents shall be

responsible or liable in negligence or otherwise howsoever in respect of any inaccuracy or omission

herein.

Without derogating from the generality of the foregoing, neither the IPATCH Consortium nor any of its members, their officers, employees or agents shall be liable for any direct, indirect or consequential loss or damage caused by or arising from any information, advice, inaccuracy or omission herein.


List of Tables

Table 1: Classification of the countermeasures

List of Abbreviations

Abbreviation Description

ICT Information and Communication Technology

EGE European Group on Ethics in Science and New Technologies

EIA Ethical Impact Assessment

ELSI Ethical, Legal and Social Issues


Executive Summary

This document constitutes deliverable D4.3 of the project. One of its original aims was to develop a quantitative and qualitative scoring of the countermeasures in order to compare the levels of ethical & social acceptance. This objective is impossible to fulfil in its literal interpretation, due to the nature of ethics on the one hand, and to the context of armed conflict on the other. The nature of ethics is not to evaluate but to provide critical normativity, grounded in philosophical tradition and based on specific contexts. Evaluating each countermeasure with an ethical grid is highly risky, because it opens the door to the standpoint of “right conduct in war”. The IPATCH case is a good example of the limits of the Ethical, Legal and Social Issues (ELSI) approach in research projects.

Instead, this study aims to provide a qualitative and critical ethical impact assessment, based on a number of ethical frameworks described herein. We have proposed a typology that classifies the countermeasures, taking account of the specific context: monitoring, non-lethal, less than lethal and lethal. Our recommendation is to encourage the use of the non-lethal countermeasures first, then the less than lethal, and to discourage the use of the lethal ones. In saying this, we want to underline again that ethical responsibilities are not borne by the countermeasures (i.e. technological artefacts) but by the crew using them. Situations of armed conflict are psychologically hard to cope with, and this should be the main ethical concern of the IPATCH case.


1 Introduction

1.1 The IPATCH Project

Funded by the European 7th Framework Programme, the IPATCH project addresses Security Topic

SEC-2013.2.4-2: Non-military protection measures for merchant shipping against piracy. The goal of

the IPATCH project is three-fold:

1. To perform an in-depth analysis of the legal, ethical, economic and societal implications of

existing counter piracy measures.

2. To produce well-founded recommendations to the industry in the form of a manual, extending

and complementing the Best Management Practices document and to support the use and

further development of countermeasures.

3. To develop an on-board automated surveillance and decision support system providing early

detection and classification of piracy threats and supporting the captain and crew in selecting

the most appropriate countermeasures against a given piracy threat.

The analysis performed under (1) will lead to recommendations for the use of countermeasures in a range of scenarios, structured as a manual (2), and to the development and implementation of a proactive surveillance capability forming part of the system developed under (3). The situational awareness system will robustly monitor the area around maritime vessels, providing early warning to crew members if piracy threats are detected. A low false alarm rate in the face of environmental or other innocuous events, combined with high threat detection sensitivity, is a central ambition of the project.

To achieve these goals, a multispectral sensor suite comprising both passive and active sensors is

envisaged, i.e., a system based on radar, visual and thermal sensors. The sensor suite will be

complemented with advanced algorithms for information fusion, object detection and classification,

and high-level modelling of intent and behaviour analysis. The IPATCH project is strongly user-driven, and demonstration of the developed surveillance system will be conducted in two different maritime environments.
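To make this pipeline concrete, here is a minimal Python sketch. It is purely illustrative: the class names, sensor labels, weights and thresholds are hypothetical and are not taken from the IPATCH design documents, and a real behaviour-analysis module would use trained models rather than hand-set rules.

```python
from dataclasses import dataclass

# Hypothetical sketch of the detection-to-alert flow described above.
# None of these names, weights or thresholds come from the IPATCH specifications.

@dataclass
class Track:
    """A fused track built from radar, visual and thermal observations."""
    track_id: int
    speed_knots: float         # estimated speed of the approaching object
    closing: bool              # is it on an intercept course with the vessel?
    sensor_sources: list[str]  # e.g. ["radar", "thermal"]

def threat_score(track: Track) -> float:
    """Toy stand-in for intent and behaviour analysis: corroboration by
    several sensors and fast, closing behaviour raise the score."""
    score = 0.2 * len(track.sensor_sources)
    if track.closing:
        score += 0.4
    if track.speed_knots > 20:  # small fast craft are typical of pirate skiffs
        score += 0.3
    return min(score, 1.0)

def raise_alert(track: Track, threshold: float = 0.7) -> bool:
    """Warn the crew only above a threshold: the threshold is where high
    detection sensitivity is traded against a low false alarm rate."""
    return threat_score(track) >= threshold

skiff = Track(track_id=1, speed_knots=25.0, closing=True,
              sensor_sources=["radar", "visual", "thermal"])
fishing_boat = Track(track_id=2, speed_knots=6.0, closing=False,
                     sensor_sources=["radar"])
print(raise_alert(skiff))         # True: fast, closing, corroborated
print(raise_alert(fishing_boat))  # False: innocuous behaviour
```

The threshold in raise_alert is where the project's twin ambitions meet: raising it suppresses false alarms from innocuous events, while lowering it increases detection sensitivity.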

1.2 Deliverable Overview

According to the Description of Work, one of the aims of D4.3 is to develop “a quantitative and qualitative scoring of the countermeasures in order to compare the levels of ethical & social acceptance”. Looking back at the ethical tradition, a quantitative assessment, such as a quantitative risk assessment, is clearly impossible: it is not possible to produce a quantitative scoring of the countermeasures. Instead, this study aims to provide a qualitative and critical ethical impact assessment, based on a number of ethical frameworks described herein.


2 Ethical issues

2.1 What is ethics? A brief overview

Ethics is a branch of philosophy, specifically moral philosophy. Ethics and moral theories have been discussed since Plato and Aristotle. It is a philosophical discipline that is as practical (action-oriented) as it is normative (debating common rules). Its goal is to discuss how human beings should behave and act together in a community and, more broadly, in society. At a more fundamental level, it is a method by which we categorise our values, reflect upon them and pursue them.

Different forms of ethics are distinguished by their degree of generality. Applied ethics, for example, does not have the generality of ethics as such; forms of applied ethics also differ in their object (such as bioethics, business ethics and the ethics of ICT) or in the basis or tradition on which they rest (which may be environmental, religious, specific to a country or social group, or tied to an ideological system).

In its common meaning, ‘morals’ is synonymous with ‘ethics’ and refers to a practice which aims to determine a ‘good’ manner of living in society with other human beings (e.g. the pursuit of happiness or virtue). Nevertheless, in philosophy, morals and ethics are distinguished in order to capture a key difference. A common distinction consists in defining morals as all the norms specific to an individual or social group at a particular moment in history, and ethics as the search for the ‘good’ through conscious and argued reasoning. The term ethics is generally used to describe theoretical reflection on the value of practices and on the conditions of such practices.

Ethics is also critical reasoning on the morality of actions: this is the practical side of ethics. For example, there are ethics committees within scientific institutions and hospitals. Ethicists and philosophers are experts in rational reflection on values and moral dilemmas, and try to assess the impact of a decision on an individual, a social group or society. In all cases, ethics seeks to answer the question “how to act for the best?”

It is a long and complex story of debates, controversies and consensus. Ethical knowledge is not fixed and evolves constantly, depending on cultural and political contexts and depending also – and mostly – on the moral/ethical issues raised. It has to be a situated knowledge.

Finally, some terminological clarifications are important. A common misunderstanding consists in confusing ethics with morals (as already explained), but also with the law and/or with deontology. Law defines what is permitted and what is forbidden, based on legal texts and jurisprudence. Deontology encompasses all the obligations that professionals agree to comply with in order to ensure consistent practice. It is generally called a “code of ethics”, even though it is far removed from ethics in the philosophical sense.

2.2 Mapping ethical resources for IPATCH

There is not a long philosophical tradition of applied ethics on ICT and surveillance topics; the literature on these precise topics is quite recent. We will consider mainly the major ethical declarations and opinions delivered by international and European bodies, as well as the literature (deliverables, papers) published by similar R&D projects1. So let us focus practically on particular ethical resources.

2.2.1 Bioethics: an example of applied ethics

Applied ethics concerning ICT and surveillance is linked with bioethics, which was clearly a source of inspiration for ethicists. Bioethics was born out of the moral problems caused by the advancement of biomedical science and technology in all spheres of human life. Its primary focus is the development of benchmarks for ethically acceptable actions. In this sense, bioethics is an applied ethics. However, it is an original approach that does not appeal to a unified theory from which norms could be deduced, as occurs in traditional ethical theory.

A very typical bioethics method is the principles-based approach developed by Beauchamp and Childress in a book first published in 1979 entitled Principles of Biomedical Ethics2. The book has been reprinted several times and led to the institutionalisation of bioethics. It analyses ethical issues grounded in medical practice on the basis of four main principles: beneficence, non-maleficence, respect for autonomy and justice. As such, these principles give only very general guidance about the morality of decisions, as follows:

1. The principle of (respect for) autonomy requires respect for the decision-making capacities of autonomous persons.

2. The principle of non-maleficence requires that we not inflict harm on others.

3. The principle of beneficence requires not only that we act beneficially through positive actions, but also that we weigh the potential benefits against the harm that may arise, so as to maximise benefits and minimise harm.

4. The principle of justice requires the fair distribution of benefits and risks.

The limitations of this approach are related to the nature of the practice. Bioethics operates neither at a strictly theoretical level nor at a strictly practical level, but develops in a dialectical space between the two: it sits at the interface of theory and practice. It is therefore not surprising that it is criticised by theorists for not being sufficiently unified and systematic, and by practitioners for being too abstract and too far from reality. This weakness is mainly related to the difficulties of interdisciplinarity, which demands from researchers and practitioners not only an openness to the different disciplines involved, but also a willingness to take the inputs of these disciplines into account. The same observation can be made for any applied ethics. Another limitation lies in the difficulties raised by the supposed universality of these four principles.

2.2.2 The Barcelona declaration policy proposals (1998)

From 1995 to 1998, the European Commission funded the “Basic Ethical Principles in European Bioethics and Biolaw (BIOMED)” research project. Its goal was to identify the ethical principles of autonomy, dignity, integrity and vulnerability as four important ideas or values for a European bioethics and biolaw. The major outcome of the BIOMED project was the partners’ Policy Proposals to the European Commission: the Barcelona Declaration of 1998.

1 See, for example, FP7 projects such as PIAF, MIAUCE, PRESCIENT, SAPIENT, PRACTIS and PARIS.

2 Tom L. Beauchamp and James F. Childress, Principles of Biomedical Ethics, 6th Edition. Oxford: Oxford University Press, 2008.


This was a unique philosophical and political agreement between experts in bioethics and biolaw from many different countries. The experts of the BIOMED project revisited the main (bio)ethical principles and proposed a more reflexive version: the value of “autonomy” (interwoven with integrity, dignity and vulnerability) should be placed in the context of care for others, a context that already presupposes an ethics of solidarity, responsibility and justice (fairness).

1. Autonomy should not only be interpreted in the liberal sense of “permission” given for treatment and/or

experimentation. Five qualities should be considered: 1) the capacity of creation of ideas and goals for life, 2)

the capacity of moral insight, "self-legislation" and privacy, 3) the capacity of reflexion and action without

coercion, 4) the capacity of personal responsibility and political involvement, 5) the capacity of informed

consent. But autonomy cannot express the full meaning of respect for and protection of the human being.

Autonomy remains merely an ideal, because of the structural limitations given to it by human finitude and

dependence on biological, material and social conditions, lack of information for reasoning etc. We must

recognise the human person as a situated living body. Autonomy in relation to small children, persons in coma

and persons that are mentally ill should remain an open question.

2. Dignity is the property by virtue of which beings possess moral status. There are several contested

conceptions of dignity in European culture. Dignity is, variously, identified with the capacity for autonomous

action, the capacity for experiencing pain or pleasure, being human (in the biological sense) or being a living

organism or even system. Acknowledging various definitions, our view is that it is nonetheless possible to argue successfully that human beings have duties towards the nonhuman part of living nature.

3. Integrity. The idea of integrity expresses the untouchable core, the basic condition of dignified life, both

physical and mental, that must not be subject to external intervention. Therefore respect for integrity is

respect for privacy and in particular for the patient's understanding of his or her own life and illness. Integrity

refers to the coherence of life of beings with dignity that should not be touched and destroyed. In relation to

human beings it is coherence of life, which is remembered from experiences and therefore can be told in a

narrative. It is the life story of a person, the narrative unity or history of human society and culture. Some

would also include the natural grown coherence in the life of animals and plants and finally the created

wholeness of the world, which makes the conditions for all life.

4. Vulnerability expresses two basic ideas. (a) It expresses the finitude and fragility of life, which, in those

capable of autonomy, grounds the possibility and necessity for all morality. (b) Vulnerability is the object of a

moral principle requiring care for the vulnerable. The vulnerable are those whose autonomy or dignity or

integrity are capable of being threatened. As such all beings who have dignity are protected by this principle.

But the principle also specifically requires not merely non-interference with the autonomy, dignity or integrity

of beings, but also that they receive assistance to enable them to realise their potential. From these premises it follows that there are positive rights to integrity and autonomy, which ground the ideas of solidarity, non-discrimination and community.

(Source: The Barcelona declaration, 1998)


2.3 Ethics of ICT, technoethics and ethics of surveillance

2.3.1 Technoethics: an inclusive and multidisciplinary ethical approach

Technoethics3 is a branch of applied ethics focused on issues related to technologies. It relies, firstly,

on philosophy of technique/technology; secondly, on the domain of Science and Technology Studies

(STS); and thirdly, on ethics and recent applied ethics, such as bioethics. As Luppicini defines:

“(…) technoethics is defined as an interdisciplinary field concerned with all ethical aspects of

technology within a society shaped by technology. It deals with human processes and

practices connected to technology which are embedded within social, political, and moral

spheres of life. It also examines social policies and interventions occurring in response to

issues generated by technology development and use. This includes critical debates on the

responsible use of technology for advancing human interests in society.4”

Technoethics encompasses the ethics of ICT; indeed, there is now almost no technology that has not been ‘computerised’. It is thus more appropriate to talk about the ethics of technology in general, so let us adopt the term “technoethics”.

As already underlined, the scope of this deliverable is not to provide a historical panorama of technoethics, but to offer some crucial and practical insights for an ethical assessment of the surveillance technologies embedded in the IPATCH concept.

2.3.2 European Group on Ethics in Science and New Technologies (EGE)

For many years, ethical issues related to ICT research and technological developments have been growing in importance. In that spirit, the European Parliament and the Council decided that research activities supported by the Framework Programmes should acknowledge fundamental ethical principles, including those in the Charter of Fundamental Rights of the European Union, and take into account the opinions of the European Group on Ethics in Science and New Technologies (EGE).

To that end, the European Commission published for FP7 a short ethical guidance note on conducting ICT projects5. Its recommendations are as follows:

A responsible approach

It is likely that most of the principles of the Charter of Fundamental Rights of the European Union will be relevant to the approach adopted by ICT researchers. These principles cover dignity, freedom, equality, solidarity, citizens’ rights and justice. Proposals must comply with Article 8 of the European Human Rights Convention. In particular, given the pervasive and ubiquitous nature of ICT and the many opportunities it offers, researchers should consider the sensitive implications of their proposals for privacy and autonomy. However, researchers should recognize that new dangers associated with the process of ICT research can exist. They should carry out a prior assessment of risk and identification of precautionary actions proportional to the potential risk/harm.

3 As Rocci Luppicini stated: “Although research in technoethics had been done earlier than this, official work under this heading began with Mario Bunge, the first scholar to coin the term ‘technoethics’ (Bunge, 1977)”. See Rocci Luppicini and Rebecca Adell (eds), Handbook of Research on Technoethics, IGI Global, 2009, p. 2.

4 Idem, p. 4.

5 http://cordis.europa.eu/fp7/ethics-ict_en.html


Researchers should comply with national legislation, European Union legislation, respect international conventions and declarations and take into account the Opinions of the European Group on Ethics. However, consideration of ethical issues goes beyond simple compliance with current regulations and laws.

Privacy and informed consent

The right to privacy and data protection is a fundamental right and therefore applicable to ICT research. Researchers must be aware that volunteers have the right to remain anonymous. Researchers must comply with Data Protection legislation in the Member State where the research will be carried out regarding ICT research data that relates to volunteers.

Informed consent is required whenever ICT research involves volunteers in interviews, behavioural observation, invasive and non-invasive experimentation, and accessing personal data records. The purpose of informed consent is to empower the individual to make a voluntary informed decision about whether or not to participate in the research based on knowledge of the purpose, procedures and outcomes of the research. Before consent is sought, information must be given specifying the alternatives, risks, and benefits for those involved in a way they understand. When such information has been given, free and informed consent must be obtained. Depending on the nature of the research, different consent procedures may be used. Special consideration must be given when volunteers have reduced autonomy or are vulnerable.

The majority of European citizens view personal privacy as an important issue. Research, for example, on RFID and ICT for healthcare, is likely to raise privacy issues. Therefore, researchers must ensure that the manner in which research outcomes are reported does not contravene the right to privacy and data protection. Furthermore, researchers must carefully evaluate and report the personal privacy implications of the intended use or potential use of the research outcomes. Wherever possible, they must ensure that research outcomes do not contravene these fundamental rights.

Use of animals in ICT research

In accordance with the Amsterdam protocol on animal protection and welfare, animal experiments must be replaced with alternatives wherever possible. Suffering by animals must be avoided or kept to a minimum. This particularly applies to animal experiments involving species which are closest to human beings. Thus ICT research involving animals should conform to the ethical principles of replacement, reduction, refinement and minimisation of suffering. Proposers must carefully justify animal experiments in cross-science proposals for non-medical objectives. Furthermore, they should identify the scientific areas which would benefit from knowledge gained through animal experiments. Proposers must be aware that Member States may have differing and possibly conflicting interpretations of animal welfare in research, and the research must meet regulations in the country in which it will be carried out.

Specific guidance in some current sensitive areas

ICT implants and wearable computing

• ICT implants should only be developed if the objective cannot be achieved by less-invasive methods such as wearable computing devices and RFID tags.

• To the extent that an individual, via an ICT implant or wearable computing device, becomes part of an ICT network, the operation of this whole network will need to respect privacy and data protection requirements.

• ICT implants in healthcare are, in general, acceptable when the objective is saving lives, restoring health, or improving the quality of life. They should be treated in the same way as drugs and medical devices.

• ICT implants to enhance human capabilities should only be developed: to bring individuals into the “normal” range for the population, if they so wish and give their informed consent; or to improve health prospects such as enhancing the immune system. Their use should be based on need, rather than economic resources or social position.

• ICT implants or wearable computing devices must not: allow individuals to be located on a permanent and/or occasional basis, without the individual’s prior knowledge and consent; allow information to be changed remotely without the individual’s prior knowledge and consent; be used to support any kind of discrimination; be used to manipulate mental functions or change personal identity, memory, self-perception, perception of others; be used to enhance capabilities in order to dominate others, or enable remote control over the will of other people.

• ICT implants should not be developed to influence future generations, either biologically or culturally.

• ICT implants should be developed to be removed easily.

eHealth and genetics

Personal health data must be treated as ‘sensitive personal data’. ICT researchers using it have a duty of confidentiality equivalent to the professional duty of medical secrecy. Therefore:

• The use of personal health data in ICT research for the purposes from which society as a whole benefits must be justified in the context of the personal rights.

• The security of ICT in healthcare is an ethical imperative to ensure the respect for human rights and freedoms of the individual, in particular the confidentiality of data and the reliability of ICT systems used in medical care.

• Proposers should be particularly aware when ICT is linked to sensitive medical areas such as the use of genetic material.

• Proposers should access established general medical and genetics ethical guidance when formulating their proposals.

ICT and Bio/Nano-electronics

ICT-bio/nano-electronics has a strong potential for misuse. Consequently, proposers should pay particular attention to the guidelines in Section 2 in this area. Researchers involved in ICT-bio/nano-electronics research proposals should be aware that certain applications, e.g. miniaturised sensors, may have specific implications for the protection of privacy and personal data.

ICT-bio/nano-electronics research may overlap with other scientific disciplines such as biology. In these situations proposers should draw upon the ethical guidance of that discipline.

The European Group on Ethics in Science and New Technologies (EGE)6 is an independent, pluralist and multidisciplinary body advising the European Commission on ethics in science and new technologies in connection with Community legislation or policies. A recent opinion, n°28 of 20/05/2014, addresses the Ethics of Security and Surveillance Technologies7.

The EGE’s community of experts underlined the necessity of reflecting on the tension between security and freedom, and of nuancing the supposed “need or right of security”:

“Security and freedom: do we need both? And can we enjoy both without the pursuit of one

jeopardising the other? These are two central questions addressed by the Opinion. The Opinion

challenges the notion that 'security' and 'freedom' can be traded against one another. While a

balance must be struck between competing values when they come into conflict, certain core

6 http://ec.europa.eu/bepa/european-group-ethics/welcome/index_en.htm

7 http://ec.europa.eu/bepa/european-group-ethics/publications/opinions/index_en.htm

IPATCH D4.3.

14

principles, such as human dignity, cannot be bartered with. The Opinion calls for a more

nuanced approach, in which the proportionality and effectiveness of security and surveillance

technologies are subject to rigorous assessment, and in which rights are prioritized rather than

traded.

At its core, the Opinion contends that an ethical foundation for the use of security and

surveillance technologies requires a broader understanding of the security concept,

encompassing the human and societal dimensions of security. Security is not simply protection

from physical harm, but a means to enable individual and collective flourishing. The Opinion

highlights the adverse consequences at stake when security becomes an end in its own right,

noting that excessive surveillance in the pursuit of security erodes trust, social cohesion,

solidarity and intellectual freedom.”

The EGE also proposes 16 recommendations based on its analysis:

“recommendations to improve the application and oversight of technologies with a security

function (judicial oversight; a common European understanding of national security; and an EU

regulatory framework governing the use of drones)”;

“recommendations targeting the use of surveillance technologies (including the call for an EU

code of conduct for big data analytics; greater transparency in the use of algorithms; and closer

scrutiny of EU border surveillance systems)”;

“recommendations regarding measures designed to re-build trust and improve citizens' control

over the management of their data and privacy (including improved data protection

enforcement, protection for whistle-blowers and measures to improve education and

awareness among the public and practitioners)”.

In the same Opinion8, they also underline the following core ethical concepts: 1) privacy and freedom; 2) autonomy and responsibility; 3) well-being and/or human flourishing; 4) justice. In addition, they promote two procedural principles to enable trust: 1) transparency; 2) efficacy and proportionality.

2.4 A possible tool: Ethical Impact Assessment (EIA)

An impact assessment “gives decision-makers evidence regarding the need for EU action and the

advantages and disadvantages of alternative policy choices9”. Since 30 years, scholars, experts and

policy-makers have developed and tested several types of impact assessment, including environmental

impact assessments, risk assessment, technology assessment, privacy impact assessment, etc. The

majority of these scholars share the idea that it is necessary to involve stakeholders in the assessment

process.

As Venier et al. say in the PRESCIENT10 report: “With the expression “ethical impact assessment” we

refer to an instrument, which is currently usually conceived as a framework for examining the ethical

8 European Group on Ethics in Science and New Technologies to the European Commission (EGE), “Ethics of Security and Surveillance Technologies. Opinion n°28”, Brussels, May 2014, p. 71.

9 http://ec.europa.eu/smart-regulation/impact/index_en.htm

10 Venier, Silvia, Emilio Mordini, Michael Friedewald, Philip Schütz, Dara Hallinan, David Wright, Rachel L. Finn, Serge Gutwirth, Raphaël Gellert, and Bruno Turnheim, "Final Report – A Privacy and Ethical Impact Assessment Framework for Emerging Sciences and Technologies", Deliverable 4, PRESCIENT Project, 25 March 2013, p. 75. http://www.prescient-project.eu


implications of new technologies, which should aim at (a) identifying, and (b) addressing current or

emerging ethical issues arising from the development (research and development stage) and

deployment (application stage) of new technologies, particularly in the field of ICT”.

This tool helps in understanding the socio-ethical impacts of countermeasures, but it is clearly not enough in terms of ethical analysis. While it allows a field of ethical considerations to be defined, it cannot be used everywhere and all the time, disconnected from context; and above all, it does not by itself confer any ethical legitimacy. As Wright underlines: “thus, an ethical impact assessment must not only focus on the ethics of a technology, but on the technology itself, its values, how it is perceived and how it is used or might be used in the future, not only by itself but as a component in a larger technological framework11”.

Another interesting resource for coping with the ethical issues of ICT and surveillance is David Wright’s12 paper, “A framework for the ethical impact assessment of information technology”. The expression “ethical impact assessment” refers to a tool, generally conceived as a framework for examining the ethical implications of new technologies, which should (1) identify, and (2) address current or emerging ethical issues arising from the development (research and development stage) and deployment (application stage) of new technologies, particularly in the field of ICT and surveillance technologies. This anticipatory methodology, which aims at foreseeing ethical issues at a very early stage of technology development, is supposed to incorporate ethical reflection into the process of research and innovation. It is also a participatory methodology, because it is supposed to involve relevant stakeholders during the assessment process. David Wright bases his analysis on the four principles of Beauchamp and Childress, with a separate section on privacy and data protection. The paper identifies significant ethical and social values and provides a set of questions relating to these issues. Finally, it presents some ethical tools and procedural practices that could be used as part of an ethical impact assessment.

Besides this paper, in the context of the PRESCIENT project13, David Wright proposed a framework for a social and ethical impact assessment. Below is a (non-exhaustive) list of ethical questions that could usefully be addressed to ICT technologies; David Wright insists that these issues are indicative and must be adapted to each context. A sketch of how such a checklist might be recorded in practice follows the list.

Autonomy (being let alone) – Does the project or new technology subject the individual or groups to

surveillance (listening, watching, tracking, detecting)?

Dignity – Does the project or technology intrude upon the individual’s dignity, as body scanning, fingerprinting or electronic tagging arguably do?

Profiling and social sorting – Does the project, technology, application or service sort individuals into

groups according to some predetermined profile? Are some groups advantaged or disadvantaged as a result

of the sorting?

Informed consent – Have individuals freely given their explicit informed consent to being tracked or

targeted?

11 Wright, D., “A framework for the ethical impact assessment of information technology”, Ethics and Information Technology (2011) 13: 204.

12 From Trilateral Research & Consulting : http://trilateralresearch.com/

13 http://www.prescient-project.eu/prescient/index.php , Deliverable 4, Final Report — A Privacy and Ethical Impact Assessment Framework for Emerging Sciences and Technologies, 2013, p. 97.


Freedom of speech and association – Does the technology “chill” freedom of speech and association (e.g.,

are “smart” CCTV cameras installed in public places able to eavesdrop on conversations)?

Trust – Will the technology or project impact trust and/or social cohesion? Will groups or individuals

believe they are not trusted by others, especially those who are in a stronger position of power?

Asymmetries of power – Will the project or technology enhance the power of some at the expense of

others?

Security – Is a new technology or project being introduced to improve security (and whose security is

actually being improved)? Will a perceived increase in security take precedence over other values such as

privacy? How can we know if the claims of the security proponents are valid? Who determines if security

should take precedence?

Unintended consequences – Does the project or technology have some consequences other than the purpose

for which it is being deployed?

Alternatives – Are there alternatives to the project or technology which are less intrusive upon an

individual’s rights or the impacts on society?
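The following Python sketch is a minimal, purely illustrative way of recording such a checklist; the names are hypothetical and do not come from Wright's paper or the PRESCIENT deliverable. In keeping with the argument of this deliverable, each answer is a narrative discussion tied to one technology in one context, never a numeric score.

```python
from dataclasses import dataclass, field

# Illustrative sketch only: Wright's indicative EIA questions recorded as a
# qualitative checklist. All field names are hypothetical.
EIA_QUESTIONS = {
    "autonomy": "Does the project or new technology subject the individual "
                "or groups to surveillance?",
    "dignity": "Does the project or technology intrude upon the individual's dignity?",
    "profiling": "Does it sort individuals into groups according to a "
                 "predetermined profile, advantaging or disadvantaging some?",
    "informed_consent": "Have individuals freely given explicit informed consent?",
    "freedom": "Does the technology 'chill' freedom of speech and association?",
    "trust": "Will it impact trust and/or social cohesion?",
    "power": "Will it enhance the power of some at the expense of others?",
    "security": "Whose security is actually being improved, and who decides?",
    "unintended": "Does it have consequences beyond its stated purpose?",
    "alternatives": "Are there less intrusive alternatives?",
}

@dataclass
class EIARecord:
    """One qualitative assessment of one technology in one specific context."""
    technology: str
    context: str                          # e.g. "merchant vessel in transit"
    answers: dict[str, str] = field(default_factory=dict)

    def answer(self, issue: str, discussion: str) -> None:
        if issue not in EIA_QUESTIONS:
            raise KeyError(f"unknown EIA issue: {issue}")
        self.answers[issue] = discussion  # a narrative answer, not a score

    def open_issues(self) -> list[str]:
        """Questions not yet deliberated; an EIA is complete only when every
        issue has been discussed with the relevant stakeholders."""
        return [q for q in EIA_QUESTIONS if q not in self.answers]

record = EIARecord(technology="CCTV", context="merchant vessel in transit")
record.answer("autonomy", "Crew members are continuously filmed while on watch; "
                          "monitoring seems proportionate to the piracy threat "
                          "but requires clear information to the crew.")
print(record.open_issues())  # the nine issues still to be deliberated
```

Recording free-text answers rather than numbers deliberately mirrors the position, argued in section 1.2, that ethical and social acceptance cannot be quantitatively scored.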


3 Ethical analysis

The ethical assessment should not only focus on countermeasures but must take into account the particular context of maritime piracy, which is extremely sensitive because lives are at stake, both for the crew and for the pirates. The ethical question should rather be formulated thus: how can one speak as an ethicist in a case of armed conflict? Is an ethics in time of war possible? It would seem that ethics is not applicable to wars or violent contexts.

It is nevertheless true that in the old philosophical tradition there is a corpus on bellum iustum, the doctrine of “just war theory”, which justifies war morally through a series of criteria split into two groups: “the right to go to war” (jus ad bellum) and “right conduct in war” (jus in bello). The first concerns the morality of going to war, the second moral conduct within war. The ethical demand in the IPATCH case could be understood as a contemporary version of jus in bello, the “right conduct in war”.

This is not our standpoint: for us, it is impossible to give an ethical justification of any violent situation or any armed conflict. This means that the attempt at an ethical impact assessment should not be understood as a kind of justification.

In any case, piracy is an act of illegitimate violence. Let us take a step back and describe it in terms other than legal ones. This violence is, firstly, quite “extraordinary” because it breaks the routine of the cargo ship’s journey. Although pirate attacks are becoming more frequent, they are not systematic in the way violence is in time of war; the attacks are unexpected (even if they can be predicted). The violence of the pirate attack can also be described as “ordinary” because it is not justified by any ideological considerations: pirate attacks have no justification other than armed robbery.

This means that we cannot avoid thinking about this particular type of violence in the evaluation of countermeasures. It will therefore be quite difficult to produce a “neutral” ethical impact assessment of the countermeasures, as if they were detached from any context. By context, we mean firstly the specificity of the context of production (upstream) and secondly the critical implications for individuals and society (downstream). The countermeasures, as technologies, are neither neutral nor value-free, either upstream or downstream. As Hofman says: “technology expresses and influences the norms and values of its social context14”.

3.1 Is an ethical impact assessment of the IPATCH countermeasures

possible?

According to the Description of Work, one of the aims of D4.3 is “a quantitative and qualitative scoring of the countermeasures in order to compare the levels of ethical & social acceptance”. Looking back at the ethical tradition, a quantitative assessment, such as a quantitative risk assessment, is clearly impossible: it is not possible to produce a quantitative scoring of the countermeasures. However, we can attempt a qualitative and critical ethical impact assessment, based on the ethical frameworks described above.

First, it is very important to underline again the difficulty of assessing the countermeasures detached from any particular context. For example, a water cannon may appear totally harmless until a pirate attack, when it is set off and becomes potentially harmful. This means that no technological artefact or countermeasure should be assessed outside of its context and, as a corollary, that the attempted ethical impact assessment will provide no universality.

14 Hofman, B., “On value-judgments and ethics in health technology assessment”, Poiesis and Praxis, 3(4), p. 289.

Providing an ethical assessment as a moral justification of armed conflict is thus clearly not acceptable. Conversely, a simple ethical veto on every situation of armed violence is not responsible from the ethicist’s point of view. The proposition below is only a sketch of an EIA, because a full EIA cannot be achieved here, for two main reasons:

No public deliberation is possible. We cannot imagine running focus groups with all the stakeholders: gathering captains, crew members and pirates to discuss the values and norms embedded in the countermeasures is completely unrealistic. The deep openness of the EIA process is impossible here, which restricts the process and therefore the potential outcomes of the EIA.

The context of piracy, as a violent context of armed conflict, paralyses the dynamic of the EIA, because it creates a sort of state of exception (Agamben, 2005), or state of emergency. In that book, Agamben investigates the increase in power that governments claim in so-called times of crisis. Within a state of emergency, Agamben coined the term “state of exception” to mean a possible diminution and abandonment of constitutional rights in the process of a government claiming this extension of power. “In every case, the state of exception marks a threshold at which logic and praxis blur with each other and a pure violence without logos claims to realize an enunciation without any real reference15".

The state of exception transforms the conditions of possibility of any ethical analysis. Such analysis can then only be turned towards values such as dignity and non-maleficence, because the possibility of conflict neutralises the other ethical principles. Of course, values such as integrity, autonomy, liberty and privacy remain crucial and meaningful, but they cannot be respected if people’s lives are not considered liveable (Butler, 2012).

3.2 Ethical impact assessment adapted to IPATCH

In order to sketch an EIA of the countermeasures, it is useful to classify them by level of ethical acceptability. The criteria used are the main ethical principles described above, in particular those associated with the preservation of human life and human dignity (because of the constant possibility of armed conflict). Above all, we underline the principle of human dignity: the human being can never be treated as a mere means to other ends, but must be respected in his or her inherent dignity as an end in itself. It also means that all other principles and balancing tests must be performed with this fundamental value in mind. This is close to the “bodily integrity principle”, described by Martha Nussbaum as a capability: “Being able to move freely from place to place; to be secure against violent assault, (…)”. Second, the principle of non-maleficence is at stake because of the high possibility of lethality due to the countermeasures.

15 Agamben, G. (2005) “State of Exception”, The University of Chicago Press, p. 40.


1. Non-maleficence (avoiding harm)16

Beauchamp and Childress say that ‘‘The principle of non-maleficence asserts an obligation not to inflict harm on others’’ and that ‘‘Non-maleficence only requires intentionally refraining from actions that cause harm. Rules of non-maleficence, therefore, take the form of ‘Do not do X’.’’

2. Dignity

Dignity is a key value, as evidenced by its being the subject of Article 1 (‘‘Human dignity is inviolable. It must be respected and protected.’’) of the Charter of Fundamental Rights, as well as of Article 25, which specifically refers to the rights of the elderly (‘‘The Union recognises and respects the rights of the elderly to lead a life of dignity and independence and to participate in social and cultural life.’’).

Dignity also features in Article 1 of the UN’s Universal Declaration of Human Rights, which states that

‘‘All human beings are born free and equal in dignity and rights.’’ Article 1 of the Charter of

Fundamental Rights provides that dignity is to be not only ‘‘respected’’, but also ‘‘protected.’’ This

means that public authorities are required not only to refrain from tampering or interfering with an

individual’s private sphere, but also to take steps in order to bring about the conditions allowing

individuals to live with dignity.

Following mainly these two ethical principles, a tentative categorisation of the countermeasures can be proposed. The countermeasures referred to here are extracted from the list in D3.1. Table 1 shows the countermeasures and structures them under four categories (the same as those used in D4.2 “Legal Analysis”).

Table 1: Classification of the countermeasures

Monitoring: CCTV; night vision optics; navigation lights; additional lookouts; search lights; weather deck lighting; SSAS activated; inform authorities; alarm raised; distress message.

Non-lethal: safe muster point; citadel; evasive manoeuvres; secure doors; deny use of tools; increased speed; steel bars; block external ladders; razor wire.

Less than lethal: warning shots; acoustic devices; ballast pumps; water cannons; fire hoses and foam monitors; electrified barriers; warning flares; water spray rails; unarmed security guards; lasers.

Lethal: armed security guards; security vessels; military team; group transit.

16 Wright, D., “A framework for the ethical impact assessment of information technology”, Ethics and Information Technology (2011) 13: 204-206.


1. Monitoring:

This category designates the countermeasures used only to prevent the arrival of pirates. These countermeasures are not harmful to any of the people involved in a possible conflict situation (crew and pirates); they fully respect the key principles of non-maleficence and dignity. They are designed for avoidance and are focused on warning and prevention before the attack.

Moreover, these countermeasures can also be described as surveillance technologies, because their function is to monitor the ship constantly against pirate attack. It is therefore important to keep in mind that surveillance technologies are also subject to several ethical critiques, especially regarding privacy and data protection (see D2.2), informed consent, employee monitoring, etc. (see section 2.3). In the context of privacy, they seem proportional to the potential threats.

2. Non-lethal

This category designates the harmless countermeasures used to repel pirates. They protect the ship and the crew by reinforcing them materially. They are not harmful, just protective: nothing aggressive can happen through using or having these countermeasures.

3. Less than lethal

This category encompasses the countermeasures that presuppose the possibility of a conflict but with a low level of belligerent response. These countermeasures are protective and defensive, with a low level of harmfulness towards people (crew and pirates). However, because they are defensive, their use means that a situation of violence is occurring, and thus that the values of human dignity and non-maleficence are compromised.

4. Lethal

This last category carries a higher level of belligerent response, because of the higher possibility of harm when these countermeasures are used during a pirate attack. Even if they are not lethal at first glance, they can quickly become lethal when the situation of conflict gets out of control.

The dignity and non-maleficence principles are not respected here, because the war/armed-conflict situation places all the people involved (crew and pirates) in a state of exception, where moral norms and values are usually put into brackets. Invoking states of exception because of violent confrontations and conflicts is a traditional response that justifies violence and a lack of proportionality between people. We must be very careful with this framework of thinking.
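To make the typology concrete, here is a purely illustrative Python sketch (hypothetical names throughout; not part of any IPATCH specification) encoding the four categories of Table 1 and the preference order recommended in this deliverable: monitoring and non-lethal measures first, then less-than-lethal, with lethal measures discouraged.

```python
from enum import IntEnum

# Hypothetical encoding of the Table 1 typology. The numeric ordering expresses
# the deliverable's recommendation: prefer lower categories, discourage "lethal".
class Category(IntEnum):
    MONITORING = 0        # preventive; fully respects non-maleficence and dignity
    NON_LETHAL = 1        # protective reinforcement of ship and crew
    LESS_THAN_LETHAL = 2  # defensive, with a low level of harm
    LETHAL = 3            # discouraged: high possibility of harm

# A few entries from Table 1 (abridged).
CLASSIFICATION = {
    "CCTV": Category.MONITORING,
    "Citadel": Category.NON_LETHAL,
    "Water cannons": Category.LESS_THAN_LETHAL,
    "Armed security guards": Category.LETHAL,
}

def preference_order(measures: list[str]) -> list[str]:
    """Order available countermeasures from most to least ethically
    acceptable, following the recommendation in this deliverable."""
    return sorted(measures, key=lambda m: CLASSIFICATION[m])

print(preference_order(["Armed security guards", "Water cannons", "Citadel"]))
# ['Citadel', 'Water cannons', 'Armed security guards']
```

Such an ordering is only a heuristic aid: as argued throughout this deliverable, the ethical responsibility for selecting and using a countermeasure lies with the crew, not with the artefact or the software.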


4 Conclusion and recommendations

We think that at least two dilemmas run through the IPATCH case. A dilemma reflects the moral ambivalence and complexity of some mundane situations; it is defined as a conflict between two (or more) moral imperatives. Thinking in dilemmas is part of practical ethics. An interesting follow-up to the EIA would definitely be to open the reflection to the philosophical tradition. That is beyond the scope of this deliverable, but we can mention a few relevant paths for thinking further about the ethical dilemmas embedded in the IPATCH project.

The first dilemma has already been formulated above: can we use the word “ethics” in the context of armed conflict? Are all weapons of equal worth? Are some more reprehensible than others, or others more ethical? The two terms are contradictory: just as there cannot be any “just” war, there can be no weapons, as objects of killing, that satisfy the ethical requirement of “You shall not murder”. This is a very ancient dilemma that needs to be renewed and rethought in our post-9/11 era. Assessing non-lethal countermeasures ethically will not render them ineffective. There is a need to qualify the degrees of violence produced by the countermeasures, even if they are supposed to be non-lethal. Non-lethality is a supposition: in reality, the countermeasures (except the monitoring ones) could be lethal, because they are a response to an attack, and the situation can always degenerate.

Judith Butler’s recent work leads to the second dilemma. In November 2012, she gave a lecture on the occasion of the Adorno Prize: “Can One Lead a Good Life in a Bad Life?”17. Here the same dilemma is expressed, but with more focus on the lived experience of the individuals involved in a situation of maritime piracy (for the crew as well as for the pirates). How can one lead a good life when mundane life has been structured by inequalities from the beginning?

In conclusion, we underline first that the IPATCH case is a good example of the limits of the Ethical, Legal and Social Issues (ELSI) approach in research projects. The objective of this deliverable, according to the DoW, is “a quantitative and qualitative scoring of the countermeasures in order to compare the levels of ethical & social acceptance”. This objective is impossible to fulfil, due to the nature of ethics on the one hand, and to the context of armed conflict on the other. The nature of ethics is not to evaluate but to provide critical normativity, grounded in philosophical tradition and based on specific contexts. Evaluating each countermeasure with an ethical grid is highly risky, because it opens the door to the standpoint of “right conduct in war”. We showed above why a moral justification of armed conflict is unacceptable. Therefore, we have to consider this aloofness as an ethical posture in IPATCH.

However, we have proposed a typology that classifies the countermeasures, taking account of the specific context. We classified the countermeasures into four categories: monitoring, non-lethal, less than lethal and lethal. Our recommendation is to encourage the use of the non-lethal countermeasures first, then the less than lethal; and to discourage the use of the lethal ones. In saying this, we want to underline again that ethical responsibilities are not borne by the countermeasures (i.e. technological artefacts) but by the crew using them. Situations of armed conflict are psychologically hard to cope with, and this should be the main ethical concern of the IPATCH case.

17 http://www.egs.edu/faculty/judith-butler/articles/can-one-lead-a-good-life-in-a-bad-life/


5 References

Agamben, G., State of Exception, The University of Chicago Press, 2005.

Balibar, E., Violence and Civility: On the Limits of Political Anthropology, 2009.

Beauchamp, T.L. and Childress, J.F., Principles of Biomedical Ethics, 6th Edition, Oxford: Oxford University Press, 2008.

Butler, J., Frames of War: When Is Life Grievable?, 2009.

Hofman, B., “On value-judgments and ethics in health technology assessment”, Poiesis and Praxis, 3(4).

Lewer, N. and Schofield, S., Non-Lethal Weapons: A Fatal Attraction?, Zed Books Ltd, London, 1997.

Luppicini, R. and Adell, R. (eds), Handbook of Research on Technoethics, IGI Global, 2009.

Wright, D., “A framework for the ethical impact assessment of information technology”, Ethics and Information Technology (2011) 13: 199–226.

Venier, Silvia, Emilio Mordini, Michael Friedewald, Philip Schütz, Dara Hallinan, David Wright, Rachel L. Finn, Serge Gutwirth, Raphaël Gellert, and Bruno Turnheim, "Final Report – A Privacy and Ethical Impact Assessment Framework for Emerging Sciences and Technologies", Deliverable 4, PRESCIENT Project, 25 March 2013. http://www.prescient-project.eu

European Group on Ethics in Science and New Technologies to the European Commission (EGE), “Ethics of Security and Surveillance Technologies. Opinion n°28”, Brussels, May 2014. http://ec.europa.eu/bepa/european-group-ethics/publications/opinions/index_en.htm

http://ec.europa.eu/smart-regulation/impact/index_en.htm

http://cordis.europa.eu/fp7/ethics-ict_en.html

http://ec.europa.eu/bepa/european-group-ethics/welcome/index_en.htm