Engineering Ethical Multiagent Systems


Transcript of Engineering Ethical Multiagent Systems

Page 1: Engineering Ethical Multiagent Systems

Engineering Ethical Multiagent Systems

Munindar P. Singh, [email protected]

https://www.csc.ncsu.edu/faculty/mpsingh/

(Work with Nirav Ajmeri and Amit Chopra)

(With help from Hui Guo and Pradeep Murukannaiah)

Department of Computer Science, North Carolina State University

Page 2: Engineering Ethical Multiagent Systems

Ethics in Multiagent Systems

Ethics is an inherently multiagent concern, yet current approaches focus on single agents

Diagram: agent aspects (dilemmas, virtue, decision making, living in a society) and societal aspects (interactions, specifying standards, verifying outcomes, learning).

Page 3: Engineering Ethical Multiagent Systems

Ethical Dilemmas: No Good Choices

Contrast the following examples

Diagram: examples of ethical dilemmas: falsify preexisting conditions; speeding to hospital; Les Misérables; trolley problem.

Page 4: Engineering Ethical Multiagent Systems

Elements of Ethical Systems

Diagram: elements of ethical systems:

- Dilemmas: minimize dilemmas; support resolution
- Accountability: explanations; not traceability; not punishment
- Values: diverse; conflicting preferences
- Outcomes: minimal disparity; transparency
- Dynamism: contextual; adaptive

Page 5: Engineering Ethical Multiagent Systems

Fairness of a Central Technical Entity

Today's view of fairness involves how an agent deals with people, such as a prediction algorithm or an autonomous vehicle

Diagram: a single software entity facing several people.

- Autonomy is automation: complexity and intelligence
- Dilemmas à la trolley problems approached in an atomistic manner

Page 6: Engineering Ethical Multiagent Systems

Fairness of a Social Entity Equipped with Software

A social entity, assisted by software, wields power over people; ethical concerns focus on the social entity

Diagram: an owner (the social entity) operating software that faces several people.

- Autonomy as a social construct; mirror of accountability
- Accountability rests with the social entity
- Powers and how they are exercised

Page 7: Engineering Ethical Multiagent Systems

Ethics in Society

Ethical considerations and accountability arise in how social entities interact

Diagram: several stakeholders, each with its own software, interacting as a society.

- The society itself is modeled
- Autonomy is in reference to a society
- Introduces a context to the decision making

Page 8: Engineering Ethical Multiagent Systems

Societal Model of Ethics

Rawls: "political not metaphysical"

- Ethics is a cousin of governance
- An ethical society is one that produces ethical outcomes for its members
- Rawls' difference principle: reduce the difference in outcomes between best and worst
  - Termed maximin in economic terms; a code sketch follows this list
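The difference principle has a direct computational reading. Below is a minimal sketch of maximin selection in Python; the policies, members, and utilities are illustrative assumptions, not data from the talk.

```python
# Minimal sketch of the difference principle as maximin selection.
# outcomes[policy][member] gives the utility each member of the society
# would receive under a candidate policy (hypothetical data).

def maximin_choice(outcomes):
    """Pick the policy whose worst-off member fares best."""
    return max(outcomes, key=lambda policy: min(outcomes[policy].values()))

outcomes = {
    "A": {"alice": 9.0, "bob": 1.0},  # higher average, larger disparity
    "B": {"alice": 5.0, "bob": 4.0},  # lower average, smaller disparity
}
print(maximin_choice(outcomes))  # -> B: maximin favors the smaller disparity
```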

Page 9: Engineering Ethical Multiagent Systems

Ethics in Society with SIPAs

SIPA: socially intelligent (personal) agent

Diagram: stakeholders, each represented by a SIPA; the SIPAs interact on their stakeholders' behalf.

- A multiagent system is a microsociety
- Each agent reflects the autonomy of its (primary) stakeholder
- How can we realize a multiagent system based on the value preferences of its stakeholders?

Page 10: Engineering Ethical Multiagent Systems

Sociotechnical Systems

Current AI research: atomistic, single-agent decision making focused on ethical dilemmas
Current social sciences research: not computational in outlook

Diagram: a sociotechnical system in two tiers. Social tier: stakeholders identify requirements and specify value preferences, yielding norms, assumptions, mechanisms, and metrics that regulate the principals. Technical tier: agents represent the principals, interact and communicate with one another, and are realized in data and devices.

Page 11: Engineering Ethical Multiagent Systems

Sociotechnical Systems (STS): A Computational Norm-Based System

Context of interaction in which principals are represented by agents

- Principal: human or organization, a stakeholder who acts
- Norm: directed social expectation between principals (a representation sketch follows this list)
  - Types: commitment, prohibition, authorization, power, ...
  - Standards of correctness
    - Prima facie, satisfaction is ethically desirable and violation undesirable
- Accountability: the power of a principal to call another to account for its actions
  - Derives from norms
  - Provides an opportunity for principals to explain their actions
    - Leading to prima facie judgments being reconsidered
  - Is not traceability, which is merely a supporting mechanism
  - Is not blame and sanction, which are subsequent
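To make the norm types concrete, here is a minimal sketch of a directed norm between principals; the class and field names (debtor, creditor, and so on) are our assumptions, not a formal model from the talk.

```python
# Sketch of a directed norm between principals; names are illustrative.
from dataclasses import dataclass
from enum import Enum, auto

class NormType(Enum):
    COMMITMENT = auto()
    PROHIBITION = auto()
    AUTHORIZATION = auto()
    POWER = auto()

@dataclass
class Norm:
    kind: NormType
    debtor: str      # principal from whom the expectation is directed
    creditor: str    # principal who holds the expectation
    antecedent: str  # condition under which the norm is in force
    consequent: str  # what satisfying the norm requires

# For instance, the commitment in the example on the next page:
commitment = Norm(NormType.COMMITMENT, debtor="Frank", creditor="Grace",
                  antecedent="true", consequent="share location")
```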

Page 12: Engineering Ethical Multiagent Systems

Example: Information Sharing

Frank: committed to his mother Grace to share his location; visits aunt Hope in NYC

Diagram: Frank prefers pleasure and recognition; Grace prefers Frank's safety; Hope prefers privacy.

Frank's dilemma: which sharing policy to select

- Share with all: pleasure for Frank ⇑
- Share only with Grace: safety for Grace ⇑
- Share with no one: privacy for Hope ⇑

Page 13: Engineering Ethical Multiagent Systems

Ethical Dilemmas in STS Terms

Dilemma: when there are no good choices
Ethical dilemma: a dilemma involving values

Diagram: ethical dilemmas arise as conflicts of value to value, norm to value, norm to norm, and plan to plan.

Page 14: Engineering Ethical Multiagent Systems

Ethical STS: An Objective for Governance

An STS S is ethical at time t for value preferences V if and only if S's outcomes align with V at t

- Relativist: value preferences provide the frame of reference
- Omits norms; only value preferences matter
  - Norms are crucial only for operationalization
- Dynamic: an STS may become ethical (unethical) due to responsive (unresponsive) governance

Page 15: Engineering Ethical Multiagent Systems

Ethics in the Large: Values and Outcomes

Emphasizes social abstractions; deemphasizes internal decision making

- Is an STS ethical?
  - Unethical systems make it difficult for principals to make ethical decisions
- Norms operationalize the ethics
  - Implement the "political" and sidestep some of the "metaphysical"
  - Reduce the complexity of individual decision making
- Accountability is conducive to innovation
  - Explanations provide a basis for reconsidering the norms

Page 16: Engineering Ethical Multiagent Systems

Ethics in the Large: Accountability and Adaptivity

An ethical STS presupposes good governance

An adaptive methodology undertaken by stakeholders of an STS:

- Identify each stakeholder's value preferences
- Specify the norms that support those value preferences
  - Norms are operational refinements of value preferences
  - Norms make accountability concrete
- A stakeholder's SIPA
  - Adopts one or more roles
  - Carries out its part of an enactment
  - Evaluates outcomes on its (primary and secondary) stakeholders
    - Whether values are promoted in alignment with the preferences
    - Which norms are satisfied
- Iterate

Page 17: Engineering Ethical Multiagent Systems

Methodology and Tools for Ethical Multiagent Systems

A blend of software engineering, data science, political science, philosophy, and economics

- How can we effectively elicit value preferences from stakeholders?
- How can we identify norms to operationalize those values?
- How can we support effective participation of stakeholders?
  - How may we accommodate their conflicting value preferences?
- How can we evaluate outcomes and revisit the norms to improve the alignment of outcomes and value preferences?

Page 18: Engineering Ethical Multiagent Systems

Architecture of a SIPA

What must a SIPA represent and reason about to participate ethically in a multiagent system?
A SIPA's decision making takes into account its stakeholders, primary and secondary

Diagram: three models feed a group decision module, which yields an ethically appropriate action:

World Model: context, actions
Social Model: norms, sanctions
Stakeholder Model: goals, values
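A minimal sketch of these components as data structures; the class and field names are our assumptions, since the talk specifies the models rather than an API.

```python
# Sketch of the SIPA architecture; names are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class WorldModel:        # the SIPA's view of its environment
    context: dict        # e.g., place, activity, companions
    actions: list        # actions available in this context

@dataclass
class SocialModel:       # the SIPA's view of its microsociety
    norms: list          # norms in force among the principals
    sanctions: list      # sanctions observed for norm violations

@dataclass
class StakeholderModel:  # primary and secondary stakeholders
    goals: list
    values: dict         # value preferences, e.g., {"privacy": 3, "pleasure": 1}

def group_decision(world, social, stakeholders):
    """Combine the three models to pick an ethically appropriate action (stub)."""
    ...
```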

Page 19: Engineering Ethical Multiagent Systems

Interaction in Yumbo

A SIPA's secondary stakeholders can change with the context

Diagram: the interaction loop:

1. Identify stakeholders
2. Elicit value preferences
3. Identify actions that satisfy goals
4. Rank alternative actions
5. Select the (ethical) action that maximizes social experience and fairness
6. Perform the action
7. Receive sanction
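One reading of this loop in code; a sketch under our own method names, not Yumbo's actual API.

```python
# Sketch of one Yumbo-style interaction step; all method names are assumed.
def interaction_step(sipa, context):
    stakeholders = sipa.identify_stakeholders(context)  # can change with context
    prefs = sipa.elicit_value_preferences(stakeholders)
    actions = sipa.actions_satisfying_goals(context)
    ranked = sipa.rank_alternatives(actions, prefs)     # e.g., via VIKOR (next page)
    chosen = ranked[0]  # action maximizing social experience and fairness
    sipa.perform(chosen)
    sipa.receive_sanction()                             # feedback for adaptation
```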

Page 20: Engineering Ethical Multiagent Systems

Choosing Ethical Action

Yumbo SIPAs adapt VIKOR to trade off group and individual experience

Diagram: the VIKOR-based selection pipeline:

1. Identify alternative actions
2. Elicit payoffs for the alternatives based on value preferences
3. Determine the best and worst payoff for each criterion across the alternatives
4. Compute the normalized Manhattan distance and rank the alternatives (S)
5. Compute the normalized Chebyshev distance and rank the alternatives (R)
6. Compute an aggregated rank (Q) by trading off S and R
7. Choose the action based on min(Q)
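A compact sketch of this pipeline in Python. It follows the standard VIKOR formulation with equal weight on S and R (v = 0.5), which matches the worked example on Page 24; it is not the authors' code.

```python
# Sketch of VIKOR ranking. payoffs[y][x] is alternative y's payoff on
# criterion x; weights[x] weights criterion x; higher payoffs are better.
def vikor(payoffs, weights, v=0.5):
    n_alts, n_crit = len(payoffs), len(weights)
    f_best = [max(p[x] for p in payoffs) for x in range(n_crit)]   # best per criterion
    f_worst = [min(p[x] for p in payoffs) for x in range(n_crit)]  # worst per criterion

    def regret(y, x):  # weighted, normalized shortfall from the best payoff
        span = f_best[x] - f_worst[x]
        return 0.0 if span == 0 else weights[x] * (f_best[x] - payoffs[y][x]) / span

    S = [sum(regret(y, x) for x in range(n_crit)) for y in range(n_alts)]  # Manhattan
    R = [max(regret(y, x) for x in range(n_crit)) for y in range(n_alts)]  # Chebyshev
    s0, s1, r0, r1 = min(S), max(S), min(R), max(R)
    Q = [v * (S[y] - s0) / ((s1 - s0) or 1) +
         (1 - v) * (R[y] - r0) / ((r1 - r0) or 1) for y in range(n_alts)]
    return S, R, Q  # choose the alternative with minimal Q
```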

Page 21: Engineering Ethical Multiagent Systems

Setting: Information Sharing

Places, companions, and sharing policies

Companions: friend, family, colleague, stranger
Sharing policies: share with all; share with common friends; share with companions; share with no one

Place                          Safety   Sensitivity
Attending graduation ceremony  Safe     ¬Sensitive
Presenting a conference paper  Safe     ¬Sensitive
Studying in a library          Safe     ¬Sensitive
Visiting an airport            Safe     ¬Sensitive
Hiking at night                ¬Safe    ¬Sensitive
Being stuck in a hurricane     ¬Safe    ¬Sensitive
Visiting a bar with fake ID    Safe?    Sensitive
Visiting a drug rehab center   Safe?    Sensitive

Page 22: Engineering Ethical Multiagent Systems

Evaluation: Crowdsourcing Study

Schnorff et al.'s privacy attitude survey: level of comfort in sharing personal information

Level of comfort in setting a context-sharing policy

- Context includes place, activity, and social relationship with companions
- Places provided by us, but not their safety and sensitivity ratings

Priming: based only on context, to prime the users
Survey: based on context and value preferences (pleasure, privacy, recognition, safety)

Participants: 58 students enrolled in a mixed graduate- and undergraduate-level computer science course

Chart: distribution of participants by privacy attitude (casual, conscientious, cautious).

Page 23: Engineering Ethical Multiagent Systems

Example Numeric Utility Matrix for a Stakeholder

Captures value preferences, one context per row
Describes the payoff resulting from applying the sharing policy in the specified place with the specified companion

Place        Companion    Policy    Pleasure   Privacy   Recognition   Security
Graduation   Family       All       1          0         1             0
Conference   Co-workers   None      0          1         0             0
Library      Friends      All       1          0         0             0
Airport      Friends      Common    0          1         0             0
Hiking       Alone        All       1          0         0             1
Hurricane    Family       All       1          0         0             1
Bar          Alone        None      0          2         0             0
Rehab        Friends      None      0          2         0             0

Page 24: Engineering Ethical Multiagent Systems

Multi-Criteria Decision Making

Example VIKOR calculations

Policy            Frank's values          Hope's values           Sy    Ry    Qy
alternative       Ple   Pri   Rec   Saf   Ple   Pri   Rec   Saf
y1: All           10    5     10    5     5     0     5     5     3.5   3.0   0.75
y2: Common        5     5     5     10    5     0     5     5     4.0   3.0   1.00
y3: Grace         0     5     0     0     5     15    5     5     3.0   1.0   0.00

Weight, wx        1     1     1     1     1     3     1     1
Max payoff, f*x   10    5     10    10    5     15    5     5
Min payoff, f−x   0     5     0     0     5     0     5     5

Here,

- Sy is the Manhattan distance, normalized to the maximum
- Ry is the Chebyshev distance, normalized to the maximum
- Qy is the average of the two, normalized to [0, 1]
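Feeding the table's payoffs and weights into the vikor sketch from Page 20 reproduces the Sy, Ry, and Qy columns.

```python
# Criteria: Frank's values (Ple, Pri, Rec, Saf) followed by Hope's, per the table.
payoffs = [
    [10, 5, 10, 5, 5, 0, 5, 5],   # y1: share with all
    [5, 5, 5, 10, 5, 0, 5, 5],    # y2: share with common friends
    [0, 5, 0, 0, 5, 15, 5, 5],    # y3: share only with Grace
]
weights = [1, 1, 1, 1, 1, 3, 1, 1]  # Hope's privacy weighted 3
S, R, Q = vikor(payoffs, weights)
print(S, R, Q)
# [3.5, 4.0, 3.0] [3.0, 3.0, 1.0] [0.75, 1.0, 0.0] -> min Q selects y3 (Grace)
```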

Page 25: Engineering Ethical Multiagent Systems

Measures of Ethicality

For each interaction, ...

Best individual experience: the maximum utility obtained across the SIPA's stakeholders during a single interaction

Worst individual experience: the minimum utility obtained across the SIPA's stakeholders during a single interaction

Social experience: the utility obtained by the society as a whole divided by the number of stakeholders

Fairness: the reciprocal of the difference between the best and worst individual experience
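A minimal sketch of the four measures for one interaction. The guard for equal best and worst experience is our assumption; the slide leaves that case unspecified.

```python
# utilities: the utility each of the SIPA's stakeholders obtains in one interaction.
def ethicality_measures(utilities):
    best, worst = max(utilities), min(utilities)
    social = sum(utilities) / len(utilities)  # per-stakeholder average
    # Reciprocal of the best-worst gap; unbounded when outcomes are equal.
    fairness = float("inf") if best == worst else 1.0 / (best - worst)
    return {"best": best, "worst": worst, "social": social, "fairness": fairness}

print(ethicality_measures([1.72, 0.77, 1.36]))  # illustrative utilities only
```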

Page 26: Engineering Ethical Multiagent Systems

Evaluation: Simulation

Study unit: a context-sharing SIPA

Decision-making strategies:

- SYumbo: policy based on VIKOR
- Sprimary: policy based on the primary stakeholder's preferences
- Sconservative: least privacy-violating sharing policy
- Smajority: most common sharing policy

Simulated societies:

- Mixed
- Cautious
- Conscientious
- Casual

Chart: privacy attitude distributions of the societies, along an axis from highly unconcerned (casual) to highly concerned (cautious), with conscientious in between.

Page 27: Engineering Ethical Multiagent Systems

Experiment: Society of Mixed Privacy Attitudes

Result: Yumbo improves fairness with some gain and no loss on the other metrics: social experience (shown), best individual experience, and worst individual experience

Plot: social experience versus timesteps (×100) for SYumbo, Sprimary, Sconservative, and Smajority.

Page 28: Engineering Ethical Multiagent Systems

Experiments: Three Societies of Majority Privacy Attitudes

Result: Yumbo yields superior social experience to the other decision-making strategies across three types of societies (shown) without hurting the other metrics (not shown)

Plots: social experience versus timesteps (×100) in cautious, conscientious, and casual societies.

Page 29: Engineering Ethical Multiagent Systems

Comparing Metrics for a Society of Mixed Privacy Attitudes

Strategy        Social   Best     Worst    Fairness
SYumbo          1.36*    1.72     0.77*    1.05*
Sprimary        1.29     1.79     0.58     0.83
Sconservative   1.11     1.72     0.47     0.80
Smajority       1.34     1.84*    0.57     0.78

An asterisk marks the winner on each metric (bold in the original)

Page 30: Engineering Ethical Multiagent Systems

Comparing Metrics for Societies with Majority Privacy Attitudes

                  Cautious                  Conscientious             Casual
Strategy    S.     B.     W.     F.     S.     B.     W.     F.     S.     B.     W.     F.
SYumbo      1.54   1.66   1.23*  2.27*  1.33*  1.53   0.87*  1.51*  1.24*  1.46   0.77*  1.45*
Spri.       1.51   1.77   1.08   1.46   1.25   1.59   0.68   1.10   1.13   1.47   0.58   1.13
Scons.      1.37   1.75   1.06   1.46   1.09   1.52   0.61   1.10   0.87   1.34   0.45   1.34
Smaj.       1.55*  1.86*  1.01   1.18   1.32   1.70*  0.58   0.89   1.18   1.53*  0.52   0.98

S. = social experience, B. = best, W. = worst, F. = fairness; an asterisk marks the winner (bold in the original)

Page 31: Engineering Ethical Multiagent Systems

Conclusions

- Ethics inherently involves looking beyond one's narrow interest
- Ethical considerations apply in mundane settings: anywhere agents of multiple stakeholders interact
- A multiagent understanding of ethics can provide a foundation for a science of security and privacy

Page 32: Engineering Ethical Multiagent Systems

Elements of Ethics: From Agents to Systems

                 Agent Level                     System Level
Scope            Individual                      Individual in society
Autonomy         Intelligence and complexity     Decision making in social relationships
Transparency     About data and algorithms       About norms and incentives
Bases of Trust   Construction and traceability   Norms and accountability
Fairness         Preset criteria: statistics     Reasoning about others' outcomes
Focus            Dilemmas for individuals        System properties

Page 33: Engineering Ethical Multiagent Systems

Thanks!

- Science of Security Lablet
- Laboratory for Analytic Sciences

http://www.csc.ncsu.edu/faculty/mpsingh/
https://research.csc.ncsu.edu/mas/