
1

Accountability in International Data Exchange

JOAN FEIGENBAUM
http://www.cs.yale.edu/homes/jf/

May 4, 2010; NSF Inco-Trust

2

Workshop Themes

• “This year’s focus is on international data exchange.”

• “… investigate the scientific foundations, design, and feasibility of a future international cyber architecture.”

• “… capable of enforcing diverse security and privacy policies.”

3

What is “data exchange”?

• Establishment of common goals, policies, and procedures

• Transmission of data
• Maintenance and updating of databases (and of goals, policies, and procedures)
• Use of data
• Auditing and compliance
• . . .

4

What is “cyber architecture”?

• Informally, an “architecture” is the general structure of a class of systems, not the detailed design of a single system in the class.

“Internet architecture” vs. the current Internet

• Distinct types of system components
• The actions that each type is intended to perform or forbidden from performing
• The resources available to each type
• The manner in which different types of components are intended to interact
• . . .

5

[Diagram: two systems, each with its own computation, storage, and policies, connected by communication.]

6

Discussion Point

• There is nothing unique about international data exchange.

• Nations can disagree about which policies are needed or about how (and how strictly) to enforce agreed-upon policies, but so can states, cities, companies, individuals, etc.

• The relevant computer-security concept is administrative domain, not nation.

7

Alice and Bob (and ONLY they) can read this email thread.

8

Alice and Bob (and ONLY they) can read this email thread.

• Run key-establishment protocol.
• Exchange encrypted email.
• Take care to protect the plaintext.

9

Alice and Bob (and ONLY they) can read this email thread.

• Run key-establishment protocol.
• Exchange encrypted email.
• Take care to protect the plaintext.

* Note that it is Alice and Bob themselves who are enforcing this policy.
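A minimal sketch of the first two bullets, assuming Python and the `cryptography` package (this is not part of the talk): an X25519 key agreement followed by authenticated encryption of one message. Authenticating the exchanged public keys, which a real deployment needs in order to rule out man-in-the-middle attacks, is omitted here.

```python
# Sketch only: unauthenticated Diffie-Hellman; a real system must verify
# that the public keys really belong to Alice and Bob.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Key establishment: each party generates a key pair and sends the public half.
alice_priv, bob_priv = X25519PrivateKey.generate(), X25519PrivateKey.generate()
alice_pub, bob_pub = alice_priv.public_key(), bob_priv.public_key()

def session_key(my_priv, peer_pub):
    # Derive a 256-bit message key from the shared Diffie-Hellman secret.
    shared = my_priv.exchange(peer_pub)
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"email-thread").derive(shared)

key = session_key(alice_priv, bob_pub)
assert key == session_key(bob_priv, alice_pub)   # both ends derive the same key

# Exchange encrypted email: Alice encrypts; only a holder of `key` can decrypt.
nonce = os.urandom(12)
ciphertext = AESGCM(key).encrypt(nonce, b"Hi Bob -- only we can read this.", None)
print(AESGCM(key).decrypt(nonce, ciphertext, None))
```

The third bullet (protecting the plaintext) is the part no protocol enforces: it is up to Alice and Bob not to leak the decrypted message.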

10

Law-enforcement officials (and ONLY they) may access the Alert Database.

11

Law-enforcement officials (and ONLY they) may access the Alert Database.

[Diagram: a Requester sends a request together with a law-enforcement credential (“Req.; L.E. Cred.”) to an Access Controller; the controller consults the Policy DB and the law-enforcement credential databases (L.E. Cred. DB1 … DBi) and, if the check succeeds, returns Data from the Alert DB.]
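A hedged sketch of the check the diagram suggests; the class and database names (AccessController, AlertDB, PolicyDB, the credential directories) are illustrative stand-ins, not an interface from the talk.

```python
# Sketch of the access-control path in the diagram above, with assumed names.
from dataclasses import dataclass

@dataclass
class Request:
    requester: str
    credential: str        # claimed law-enforcement credential
    query: str             # which Alert Database record is wanted

class AccessController:
    def __init__(self, alert_db, policy_db, credential_dbs):
        self.alert_db = alert_db              # the protected Alert Database
        self.policy_db = policy_db            # maps resources to required roles
        self.credential_dbs = credential_dbs  # L.E. credential directories DB1..DBi

    def is_law_enforcement(self, requester, credential):
        # Accept the credential if any trusted directory vouches for it.
        return any(db.get(requester) == credential for db in self.credential_dbs)

    def handle(self, req):
        # The Policy DB says which role may access the Alert Database.
        if self.policy_db.get("AlertDB") == "law-enforcement" and \
                self.is_law_enforcement(req.requester, req.credential):
            return self.alert_db.get(req.query)   # release the requested record
        return None                               # deny: policy not satisfied

# Illustrative use, with in-memory dictionaries standing in for the databases.
controller = AccessController(
    alert_db={"alert-42": "suspicious-activity report"},
    policy_db={"AlertDB": "law-enforcement"},
    credential_dbs=[{"officer_jones": "CRED-123"}],
)
print(controller.handle(Request("officer_jones", "CRED-123", "alert-42")))
```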

12

©2010, Disney. All rights reserved.

13

©2010, Disney. All rights reserved.

• DRM systems allow only authorized users to access the content and restrict the manner in which they can use it.

• Under the Fair-Use provisions of copyright law, certain categories of uses do not require authorization by the rights holder.

14

©2010, Disney. All rights reserved.

• DRM systems allow only authorized users to access the content and restrict the manner in which they can use it.

• Under the Fair-Use provisions of copyright law, certain categories of uses do not require authorization by the rights holder.

? A user may need to access the work in order to determine how he wants to use it (and thus whether he needs authorization).

15

Eavesdropping without a warrant is permitted if (and ONLY if) the source is not a US person.

16

Eavesdropping without a warrant is permitted if (and ONLY if) the source is not a US person.

• The source of an Internet traffic stream (or even its geographic location) is hard to determine.

• As in the copyright case, the requester may need to access the traffic in order to (try to!) determine whether he needs a warrant.

17

Eavesdropping without a warrant is permitted if (and ONLY if) the source is not a US person.

• The source of an Internet traffic stream (or even its geographic location) is hard to determine.

• As in the copyright case, the requester may need to access the traffic in order to (try to!) determine whether he needs a warrant.

? What should he do with a US person’s traffic while he waits for a warrant, and how can he prove that that is what he has done?

18

Cloud services for Yale undergraduates will be provided in accordance with this contract.

19

Cloud services for Yale undergraduates will be provided in accordance with this contract.

• The data are owned by the students or by the university, as appropriate.

• Deletion by the owner will cause all copies of the data item to be destroyed (within time T).

• Data will not be stored in any of the following countries: . . .

• . . .

20

Cloud services for Yale undergraduates will be provided in accordance with this contract.

• The data are owned by the students or by the university, as appropriate.

• Deletion by the owner will cause all copies of the data item to be destroyed (within time T).

• Data will not be stored in any of the following countries: . . .

• . . .

? How can compliance with such a contract be adjudicated, e.g., how can a cloud-service provider prove that it has not done something?
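As a hedged illustration only (not from the talk), the contract terms above can be written down as data and checked against a provider-supplied audit log. The hard part the slide asks about, proving that no undisclosed copy or storage location exists, is exactly what such a log cannot establish on its own; the deadline value and country names below are placeholders for the slide's "time T" and elided country list.

```python
from datetime import datetime, timedelta

# Contract terms as data (placeholder values).
CONTRACT = {
    "deletion_deadline": timedelta(hours=24),
    "forbidden_countries": {"CountryA", "CountryB"},
}

def check_compliance(events, contract):
    """events: provider-supplied log entries, dicts with keys
    'kind' ('store' | 'delete_request' | 'copy_destroyed'), 'item', 'country', 'time'."""
    violations = []
    for e in events:
        if e["kind"] == "store" and e["country"] in contract["forbidden_countries"]:
            violations.append(("stored in forbidden country", e["item"], e["country"]))
    requested = {e["item"]: e["time"] for e in events if e["kind"] == "delete_request"}
    for item, t_req in requested.items():
        destroyed = [e["time"] for e in events
                     if e["kind"] == "copy_destroyed" and e["item"] == item]
        if not destroyed or max(destroyed) - t_req > contract["deletion_deadline"]:
            violations.append(("copies not destroyed within T", item))
    return violations

# Tiny example: one stored item whose copies were destroyed too late.
t0 = datetime(2010, 5, 4)
log = [
    {"kind": "store", "item": "hw1.pdf", "country": "CountryC", "time": t0},
    {"kind": "delete_request", "item": "hw1.pdf", "country": None, "time": t0},
    {"kind": "copy_destroyed", "item": "hw1.pdf", "country": "CountryC",
     "time": t0 + timedelta(hours=30)},
]
print(check_compliance(log, CONTRACT))   # -> [('copies not destroyed within T', 'hw1.pdf')]
```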

21

Discussion Point

• Most security and privacy policies that we know how to specify and implement are preventive. They are about authorization before the fact.

• We know less about accountability after the fact.

• Accountability will be more important than prevention in a cyber architecture for international data exchange.

22

Support for an Accountability Agenda (1)

Weitzner et al., CACM 2008: “For too long, our approach to information protection policy has been to seek ways to prevent information from ‘escaping’ beyond appropriate boundaries, then wring our hands when it inevitably does. This hide-it-or-lose-it perspective … on privacy, copyright, and surveillance is increasingly inadequate. … As an alternative, accountability must become a primary means through which society addresses appropriate use.”

23

Support for an Accountability Agenda (2)

Lampson, CACM 2009: Misplaced emphasis on prevention (“security based on locks”) rather than accountability (“security based on deterrence”) has resulted in unusable security technology that people do not understand and often work around.

24

Research Goal: Define “Accountability”

• Everyone seems to agree that “accountability” is important in online activity, but people disagree about (or simply don’t know) what it means.

• Users will resist the construction of a “cyber architecture for accountability” if they think its cost (in, e.g., privacy, speed, or convenience) will be too high.

• Progress on definitions and terminology may defuse this resistance.

25

Accountability via Policy Awareness and Adjudication

• Lampson: “Accountability is the ability to hold an entity, such as a person or organization, responsible for its actions.”

• Cyber-architectural components:
  – Policy languages
  – Policy-reasoning systems
  – Policy-aware monitoring and logging

• http://dig.csail.mit.edu

26

Examples of Accountability in DIG

• Logging, analysis, and revision of policies and queries
  – Policy assurance in Private Information Retrieval
  – Data exchange in Fusion Centers

• Flagging but not stopping non-compliant actions
  – Policy-aware mashups
  – License validation in Creative Commons
  – Social-web privacy

• DIG projects use Semantic-Web technology for policy expression and reasoning.

27

Standard Authorization

[Diagram: a user sends a resource request (r, π) to a controller; a reasoner applies policy P to local data (e.g., a Department table with ID, Loc., Size and an Employee table with ID, Age, Sal, Dept.) and returns Y/N plus a justification.]
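A minimal sketch of the standard-authorization picture, with assumed names: a reasoner evaluates a request (r, π) against policy P over local data (here, toy Department and Employee tables) and returns Y/N plus a justification.

```python
# Sketch only; the policy rule and table contents are illustrative.
def reasoner(r, pi, policy, local_data):
    """r: requested resource; pi: evidence/credentials offered with the request."""
    rule = policy.get(r)
    if rule is None:
        return False, f"no policy rule covers resource {r!r}"
    if rule(pi, local_data):
        return True, f"policy rule for {r!r} is satisfied by the offered evidence"
    return False, f"evidence {pi!r} does not satisfy the rule for {r!r}"

# Illustrative policy: a salary may be read only by the employee's department head.
local_data = {
    "Department": {"D1": {"Loc": "HQ", "Size": 12, "Head": "alice"}},
    "Employee":   {"E7": {"Age": 41, "Sal": 90000, "Dept": "D1"}},
}
policy = {
    ("Sal", "E7"): lambda pi, db: pi.get("user") ==
        db["Department"][db["Employee"]["E7"]["Dept"]]["Head"],
}
decision, justification = reasoner(("Sal", "E7"), {"user": "alice"}, policy, local_data)
print(decision, "-", justification)
```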

28

Multistage Authorization

[Diagram: execution starts at node v0 with request (r0, π0); each non-terminal node vi has a reasoner with policy P and local data that produces the next request (ri+1, πi+1) and next node, until a terminal node is reached: ACCEPT, ADJUDICATE, or REJECT.]
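A hedged sketch of the multistage picture (node and request names assumed, not from the talk): execution starts at v0 with (r0, π0), and each non-terminal node's reasoner rewrites the request and picks the next node until ACCEPT, REJECT, or ADJUDICATE is reached.

```python
# Sketch only: each node's reasoner maps a request to (next_node, next_request).
TERMINALS = {"ACCEPT", "REJECT", "ADJUDICATE"}

def run(nodes, start, request, max_steps=100):
    """nodes: maps a node name to a reasoner f(r, pi) -> (next_node, (r', pi'))."""
    v, (r, pi) = start, request
    trace = [(v, r, pi)]
    for _ in range(max_steps):          # guard corresponding to the finiteness property
        if v in TERMINALS:
            return v, trace
        v, (r, pi) = nodes[v](r, pi)
        trace.append((v, r, pi))
    raise RuntimeError("no terminal node reached within max_steps")

# Tiny example: v0 forwards law-enforcement requests to v1, which accepts
# requests carrying a credential and sends the rest to adjudication.
nodes = {
    "v0": lambda r, pi: (("v1", (r, pi)) if pi.get("role") == "law-enforcement"
                         else ("REJECT", (r, pi))),
    "v1": lambda r, pi: (("ACCEPT", (r, pi)) if "credential" in pi
                         else ("ADJUDICATE", (r, pi))),
}
print(run(nodes, "v0", ("alert-42", {"role": "law-enforcement", "credential": "CRED-123"})))
```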

29

2 Properties of “Accountable Systems”?

• Finite number of steps to a decision: For all requests (r0, π0) and all policies P, all execution paths are finite and end at a terminal node.

• Best effort to authorize: For all (ri, πi), all policies P, and all non-terminal nodes vi, if there is a path to the ACCEPT node, then [(ri+1, πi+1), vi+1] must be a next hop on one such path.
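One way to restate the two properties in the notation of the multistage diagram (my paraphrase, not the talk's formalization):

```latex
% Finite number of steps to a decision (termination).
\forall (r_0,\pi_0),\ \forall P:\quad
  \text{every execution path starting at } v_0 \text{ is finite and ends at a node in }
  \{\textsc{accept},\,\textsc{reject},\,\textsc{adjudicate}\}.

% Best effort to authorize.
\forall (r_i,\pi_i),\ \forall P,\ \forall \text{ non-terminal } v_i:\quad
  \bigl(\exists\ \text{a path from } v_i \text{ to } \textsc{accept}\bigr)
  \;\Longrightarrow\;
  \bigl[(r_{i+1},\pi_{i+1}),\,v_{i+1}\bigr] \text{ is the next hop on one such path.}
```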

30

Issues with this Approach

• Identifying an “entity” to hold responsible for its actions

• Deciding which events to monitor and log

• Establishing a “closed system” or “trust boundary” within which enforcement can’t be thwarted by lower-level insecurity

• Proving formal accountability properties of prototype applications

31

Alternative Formulations (F., Jaggard, Wright)

32

Alternative Formulations (F., Jaggard, Wright)

• Entity A is accountable with respect to policy P if A’s expected utility is reduced when A violates P.

33

Alternative Formulations (F., Jaggard, Wright)

• Entity A is accountable with respect to policy P if A’s expected utility is reduced when A violates P.

• Entity A is accountable to entity B with respect to policy P if, whenever A violates P, its expected utility is reduced because of an action taken by B.

34

Alternative Formulations (F., Jaggard, Wright)

• Entity A is accountable with respect to policy P if A’s expected utility is reduced when A violates P.

• Entity A is accountable to entity B with respect to policy P if, whenever A violates P, its expected utility is reduced because of an action taken by B.

• Ex. of “mediated” accountability w/out ID’ing A: Vickrey auctions with randomized tie breaking

35

Alternative Formulations (F., Jaggard, Wright)

• Entity A is accountable with respect to policy P if A’s expected utility is reduced when A violates P.

• Entity A is accountable to entity B with respect to policy P if, whenever A violates P, its expected utility is reduced because of an action taken by B.

• Ex. of “mediated” accountability w/out ID’ing A: Vickrey auctions with randomized tie breaking

• Ex. of “automatic” accountability with ID’ing A: “self-destructing” stolen devices
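Read in expected-utility terms, the first two formulations amount to the following (the notation is mine, not taken from the slides):

```latex
% A is accountable with respect to P:
\mathbb{E}\bigl[u_A \mid A \text{ violates } P\bigr]
  \;<\;
\mathbb{E}\bigl[u_A \mid A \text{ complies with } P\bigr].

% A is accountable to B with respect to P:
A \text{ violates } P
  \;\Longrightarrow\;
  \mathbb{E}[u_A] \text{ is reduced by some action taken by } B.
```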

36

Research Goal: Explicate relationships between accountability and other S&P properties.

[Diagram: Accountability linked to related security and privacy properties: Identification, Authorization, Tamper-proof Memory, Privacy, Closed Systems, Utility.]

37

Discussion Point

• Should we reserve the word “accountability” for approaches that require identification? That would be consistent with the colloquial use of the phrase “to hold someone accountable.”

• Recall that Lampson’s goal is deterrence.

• One can be deterred by the likelihood of suffering negative consequences even if one will not be identified and “held accountable” in the colloquial sense. (See, e.g., FJW formulations.)

• Proposal: Allay fears about privacy by promoting “deterrence” instead of “accountability.”

38

Discussion Point

• Technological enforcement is a weak link in this agenda.

• Authorization policies may be less enforceable than they seem if one considers the “real” policy.

• TPMs and other cyber-architectural approaches to enforcement have been resisted by users, because they are double-edged swords.

39

Research Goal

• Enumerate important policy elements that must be strictly enforced.

• Would the cyber architecture needed to enforce them be acceptable to users? If not, can we use economic, legal, and other non-technological mechanisms to enforce them?

• Example: Data-destruction policies. Are they as enforceable as data-retention policies?