A Security Model for Anonymous Credential Systems


26th August, I-NetSec, SEC 2004

Andreas Pashalidis and Chris J. Mitchell

Agenda

Why do we need AC Systems?
How do AC Systems work?
The model.
What is “security” in an AC System?
What is “privacy” in an AC System?
Open questions.


Why do we need AC Systems?

We want to prevent this! (Technically, not through legislation.)


How do AC Systems work?

(Diagram slides: a user establishes pseudonyms with organisations, is issued credentials on those pseudonyms by issuers, and later shows the credentials to verifiers.)


Why another model?

There is a formal model in [CL01]*, based on simulatability: an ideal functionality guarantees security and privacy, and the cryptosystem has to “meet” this standard.

However, the relationship between the different notions is somewhat hidden, and the Adversary cannot corrupt parties adaptively.

We give an alternative model based on different ideas, in particular the [BR93]** model.

*Camenisch & Lysyanskaya, “An Efficient System for Non-transferable Anonymous Credentials with Optional Anonymity Revocation”, Eurocrypt 2001

**Bellare & Rogaway, “Entity Authentication and Key Distribution”, Crypto 1993

What is an AC System?

It is a 10-tuple consisting of

Five Sets: Users, Issuers, Verifiers, Pseudonyms, Credential Types.

Three Protocols: Pseudonym Establishment, Credential Issuing, Credential Showing.

One Algorithm: Initialisation.

One number: Security Parameter (k).

(Users, Issuers and Verifiers are modelled as Turing machines.)
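Purely as a reading aid, here is a minimal Python sketch of how the 10-tuple could be laid out; the class and field names are hypothetical and the protocols are reduced to opaque callables, so this illustrates the structure rather than reproducing the paper's definitions.

```python
from dataclasses import dataclass
from typing import Callable, Set

# Hypothetical representation of the 10-tuple; the paper defines these
# components abstractly, not as concrete code.
@dataclass
class ACSystem:
    # Five sets
    users: Set[str]
    issuers: Set[str]
    verifiers: Set[str]
    pseudonyms: Set[str]
    credential_types: Set[str]
    # Three protocols (modelled here simply as callables)
    pseudonym_establishment: Callable[..., str]   # user + organisation -> new pseudonym
    credential_issuing: Callable[..., object]     # issuer + pseudonym + type -> credential
    credential_showing: Callable[..., bool]       # user + verifier -> accept / reject
    # One algorithm
    initialisation: Callable[[int], None]         # takes the security parameter k
    # One number
    security_parameter: int                       # k
```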

The model

Users, Issuers and Verifiers execute the protocols with each other directly (not through an attacker who controls all communications).

There are several notions of security and privacy.

Each notion is defined by means of a game between two Turing machines: the Challenger and the Adversary.

The games – three phases

1) The Challenger chooses k, runs the initialisation algorithm, and controls all Users, Issuers and Verifiers.

2) The Adversary issues queries to the Challenger. A query makes the Challenger either initiate a protocol between a user and an issuer or between a user and a verifier, or hand control of a party over to the Adversary.

3) No more queries. The Adversary runs the credential showing protocol with an uncorrupted verifier. If the verifier accepts, the Adversary wins; otherwise it loses.

The notion of security is satisfied iff no Adversary can win the game with non-negligible probability (in the security parameter k).
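In the usual asymptotic form (a restatement for readability, not the paper's exact wording), this says that for every Adversary there is a negligible function bounding its winning probability:

```latex
\Pr[\mathcal{A}\ \text{wins the game}] \;\le\; \mathrm{negl}(k),
\quad\text{where } \mathrm{negl}(k) < k^{-c}\ \text{for every constant } c > 0 \text{ and all sufficiently large } k.
```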


What is “security” in an ACS?

Three notions of security.

Pseudonym owner protection.

Credential Unforgeability.

Credential Non-transferability.

Pseudonym owner protection

“Nobody, even if colluding with others (users, issuers and verifiers), should be able to successfully show a credential on a pseudonym of which he is not the owner (i.e. on a pseudonym which was not established by himself).”
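A game-style paraphrase, using ad-hoc notation rather than the paper's own symbols: the Adversary wins the owner-protection game exactly when an uncorrupted verifier accepts a credential showing on a pseudonym the Adversary does not own, i.e.

```latex
\mathsf{Accept}(V, P)
\;\wedge\;
P \notin \mathsf{Nyms}(\text{corrupted parties}),
```

where Nyms(corrupted parties) is the set of pseudonyms established by parties under the Adversary's control; owner protection requires this event to occur with at most negligible probability in k.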

Credential Unforgeability

“The only way for a user to successfully show a credential is by having previously obtained it from the issuer.”
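In the same ad-hoc notation, the Adversary wins the unforgeability game when a verifier accepts a showing of a credential of type t on pseudonym P although no credential of that type was ever issued on P:

```latex
\mathsf{Accept}(V, P, t)
\;\wedge\;
(P, t) \notin \mathsf{Issued},
```

where Issued collects the (pseudonym, credential type) pairs for which the credential issuing protocol completed successfully during the game.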

Credential Non-Transferability

Needs an additional assumption: not all secrets may be shared!


“Even if colluding with others who have obtained a credential, a user can successfully show it only if it was issued to him personally.”


Non-Transferability implies Unforgeability: if a credential can only be shown by the user it was issued to, it certainly cannot be shown by someone who never obtained it at all. The definitions make this explicit.

Non-Transferability is not always required.


What is “privacy” in an ACS?

Three notions of privacy.

Indistinguishability of pseudonyms.

Unlinkability of pseudonyms.

Anonymity of users.

Pseudonym Indistinguishability

Was this Alice or Bob?


Pseudonyms should not reveal any information about users.

The notion applies only to the Pseudonym Establishment protocol.

The Indistinguishability Game

1) The Adversary chooses two uncorrupted users.

2) The Challenger selects one of them at random and makes him establish a new pseudonym with a corrupted organisation.

3) The Adversary has to tell which of the two users it was.

The Adversary should not be correct significantly more than 50% of the time.
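Equivalently, writing b for the Challenger's random choice of user and b' for the Adversary's guess, indistinguishability can be phrased as a negligible advantage (a standard formulation, added here for readability):

```latex
\mathrm{Adv}^{\mathrm{ind}}_{\mathcal{A}}(k)
\;=\;
\Bigl|\,\Pr[\,b' = b\,] - \tfrac{1}{2}\,\Bigr|
\;\le\; \mathrm{negl}(k).
```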

Pseudonym Unlinkability

Organisations should not be able to link pseudonyms corresponding to the same user.

The Adversary models colluding organisations: it knows the users in the system and the pseudonyms in the system, but does not know the pseudonym-to-user mapping.

Phase 2 of the Unlinkability Game: the Adversary selects pseudonym/organisation pairs and issues queries to run protocols. The Challenger selects the corresponding user, makes him run the protocol with the organisation, and responds with {true, false}.

Phase 3 of the Unlinkability Game: the Adversary outputs two pseudonyms; if they correspond to the same user, he wins.

What should be the Adversary’s maximum probability of success?

Each pair of distinct pseudonyms carries, in the Adversary’s view, a probability that it corresponds to the same user.

The maximum of these probabilities (at the end of the game) is the Adversary’s success probability.

The Adversary is restricted to two inherent linking strategies.
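Writing owner(p) for the user who established pseudonym p (ad-hoc notation), this success probability is

```latex
\mathrm{Succ}_{\mathcal{A}}(k)
\;=\;
\max_{p \neq p'}
\Pr\bigl[\,\mathrm{owner}(p) = \mathrm{owner}(p')\,\bigr],
```

with the probability taken over the Adversary's view at the end of the game.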

Linking Strategies

If a query for running the credential showing protocol returns “true”: at least one of the pseudonyms on which a similar credential was issued, up to that point in time, corresponds to the same user.

If a query for running the credential showing protocol returns “false”: none of the pseudonyms on which a similar credential was issued, up to that point in time, corresponds to the same user.
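Purely as an illustration, the sketch below shows one naive way an adversary could apply these two rules; the function names (`update`, `probabilities`) are hypothetical, and it assumes both that exactly one of the candidate pseudonyms belongs to the same user as the target and that probability is spread uniformly over the candidates not yet ruled out. Under those assumptions it reproduces the 25% / 33% / 50% progression in the worked example below.

```python
# Naive linking sketch (an illustration, not the paper's algorithm).
# Assumption: the adversary tracks, for one target pseudonym, which other
# pseudonyms could still belong to the same user, with exactly one true match.

def update(candidates: set[str], shown_on: set[str], result: bool) -> set[str]:
    """Apply one of the two linking strategies after a credential-showing query.

    candidates: pseudonyms still considered possible matches for the target.
    shown_on:   pseudonyms on which a similar credential had been issued
                up to the time of the query.
    result:     the {true, false} answer returned by the Challenger.
    """
    if result:
        # "true": at least one pseudonym in shown_on belongs to the same user,
        # so (with exactly one match) the match must lie in shown_on.
        return candidates & shown_on
    # "false": none of the pseudonyms in shown_on belongs to the same user.
    return candidates - shown_on

def probabilities(candidates: set[str]) -> dict[str, float]:
    """Uniform posterior over the remaining candidates."""
    return {p: 1.0 / len(candidates) for p in candidates}

# Mirrors the percentages in the example below:
cands = {"P1", "P2", "P3", "P4"}           # 25% each
cands = update(cands, {"P4"}, False)        # P4 ruled out -> 33% each for P1..P3
cands = update(cands, {"P1", "P3"}, True)   # -> 50% each for P1 and P3
print(probabilities(cands))
```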

Unlinkability Example

(Animated example: starting from a uniform 25% estimate for each of four candidate pseudonyms, the Adversary’s estimates are updated by the query responses, first to 33%/33%/33%/0% and then to 50%/0%/50%/0%, and the maximum of these probabilities is taken.)

According to the two strategies, these two pseudonym pairs are most likely to belong to the same user (each with a 50% probability).

So, in this instance, the AC System offers Unlinkability iff the Adversary cannot break this 50% bound by a non-negligible quantity.

Unlinkability implies Indistinguishability (Theorem I).

Anonymity of Users

Anonymity is a result of: (1) the probability distribution according to which users are selected when establishing a new pseudonym in the system, and (2) the unlinkability of pseudonyms.

Unlinkability leads to Anonymity; there is no need for a separate game for Anonymity.

Anonymity is naturally expressed in an information-theoretic metric (entropy).
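The slide does not spell the metric out; the standard choice it alludes to (stated here as an assumption) is the Shannon entropy of the Adversary's distribution over candidate users:

```latex
H(U) \;=\; -\sum_{u \in \mathcal{U}} p_u \log_2 p_u ,
\qquad
p_u = \Pr[\text{the pseudonym belongs to user } u \mid \text{Adversary's view}].
```

Maximum anonymity corresponds to the uniform distribution over users, i.e. H(U) = log2 |U|.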


Open Questions

The two linking strategies enable more complicated types of deduction to be made.

What is the optimal way to make these?

Naïve methods appear to require exponential running times.

The “Linkability” Problem: given a transcript of events in the AC system, output two distinct pseudonyms that most likely belong to the same user.

Question 1: To which complexity class does this problem belong?

Question 2: If it is not in P, can we still obtain “good” pairs in polynomial time? (i.e. how approximable is the problem?)

This is similar to the “Disclosure Attack”* in the MIX context.

*Agrawal, Kesdogan & Penz, “Probabilistic Treatment of MIXes to Hamper Traffic Analysis”, IEEE Symposium on Security and Privacy, 2003
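Stated as an optimisation problem (again an ad-hoc restatement), the required output is

```latex
(p, p')
\;=\;
\operatorname*{arg\,max}_{p \neq p'}
\Pr\bigl[\,\mathrm{owner}(p) = \mathrm{owner}(p') \;\big|\; \text{transcript}\,\bigr].
```

Enumerating the pseudonym-to-user assignments consistent with the transcript is one obvious but naive way to evaluate this maximum, which is consistent with the exponential running times mentioned above.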

Prove or disprove equivalence between this model and the one in [CL01]*.

Refine the model to cover additional properties: anonymity revocation; one-show credentials.

Prove schemes (in)secure under the model.

*Camenisch & Lysyanskaya, “An Efficient System for Non-transferable Anonymous Credentials with Optional Anonymity Revocation”, Eurocrypt 2001

Conclusion

We described an alternative security model for anonymous credential systems.

In the process we gained insight into the relevant issues and identified the “Linkability” problem.

The model, together with probability theory, connects to complexity theory (the Linkability problem) and to information-theoretic anonymity metrics.

Feedback welcome!

Thanks! Questions?

www.xrtc.com