A Probabilistic Analysis of Onion Routing in a Black-box Model
10/29/2007
Workshop on Privacy in the Electronic Society

Aaron Johnson (Yale)
with
Joan Feigenbaum (Yale)
Paul Syverson (NRL)
Contributions
1. Use a black-box abstraction to create a probabilistic model of onion routing
2. Analyze unlinkability
   a. Provide worst-case bounds
   b. Examine a typical case
Related Work
• A Model of Onion Routing with Provable Anonymity. J. Feigenbaum, A. Johnson, and P. Syverson. FC 2007.
• Towards an Analysis of Onion Routing Security. P. Syverson, G. Tsudik, M. Reed, and C. Landwehr. PET 2000.
• An Analysis of the Degradation of Anonymous Protocols. M. Wright, M. Adler, B. Levine, and C. Shields. NDSS 2002.
Anonymous Communication
• Sender anonymity: Adversary can’t determine the sender of a given message
• Receiver anonymity: Adversary can’t determine the receiver of a given message
• Unlinkability: Adversary can’t determine who talks to whom
How Onion Routing Works
[Diagram: user u running a client, Internet destination d, and routers 1–5 running onion-routing servers]
1. u creates a 3-hop circuit through the routers (here u → 1 → 4 → 3).
2. u opens a stream in the circuit to d.
3. Data is exchanged. A message m travels as {{{m}3}4}1; each router removes one layer of encryption, so router 1 forwards {{m}3}4, router 4 forwards {m}3, and router 3 delivers m to d. A reply m’ is wrapped in reverse: router 3 sends {m’}3, router 4 sends {{m’}3}4, and router 1 sends {{{m’}3}4}1, which u fully decrypts.
4. Stream is closed.
5. Circuit is changed every few minutes.
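The layered wrapping in step 3 can be sketched in a few lines. This is a toy illustration only: a byte-shift stands in for real per-hop symmetric encryption, and the keys are invented; the circuit u → 1 → 4 → 3 is taken from the slide.

```python
# Toy sketch of layered ("onion") encryption -- NOT real cryptography.

def enc(key: int, data: bytes) -> bytes:
    """'Encrypt' by shifting every byte (illustration only, not secure)."""
    return bytes((x + key) % 256 for x in data)

def dec(key: int, data: bytes) -> bytes:
    """Undo the byte shift."""
    return bytes((x - key) % 256 for x in data)

keys = {1: 17, 4: 42, 3: 99}   # made-up per-router keys
circuit = [1, 4, 3]            # u -> 1 -> 4 -> 3 -> d

m = b"hello d"
onion = m
for r in reversed(circuit):    # innermost layer for the last router
    onion = enc(keys[r], onion)   # u sends {{{m}3}4}1 to router 1

for r in circuit:              # each router peels exactly one layer
    onion = dec(keys[r], onion)
assert onion == m              # router 3 hands plaintext m to d
```

Because each router sees only the layer addressed to it, router 1 knows u but not d, and router 3 knows d but not u.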
Adversary
[Diagram: user u, routers 1–5, destination d]
• Active & local: the adversary controls a subset of the routers and can observe and modify the traffic passing through them.
Anonymity
[Diagram: users u, v, w connect through routers 1–5 to destinations d, e, f]
Four cases, by which of u’s circuit routers are compromised:
1. First router compromised
2. Last router compromised
3. First and last compromised
4. Neither first nor last compromised
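If we assume, as the model introduced later effectively does, that the adversary controls each endpoint of the circuit independently with probability b (the value of b here is made up), the four cases occur with these probabilities:

```python
# Sketch: probabilities of the four compromise cases, assuming each
# circuit endpoint is compromised independently with probability b.
b = 0.2   # made-up compromise probability

cases = {
    "first only":     b * (1 - b),
    "last only":      (1 - b) * b,
    "first and last": b * b,           # sender and receiver can be linked
    "neither":        (1 - b) ** 2,
}

# The four cases are exhaustive and mutually exclusive:
assert abs(sum(cases.values()) - 1.0) < 1e-12
```

Case 3 is the dangerous one: with both endpoints compromised, the adversary can link u to d.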
Black-box Abstraction
[Diagram: users u, v, w; destinations d, e, f]
1. Users choose a destination
2. Some inputs are observed
3. Some outputs are observed
Black-box Anonymity
[Diagram: users u, v, w; destinations d, e, f]
• The adversary can link observed inputs and outputs of the same user.
• Any configuration consistent with these observations is indistinguishable to the adversary.
Probabilistic Black-box
[Diagram: users u, v, w; destinations d, e, f]
• Each user v selects a destination from distribution pv
• Inputs and outputs are observed independently with probability b
Probabilistic Anonymity
[Diagram: four indistinguishable configurations over users u, v, w and destinations d, e, f]
Conditional distribution: Pr[u→d] = 1
Black Box Model
Let U be the set of users.
Let Δ be the set of destinations.
Configuration C:
• User destinations CD : U → Δ
• Observed inputs CI : U → {0,1}
• Observed outputs CO : U → {0,1}
Let X be a random configuration such that:
Pr[X=C] = ∏u pu^(CD(u)) · b^(CI(u)) (1−b)^(1−CI(u)) · b^(CO(u)) (1−b)^(1−CO(u))
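A minimal sketch of sampling a configuration X from this product distribution; the user set, per-user destination distributions p, and the observation probability b below are all made-up values for illustration:

```python
import random

# Sample X = (C_D, C_I, C_O): each user picks a destination from its own
# distribution, and each input/output is observed independently with prob b.
U = ["u", "v", "w"]
DELTA = ["d", "e", "f"]
p = {"u": {"d": 0.6, "e": 0.3, "f": 0.1},
     "v": {"d": 0.2, "e": 0.5, "f": 0.3},
     "w": {"d": 0.1, "e": 0.1, "f": 0.8}}
b = 0.25

def sample_configuration(rng: random.Random):
    C_D = {v: rng.choices(DELTA, weights=[p[v][t] for t in DELTA])[0] for v in U}
    C_I = {v: int(rng.random() < b) for v in U}   # input observed?
    C_O = {v: int(rng.random() < b) for v in U}   # output observed?
    return C_D, C_I, C_O

C_D, C_I, C_O = sample_configuration(random.Random(0))
```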
Probabilistic Anonymity
The metric Y for the unlinkability of u and d in C is:
Y(C) = Pr[XD(u)=d | X ≈ C]
where X ≈ C means X is indistinguishable from C to the adversary.
Note: There are several other candidates for a probabilistic anonymity metric, e.g. entropy.
This is exact Bayesian inference:
• Adversary after a long-term intersection attack
• Worst-case adversary
Unlinkability given that u visits d:
E[Y | XD(u)=d]
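For a tiny instance, Y(C) and E[Y | XD(u)=d] can be computed by brute-force enumeration of all configurations. The linking rule (input and output of the same user are linked when both are observed) follows the black-box slides; treating the remaining observed outputs as an anonymous multiset is an added modeling assumption, and the users, distributions, and b are made up:

```python
import itertools

U = ["u", "v"]
DELTA = ["d", "e"]
p = {"u": {"d": 0.7, "e": 0.3}, "v": {"d": 0.4, "e": 0.6}}
b = 0.5

def prob(C_D, C_I, C_O):
    """Pr[X=C] from the product formula in the model."""
    pr = 1.0
    for v in U:
        pr *= p[v][C_D[v]]
        pr *= b if C_I[v] else (1 - b)
        pr *= b if C_O[v] else (1 - b)
    return pr

def observation(C_D, C_I, C_O):
    """What the adversary sees: which inputs/outputs are observed, links for
    users with both observed, and the multiset of unlinked observed outputs."""
    links = tuple(sorted((v, C_D[v]) for v in U if C_I[v] and C_O[v]))
    unlinked = tuple(sorted(C_D[v] for v in U if C_O[v] and not C_I[v]))
    return (tuple(C_I[v] for v in U), tuple(C_O[v] for v in U), links, unlinked)

configs = [(dict(zip(U, dd)), dict(zip(U, ii)), dict(zip(U, oo)))
           for dd in itertools.product(DELTA, repeat=len(U))
           for ii in itertools.product([0, 1], repeat=len(U))
           for oo in itertools.product([0, 1], repeat=len(U))]

def Y(C):
    """Posterior Pr[X_D(u)=d | X indistinguishable from C]."""
    obs = observation(*C)
    consistent = [X for X in configs if observation(*X) == obs]
    total = sum(prob(*X) for X in consistent)
    return sum(prob(*X) for X in consistent if X[0]["u"] == "d") / total

# E[Y | X_D(u)=d]: average unlinkability over configurations where u visits d.
num = sum(prob(*C) * Y(C) for C in configs if C[0]["u"] == "d")
den = sum(prob(*C) for C in configs if C[0]["u"] == "d")
print(round(num / den, 4))
```

Conditioning on the true event can only help the adversary on average, so the printed value is at least the prior pu^d = 0.7.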
Worst-case Anonymity
Theorem 1: The maximum of E[Y | XD(u)=d] over (pv)v≠u occurs when
1. pv^1 = 1 for all v≠u, OR
2. pv^d = 1 for all v≠u
Proof sketch: Index u’s other destinations so that pu^1 ≥ pu^2 ≥ … ≥ pu^(d−1) ≥ pu^(d+1) ≥ …
• Show the maximum occurs when, for all v≠u, pv^(ev) = 1 for some ev.
• Show the maximum occurs when, for all v≠u, ev = d or ev = 1.
• Show the maximum occurs when ev = d for all v≠u, or ev = 1 for all v≠u.
Worst-case Estimates
Let n be the number of users.

Theorem 2: When pv^1 = 1 for all v≠u:
E[Y | XD(u)=d] = b + b(1−b)pu^d + (1−b)² pu^d [(1−b)/(1−(1−pu^d)b) + O(log n/n)]
              ≈ b + (1−b) pu^d

Theorem 3: When pv^d = 1 for all v≠u:
E[Y | XD(u)=d] = b² + b(1−b)pu^d + (1−b) pu^d [1/(1−(1−pu^d)b) + O(log n/n)]
              ≈ b² + (1−b²) pu^d

Increased chance of total compromise from b² to b.
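The two leading-order approximations are easy to compare numerically; the values of b and pu^d below are made up for illustration:

```python
# Sketch comparing the worst-case approximations from Theorems 2 and 3.

def est_no_cover(b: float, pud: float) -> float:
    # Theorem 2 approximation: all v != u visit u's most likely other destination.
    return b + (1 - b) * pud

def est_full_cover(b: float, pud: float) -> float:
    # Theorem 3 approximation: all v != u visit d, providing cover traffic.
    return b**2 + (1 - b**2) * pud

b, pud = 0.1, 0.5
print(round(est_no_cover(b, pud), 3))    # 0.55
print(round(est_full_cover(b, pud), 3))  # 0.505
```

The difference in the constant term (b versus b²) is exactly the increased chance of total compromise noted above.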
Typical Case
Let each user select from the Zipfian distribution: p^(di) ∝ 1/i^s
Theorem 4: E[Y | XD(u)=d] = b² + (1−b²)pu^d + O(1/n)
                          ≈ b² + (1−b²)pu^d
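A quick sketch of the normalized Zipfian destination distribution and the resulting Theorem-4-style estimate; the exponent s, number of destinations, and b are made-up values:

```python
# Normalized Zipfian popularity: p_{d_i} proportional to 1/i^s.
def zipf(n_dests: int, s: float) -> list[float]:
    weights = [1.0 / i**s for i in range(1, n_dests + 1)]
    total = sum(weights)
    return [w / total for w in weights]

p = zipf(100, 1.0)
assert abs(sum(p) - 1.0) < 1e-9   # a valid probability distribution

b = 0.1
# Estimated unlinkability if u visits the most popular destination:
print(round(b**2 + (1 - b**2) * p[0], 4))
```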
Contributions
1. Use a black-box abstraction to create a probabilistic model of onion routing
2. Analyze unlinkability
   a. Provide worst-case bounds
   b. Examine a typical case
Future Work
1. Extend the analysis to other types of anonymity and to other systems.
2. Examine how quickly user distributions are learned.
3. Analyze timing attacks.