Inaccessible Entropy


Iftach Haitner, Microsoft Research

Omer Reingold, Weizmann Institute

Hoeteck Wee, Queens College, CUNY

Salil Vadhan, Harvard University

Outline

Entropy

Secrecy & Pseudoentropy

Unforgeability & Inaccessible Entropy

Applications

Def: The Shannon entropy of r.v. X is

H(X) = E_{x←X}[log(1/Pr[X=x])]

H(X) = “Bits of randomness in X (on avg)”

0 ≤ H(X) ≤ log |Supp(X)|

Entropy

H(X) = E_{x←X}[log(1/Pr[X=x])]

H(X) = 0 iff X is concentrated on a single point; H(X) = log |Supp(X)| iff X is uniform on Supp(X).
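To make the definition concrete, here is a small Python sketch (not from the original slides) that computes H(X) from a probability map and checks the two extremes above:

```python
import math

def shannon_entropy(dist):
    """H(X) = E_{x<-X}[log2(1/Pr[X=x])], summed over the support of X."""
    return sum(p * math.log2(1.0 / p) for p in dist.values() if p > 0)

# X concentrated on a single point: H(X) = 0
print(shannon_entropy({"a": 1.0}))                    # 0.0

# X uniform on a support of size 8: H(X) = log2(8) = 3
print(shannon_entropy({i: 1 / 8 for i in range(8)}))  # 3.0

# A biased bit has less than one bit of entropy
print(shannon_entropy({0: 0.9, 1: 0.1}))              # ~0.47
```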

Conditional Entropy

H(X|Y) = E_{y←Y}[H(X|Y=y)]

Chain Rule: H(X,Y) = H(Y) + H(X|Y)

H(X) − H(Y) ≤ H(X|Y) ≤ H(X)

H(X|Y) = 0 iff ∃ f s.t. X = f(Y).
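A small sketch (my own, using an assumed toy joint distribution) that computes H(X|Y) by the definition above and verifies the chain rule:

```python
import math
from collections import defaultdict

def H(dist):
    return sum(p * math.log2(1.0 / p) for p in dist.values() if p > 0)

def marginal(joint, idx):
    m = defaultdict(float)
    for pair, p in joint.items():
        m[pair[idx]] += p
    return dict(m)

def cond_entropy(joint):
    """H(X|Y) = E_{y<-Y}[H(X|Y=y)] for a joint distribution over (x, y) pairs."""
    p_y = marginal(joint, 1)
    total = 0.0
    for y, py in p_y.items():
        cond = {x: p / py for (x, yy), p in joint.items() if yy == y}
        total += py * H(cond)
    return total

# Toy joint distribution: Y is a fair bit, X agrees with Y with probability 3/4.
joint = {(0, 0): 3 / 8, (1, 0): 1 / 8, (0, 1): 1 / 8, (1, 1): 3 / 8}

# Chain rule: H(X,Y) = H(Y) + H(X|Y)
assert abs(H(joint) - (H(marginal(joint, 1)) + cond_entropy(joint))) < 1e-9
print(cond_entropy(joint))  # H(X|Y) ~ 0.81, less than H(X) = 1
```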

Worst-Case Entropy Measures

Min-Entropy: H∞(X) = min_x log(1/Pr[X=x])

Max-Entropy: H0(X) = log |Supp(X)|

H∞(X) ≤ H(X) ≤ H0(X)
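The worst-case measures, as a sketch over the same kind of probability map (mine, not from the slides); the skewed example shows the inequalities can be far from tight:

```python
import math

def shannon_entropy(dist):
    return sum(p * math.log2(1.0 / p) for p in dist.values() if p > 0)

def min_entropy(dist):
    """H_inf(X) = min_x log2(1/Pr[X=x]) -- governed by the most likely outcome."""
    return min(math.log2(1.0 / p) for p in dist.values() if p > 0)

def max_entropy(dist):
    """H_0(X) = log2 |Supp(X)| -- only the support size matters."""
    return math.log2(sum(1 for p in dist.values() if p > 0))

# One heavy outcome plus 1024 light ones: H_inf = 1, H = 6, H_0 ~ 10
dist = {"heavy": 0.5, **{("light", i): 0.5 / 1024 for i in range(1024)}}
print(min_entropy(dist), shannon_entropy(dist), max_entropy(dist))
assert min_entropy(dist) <= shannon_entropy(dist) <= max_entropy(dist)
```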

Outline

Entropy

Secrecy & Pseudoentropy

Unforgeability & Inaccessible Entropy

Applications

Perfect Secrecy & Entropy

Def [Sh49]: Encryption scheme (Enc,Dec) has perfect secrecy if ∀ m,m′ ∈ {0,1}^n

EncK(m) & EncK(m’) are identically distributed for a random key K.

Thm [Sh49]: Perfect secrecy ⇒ |K| ≥ n

Perfect Secrecy ⇒ |K| ≥ n

Proof:

Perfect secrecy ⇒ (M, Enc_K(M)) ≡ (U_n, Enc_K(M))

for M, U_n ← {0,1}^n

⇒ H(M | Enc_K(M)) = n

Decryptability ⇒ H(M | Enc_K(M), K) = 0 ⇒ H(M | Enc_K(M)) ≤ H(K), so H(K) ≥ n and the key must be at least n bits.
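For contrast, the one-time pad meets Shannon's bound exactly, with a key as long as the message; a minimal sketch (mine, not from the slides):

```python
import secrets

def otp_encrypt(key: bytes, msg: bytes) -> bytes:
    """One-time pad: Enc_K(m) = m XOR K, with |K| = |m|."""
    assert len(key) == len(msg)
    return bytes(k ^ m for k, m in zip(key, msg))

def otp_decrypt(key: bytes, ct: bytes) -> bytes:
    return otp_encrypt(key, ct)  # XOR is its own inverse

msg = b"attack at dawn"
key = secrets.token_bytes(len(msg))  # |K| = n, matching the lower bound
ct = otp_encrypt(key, msg)
assert otp_decrypt(key, ct) == msg

# For a uniform one-time key, Enc_K(m) is uniform whatever m is, so
# Enc_K(m) and Enc_K(m') are identically distributed: perfect secrecy.
```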

Computational Secrecy

Def [GM82]: Encryption scheme (Enc,Dec) has computational secrecy if ∀ m,m′ ∈ {0,1}^n

EncK(m) & EncK(m’) are computationally indistinguishable.

⇒ can have |K| ≪ n.

Where Shannon’s Proof Breaks

Computational secrecy ⇒ (M, Enc_K(M)) ≈_c (U_n, Enc_K(M))

for M, U_n ← {0,1}^n

⇒ “H_pseudo(M | Enc_K(M))” = n

Decryptability ⇒ H(M | Enc_K(M)) ≤ H(K).

Key point: can have H_pseudo(X) ≫ H(X), e.g. X = G(U_k) for a PRG G : {0,1}^k → {0,1}^n.

Pseudoentropy

Def [HILL90]: X has pseudoentropy ≥ k iff there exists a random variable Y s.t. 1. Y ≈_c X, 2. H(Y) ≥ k.
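To illustrate the key point and the definition, a hedged sketch of X = G(U_k): the expander below is only a heuristic stand-in for a PRG (SHA-256 in counter mode), used to make the entropy accounting concrete.

```python
import hashlib
import secrets

def G(seed: bytes, out_len: int) -> bytes:
    """Heuristic PRG stand-in: expand a short seed by hashing (seed, counter).
    A real proof would use a provably secure PRG; this just shows the shapes."""
    out = b""
    counter = 0
    while len(out) < out_len:
        out += hashlib.sha256(seed + counter.to_bytes(4, "big")).digest()
        counter += 1
    return out[:out_len]

k, n = 16, 1024                  # 128-bit seed, 8192-bit output
seed = secrets.token_bytes(k)
x = G(seed, n)

# Real entropy: H(X) <= H_0(X) <= 8k = 128 bits (only 2^128 possible outputs).
# Pseudoentropy: if G is a good PRG, X is computationally indistinguishable
# from 8n = 8192 uniform bits, so H_pseudo(X) ~ 8n >> H(X).
print(len(seed) * 8, len(x) * 8)
```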

Application of Pseudoentropy

Thm [HILL90]: ∃ OWF ⇒ ∃ PRG

Proof outline:

OWF
→ (hardcore bit [GL89] + hashing)
X with pseudoentropy ≥ H(X) + 1/poly(n)
→ (repetitions)
X with pseudo-min-entropy ≥ H0(X) + poly(n)
→ (hashing)
PRG

Outline

Entropy

Secrecy & Pseudoentropy

Unforgeability & Inaccessible Entropy

Applications

Unforgeability

Crypto is not just about secrecy.

Unforgeability: security properties saying that it is hard for an adversary to generate “valid” messages:
– Unforgeability of MACs, digital signatures
– Collision-resistance of hash functions
– Binding of commitment schemes

Ex: Collision-resistant Hashing

Protocol: B samples F ← F, where F = {f : {0,1}^n → {0,1}^{n−k}}, and sends F to A; A samples X ← {0,1}^n, sends Y = F(X), and later sends X.

Shrinking: H(X | Y, F) ≥ k

Collision Resistance: from A’s perspective, X is determined by (Y, F) ⇒ “accessible” entropy 0
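A hedged sketch of this toy protocol (my own illustration; the shrinking family F is instantiated with keyed, truncated SHA-256 purely as a stand-in for a collision-resistant family):

```python
import hashlib
import secrets

N_BYTES = 32        # |X| = n = 256 bits
OUT_BYTES = 28      # |Y| = n - k = 224 bits, so k = 32 bits

def sample_f():
    """B's move: sample F from the family {f: {0,1}^n -> {0,1}^(n-k)}."""
    fkey = secrets.token_bytes(16)
    def f(x: bytes) -> bytes:
        return hashlib.sha256(fkey + x).digest()[:OUT_BYTES]
    return f

f = sample_f()                     # B -> A: F
x = secrets.token_bytes(N_BYTES)   # honest A: X <- {0,1}^n
y = f(x)                           # A -> B: Y = F(X)
assert f(x) == y                   # A -> B: X; B checks F(X) = Y

# Shrinking: given (F, Y), about k = 32 bits of X remain undetermined.
# Collision resistance: a poly-time A* can open Y to essentially only one X,
# so those k bits of real entropy are inaccessible to it.
```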

Ex: Collision-resistant Hashing

Same protocol with a cheating A*: A* tosses coins S1 and sends Y, then tosses coins S2 and sends X.

Collision Resistance: ∃ a function π s.t. X = π(F, Y, S1) except with negligible probability.

Ex: Collision-resistant Hashing

Same cheating-A* setup as above.

Collision Resistance (restated): ∃ a function π s.t. X ∈ {π(F, Y, S1)} ∪ F^{−1}(Y)^c, i.e., any opening other than π(F, Y, S1) is not a valid preimage of Y.

Measuring Accessible Entropy

Goal: a useful entropy measure to capture the possibility that H_acc(X) ≪ H(X)

1st attempt: X has accessible entropy at most k if there is a random variable Y s.t.

1. Y ≈_c X, 2. H(Y) ≤ k

Not useful! Every X is indistinguishable from some Y of entropy polylog(n).

Inaccessible Entropy

Idea: Protocol (A,B) has inaccessible entropy if

H(A’s messages from B’s point of view)   [real entropy]

>

H(A*’s messages from A*’s point of view)   [accessible entropy]

Real Entropy

The protocol (A,B) proceeds in rounds: B sends B1, A sends A1, B sends B2, A sends A2, …, B sends Bm, A sends Am.

Def: The real entropy of (A,B) is Σ_i H(Ai | B1, A1, …, Bi)

Accessible Entropy

Same interaction with a cheating A*. What A* does at each round i: tosses coins Si, sends message Ai, and privately outputs a justification Wi (e.g. consistent coins of the honest A).

Accessible Entropy

With A*’s coins S1, …, Sm and justifications W1, …, Wm as above:

Def: (A,B) has accessible entropy at most k if for every PPT A*

Σ_i H(Ai | B1, S1, B2, S2, …, S_{i−1}, Bi) ≤ k

Remarks:
1. Needs adjustment in case A* outputs an invalid justification (assume for now that it never does).
2. An unbounded A* can achieve the real entropy.

Ex: Collision-resistant Hashing

Honest A: X ← {0,1}^n, Y = F(X), where B sends F ← F, F = {f : {0,1}^n → {0,1}^{n−k}}.

Real Entropy = H(Y|F) + H(X|Y,F) = H(X|F) = n
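A small sketch (mine) verifying this identity exhaustively for a toy shrinking function: by the chain rule, H(Y | F=f) + H(X | Y, F=f) = H(X) = n for every fixed f.

```python
import math
import random
from collections import Counter

n_bits, out_bits = 4, 2
domain = list(range(2 ** n_bits))

random.seed(0)
f = {x: random.randrange(2 ** out_bits) for x in domain}  # one fixed shrinking f

def H(counts):
    total = sum(counts.values())
    return sum((c / total) * math.log2(total / c) for c in counts.values())

# X uniform on {0,1}^4, Y = f(X).
H_Y = H(Counter(f[x] for x in domain))

H_X_given_Y = 0.0
for y in set(f.values()):
    pre = [x for x in domain if f[x] == y]   # f^{-1}(y); X|Y=y is uniform on it
    H_X_given_Y += (len(pre) / len(domain)) * math.log2(len(pre))

print(H_Y + H_X_given_Y)  # 4.0 = n, for any choice of f
```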

Ex: Collision-resistant Hashing

Cheating A*: tosses coins S1, sends Y; tosses coins S2, sends X.

Accessible Entropy = H(Y|F) + H(X|F,S1) ≤ (n − k) + neg(n)

Outline

Entropy

Secrecy & Pseudoentropy

Unforgeability & Inaccessible Entropy

Applications

Commitment Schemes

COMMIT STAGE: the sender S, holding m ∈ {0,1}^n, interacts with the receiver R.

REVEAL STAGE: S sends (m, K); R accepts or rejects.

Security of Commitments

COMMIT STAGE: S (holding m ∈ {0,1}^n) ↔ R. REVEAL STAGE: S sends (m, K); R accepts/rejects.

Hiding (statistical or computational): COMMIT(m) & COMMIT(m′) are indistinguishable even to a cheating R*.

Binding (statistical or computational): even a cheating S* cannot reveal (m, K), (m′, K′) with m ≠ m′.
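For orientation, a minimal hash-based commitment sketch (mine, not the construction from the talk): both hiding and binding here rest only on heuristic assumptions about SHA-256, whereas the talk is about making one of the two properties statistical, from OWF alone.

```python
import hashlib
import secrets

def commit(m: bytes) -> tuple[bytes, bytes]:
    """COMMIT STAGE: S picks a random key K and sends C = SHA-256(K || m)."""
    k = secrets.token_bytes(32)
    c = hashlib.sha256(k + m).digest()
    return c, k

def reveal_ok(c: bytes, m: bytes, k: bytes) -> bool:
    """REVEAL STAGE: S sends (m, K); R accepts iff SHA-256(K || m) == C."""
    return hashlib.sha256(k + m).digest() == c

c, k = commit(b"my bid: 42")
assert reveal_ok(c, b"my bid: 42", k)       # honest reveal accepted
assert not reveal_ok(c, b"my bid: 99", k)   # a different message is rejected
```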

Statistical Security?

Statistical hiding and statistical binding simultaneously: Impossible!

Statistical Binding

Computational hiding, statistical binding:

Thm [HILL90,Naor91]: One-way functions ⇒ Statistically Binding Commitments

Statistical Hiding

Statistical hiding, computational binding:

Thm [HNORV07]: One-way functions ⇒ Statistically Hiding Commitments

Too Complicated!

Our Results I

Much simpler proof that OWF ⇒ Statistically Hiding Commitments, via accessible entropy.

Conceptually parallels [HILL90,Naor91] construction of PRGs & Statistically Binding Commitments from OWF.

“Nonuniform” version achieves optimal round complexity, O(n/log n) [HHRS07]

Our Results II

Thm: Assume one-way functions exist. Then:

NP has constant-round parallelizable ZK proofs with “black-box simulation”

⇔

constant-round statistically hiding commitments exist.

(⇐ is due to [GK96,G01]; the novelty is ⇒.)

Statistically Hiding Commitments & Inaccessible Entropy

COMMIT STAGE: S, holding M ← {0,1}^n, interacts with R; let C denote the commit-stage transcript. REVEAL STAGE: S sends (M, K).

Statistical Hiding: H(M | C) = n − neg(n)

Statistically Hiding Commitments & Inaccessible Entropy

Now with a cheating S* that tosses coins S1 in the commit stage and S2 in the reveal stage:

Statistical Hiding: H(M | C) = n − neg(n)

Comp’l Binding: for every PPT S*, H(M | C, S1) = neg(n)

OWF ⇒ Statistically Hiding Commitments: Our Proof

OWF
→ (interactive hashing [NOVY92,HR07])
(A,B) with real entropy ≥ accessible entropy + log n
→ (repetitions)
(A,B) with real min-entropy ≥ accessible entropy + poly(n)
→ (cut & choose; (interactive) hashing [DHRS07] + UOWHFs [NY89,Rom90]; “m-phase” commitment)
statistically hiding commitment

Cf. OWF ⇒ Statistically Binding Commitment [HILL90,Nao91]

OWF
→ (hardcore bit [GL89] + hashing)
X with pseudoentropy ≥ H(X) + 1/poly(n)
→ (repetitions)
X with pseudo-min-entropy ≥ H0(X) + poly(n)
→ (hashing)
PRG
→ (expand output & translate)
Statistically binding commitment


OWF ⇒ Inaccessible Entropy

Let f : {0,1}^n → {0,1}^m be a OWF. A samples X ← {0,1}^n and sets Y = f(X); B chooses linearly independent B1, …, Bm ← {0,1}^m. In round i, B sends Bi and A replies with ⟨Bi, Y⟩; finally A sends X.

Real Entropy = n

Can show: Accessible Entropy ≤ n − log n
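A hedged sketch of this protocol (mine; f below is a placeholder one-way function built from SHA-256 at toy sizes, and the linear-independence check on B's challenges is omitted since random vectors are independent with high probability):

```python
import hashlib
import secrets

N, M = 64, 32  # toy sizes: X in {0,1}^N, Y = f(X) in {0,1}^M

def f(x_bits):
    """Placeholder for a one-way function f: {0,1}^N -> {0,1}^M."""
    x_bytes = bytes(
        int("".join(map(str, x_bits[i:i + 8])), 2) for i in range(0, N, 8)
    )
    digest = hashlib.sha256(x_bytes).digest()
    return [(digest[i // 8] >> (i % 8)) & 1 for i in range(M)]

def rand_bits(length):
    return [secrets.randbelow(2) for _ in range(length)]

def inner_product(u, v):
    return sum(a & b for a, b in zip(u, v)) % 2

x = rand_bits(N)      # A: X <- {0,1}^N
y = f(x)              # A: Y = f(X), revealed only bit by bit via hashing

transcript = []
for _ in range(M):
    b_i = rand_bits(M)                                  # B -> A: B_i
    transcript.append((b_i, inner_product(b_i, y)))     # A -> B: <B_i, Y>

# Finally A -> B: X, and B checks consistency with every answer.
assert all(inner_product(b_i, f(x)) == a_i for b_i, a_i in transcript)

# Real entropy of A's messages is N; the talk's claim is that a PPT A* can
# access at most N - log N bits of it, assuming f is one-way.
```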

Claim: Accessible Entropy ≤ n − log n

A cheating A* interacts with B as above, with f : {0,1}^n → {0,1}^m a OWF. For simplicity, assume |f^{−1}(y)| = 2^k for all y ∈ Im(f), and set t = n − k − 2log n.

– Answers ⟨B1,Y⟩, …, ⟨Bt,Y⟩: entropy ≤ t = n − k − 2log n
– Answers ⟨B_{t+1},Y⟩, …, ⟨Bm,Y⟩: Claim: entropy = neg(n)
– Final message X: entropy ≤ k

Claim: Accessible Entropy ≤ n − log n (cont.)

After the first t = n − k − 2log n rounds of hashing:

Claim: ∃ at most one consistent Y for which A* can produce a preimage (except with negligible probability).

Claim: Accessible Entropy ≤ n − log n (cont.)

Why? |Im(f)| ≤ 2^{n−k}, so after t = n − k − 2log n random linear constraints only ≈ poly(n) elements of Im(f) remain consistent, and by the Interactive Hashing Theorems [NOVY92,HR07], A* can “control” at most one of these consistent values (otherwise f could be inverted).

Claim: Accessible Entropy ≤ n − log n (conclusion)

Removing the simplifying assumption: the analysis holds whenever |f^{−1}(Y)| ≈ 2^k, and the choice of k contributes entropy ≤ log n.

Total: ≤ (n − k − 2log n) + neg(n) + k + log n ≈ n − log n.

Conclusion

Complexity-based cryptography is possible because of gaps between real & computational entropy.

Secrecy: pseudoentropy > real entropy

Unforgeability: accessible entropy < real entropy

What else can we do with inaccessible entropy?

Research Directions

Remove “parallelizable” condition from ZK result.

Use inaccessible entropy for new understanding/constructions of MACs and digital signatures.

Formally unify statistical hiding & statistical binding.