
Post on 18-Jan-2018



Gillat Kol (IAS)

joint work with Anat Ganor (Weizmann)

Ran Raz (Weizmann + IAS)

Exponential Separation of Information and Communication

Information theory was developed by Claude Shannon to study one-way data transmission ("A Mathematical Theory of Communication", 1948)

It had a profound impact on many fields of science. Specifically, it is an incredibly useful tool in TCS

Recently, computational aspects of information theory have been studied as a goal in their own right

Information Theory

Interactive protocols performing a computation are central in TCS (interactive proofs, communication complexity, cryptography, distributed computing, …)

Interactive information theory extends classical information theory to the interactive setting, where information flows in several directions:
- Interactive coding (cf. noisy coding)
- Interactive compression (cf. data compression)
- …

Interactive Information Theory

(this talk: interactive compression)

Alice has a string x, chosen according to a publicly known distribution. She wants to send x to Bob. How many bits does Alice need to send, so that Bob can retrieve x w.h.p.?

Answer [Shannon'48, Huffman'52]: ≈ H(X) bits, where H is the entropy function ("unpredictability")

Data Compression

Data Compression Theorem [S'48, H'52]: Any message can be compressed to its "information content"
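The compression guarantee can be illustrated with a small sketch (not from the talk): a Huffman code built for a toy distribution achieves average code length matching the entropy H. The distribution `mu` and the helper `huffman_lengths` are illustrative choices.

```python
import heapq
from math import log2

def huffman_lengths(probs):
    """Build a Huffman code; return the code length assigned to each symbol."""
    # Heap items: (subtree probability, tie-breaker, symbols in the subtree).
    heap = [(p, i, [i]) for i, p in enumerate(probs)]
    heapq.heapify(heap)
    lengths = [0] * len(probs)
    counter = len(probs)
    while len(heap) > 1:
        p1, _, s1 = heapq.heappop(heap)
        p2, _, s2 = heapq.heappop(heap)
        for s in s1 + s2:  # each merge pushes these symbols one level deeper
            lengths[s] += 1
        heapq.heappush(heap, (p1 + p2, counter, s1 + s2))
        counter += 1
    return lengths

mu = [0.5, 0.25, 0.125, 0.125]      # toy distribution over four strings
H = -sum(p * log2(p) for p in mu)   # entropy: the compression benchmark
L = huffman_lengths(mu)
avg = sum(p * l for p, l in zip(mu, L))
print(H, avg)  # for dyadic probabilities Huffman is exactly optimal: avg == H
```

For non-dyadic distributions Huffman only guarantees avg < H + 1; the "≈ H bits" statement is for long messages / amortized coding.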

Interactive Compression Problem [BBCR'09]: Assume Alice and Bob engage in an interactive communication protocol (i.e., a conversation). Can the protocol's transcript be compressed to its "information content"?

Alice has input x, Bob has input y. They want to compute f(x, y) (f public). How many bits do they need to exchange? Applications to circuit complexity, streaming algorithms, data structures, distributed computing, property testing, …

Communication Complexity [Yao'79]

Protocol: m1(x), m2(y, m1), m3(x, m1, m2), … (adaptive!). At the end, the players output f(x, y). Examples: EQ, DISJ, …

(x, y) is chosen according to a publicly known joint distribution μ. Players may use private and public randomness. They need to compute f(x, y) w.p. ≥ 2/3 (say) over (x, y) and the randomness. Communication complexity of a protocol π: the max number of bits exchanged, over (x, y) and the randomness. Communication complexity of a function f: the minimum, over protocols π computing f, of CC(π)

Distributional CC

Interactive Compression Problem [BBCR'09]: Can the protocol's transcript be compressed to its "information content"?

But how do we measure the information content of an interactive protocol?

Answer: Information Cost! [CSWY'01, BYJKS'04, BBCR'09, …]

Seems to be the "right" analog of entropy:
- Extends entropy
- Has desirable properties, e.g., additivity
- Equals amortized communication

Information Cost

The amount of information the players learn about each other's input from the interaction:

IC_μ(π) = I(X; Π | Y) + I(Y; Π | X)

where X, Y are the players' inputs (random variables distributed according to μ), Π is π's transcript, and I denotes (conditional) mutual information. I(X; Π | Y) is what Bob learns about x from Π; I(Y; Π | X) is what Alice learns about y from Π.
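As a toy illustration (not from the talk), the information cost of a one-message protocol, where Alice sends her bit flipped w.p. 10%, can be computed exactly from the joint distribution of inputs and transcript; `cond_mutual_info` is a hypothetical helper. When y = x the message reveals nothing new; when x and y are independent, Bob learns 1 − H(0.1) ≈ 0.53 bits.

```python
from math import log2
from itertools import product

def cond_mutual_info(joint, a, b, c):
    """I(A; B | C), where joint maps tuples to probabilities and a, b, c
    are the coordinates of A, B, C inside those tuples."""
    def marg(coords):
        out = {}
        for key, p in joint.items():
            sub = tuple(key[i] for i in coords)
            out[sub] = out.get(sub, 0.0) + p
        return out
    pabc, pac, pbc, pc = marg((a, b, c)), marg((a, c)), marg((b, c)), marg((c,))
    total = 0.0
    for (va, vb, vc), p in pabc.items():
        if p > 0:
            total += p * log2(p * pc[(vc,)] / (pac[(va, vc)] * pbc[(vb, vc)]))
    return total

eps = 0.1  # Alice sends her bit x, flipped with probability 10%

# Case 1: y = x, so Bob already knows Alice's bit -> IC is 0.
joint1 = {(x, x, t): 0.5 * ((1 - eps) if t == x else eps)
          for x in (0, 1) for t in (0, 1)}
ic1 = cond_mutual_info(joint1, 0, 2, 1) + cond_mutual_info(joint1, 1, 2, 0)

# Case 2: x, y independent -> Bob learns 1 - H(0.1) ≈ 0.531 bits about x.
joint2 = {(x, y, t): 0.25 * ((1 - eps) if t == x else eps)
          for x, y, t in product((0, 1), repeat=3)}
ic2 = cond_mutual_info(joint2, 0, 2, 1) + cond_mutual_info(joint2, 1, 2, 0)

print(ic1, ic2)
```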

Communication vs Information?

Communication: the number of bits exchanged. Information: the amount of information revealed.

Communication vs Information?

Easy direction "CC ≥ IC": a bit sent by Alice cannot give Bob more than one bit of information about x

⇒ ∀ π, μ: CC(π) ≥ IC_μ(π)

Communication vs Information?

Other direction: CC(π) can be much larger than IC_μ(π)

Interactive Compression Problem (more formal): Given a protocol π, can π be simulated by a protocol π' s.t. CC(π') ≈ IC_μ(π)? Partial compression results are known [BBCR'09, BR'10, BMY'14, …]; [Bra'12]: every protocol can be simulated with 2^O(IC_μ(π)) bits of communication

Data compression is a special case. In data compression, Alice knows the whole message, and thus can compress it altogether. In interactive compression, no player knows the whole conversation before it takes place. One can compress round-by-round, but rounds that give only a little information still require at least one bit of communication each

Why is the Interactive Case More Challenging?

No separation between IC and CC was known!

[BW, KLLRX'12]: Almost all known techniques for lower bounding CC give the same bound for IC

Communication vs Information?

Our Result: First separation of IC and CC. There is an (explicit, boolean) f and a distribution μ s.t.: IC_μ(f) ≤ O(k) but CC(f) ≥ 2^k

Interactive protocols cannot always be compressed to their information content!

New method for proving CC lower bounds: Relative Discrepancy

Tight!

Alice has x_1, …, x_n; Bob has y_1, …, y_n (the copies chosen independently). They want to compute f(x_j, y_j) w.h.p. on each copy. Strong Direct Sum Problem: Does computing n copies simultaneously require n times the communication needed to solve a single copy? Equivalent to compression! […, BR'10]

Direct Sum [80’s]

Corollary of Our Result: Strong Direct Sum doesn’t hold!

Initial motivation for defining IC

Example Separating IC and CC: The Bursting Noise Game (a search problem + a distribution)

Underlying Tree

Complete binary tree. Multilayer = k layers. Depth: 2^(2^k) multilayers.

Alice gets x, Bob gets y. Each input contains a bit for every vertex in the tree. That is, x = (x_v)_v and y = (y_v)_v, where x_v, y_v are bits.

Input size is triple exponential in k!

The input distribution is not a product distribution: x and y are correlated!

[Figure: the tree, a multilayer c, and a vertex v with x_v = 0, y_v = 1]

Typical Vertices

Alice owns the odd layers, Bob owns the even layers. The player who owns v dictates the correct child of v: if Alice owns v and x_v = 0, the left child is correct, otherwise the right.

v is typical (w.r.t. i) if it is in multilayer ≥ i and the sub-path in multilayer i leading to v has ≥ 80% correct children. The typical leaves are the leaves of the subtrees of typical vertices.

Types of vertices:
- Non-noisy v: choose x_v = y_v uniformly at random
- Noisy v: choose x_v, y_v independently, uniformly at random

The Distribution

Randomly select a multilayer i.
- Multilayers before i: set all vertices to non-noisy
- Multilayers after i: set all vertices to noisy (iid)
- Multilayer i: bursting noise. Set non-typical vertices to noisy, typical vertices to non-noisy

[Figure: non-noisy multilayers above the bursting-noise multilayer i, noisy iid vertices below, typical leaves marked]
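The sampling procedure above can be sketched for toy parameters (an illustrative reconstruction, not the paper's code; `K` and `NUM_ML` are tiny stand-ins for the talk's k layers per multilayer and huge number of multilayers):

```python
import random

K = 2        # layers per multilayer (toy stand-in for the talk's k)
NUM_ML = 3   # number of multilayers (toy stand-in for 2^(2^k))
DEPTH = K * NUM_ML  # layers 0 .. DEPTH-1, root at layer 0

def layer(v):
    return v.bit_length() - 1  # heap indexing: root = 1, children 2v, 2v+1

def sample_inputs(rng):
    """Sample (x, y, i): i is the bursting multilayer; x, y hold one bit per vertex."""
    i = rng.randrange(NUM_ML)
    n = 2 ** DEPTH
    x, y = [0] * n, [0] * n

    def correct_fraction(v):
        """Fraction of correct children on v's sub-path within multilayer i."""
        good, steps, u = 0, 0, v
        while layer(u) > i * K:  # walk up to the top of multilayer i
            parent = u // 2
            # The owner's bit dictates the correct child (bit 0 -> left child);
            # Alice owns odd layers, Bob owns even layers.
            bit = x[parent] if layer(parent) % 2 == 1 else y[parent]
            good += (u == 2 * parent + bit)
            steps += 1
            u = parent
        return good / steps if steps else 1.0

    for v in range(1, n):  # parents are processed before their children
        ml = layer(v) // K
        if ml < i:
            x[v] = y[v] = rng.randrange(2)        # non-noisy: equal bits
        elif ml > i or correct_fraction(v) < 0.8:
            x[v], y[v] = rng.randrange(2), rng.randrange(2)  # noisy: independent
        else:
            x[v] = y[v] = rng.randrange(2)        # typical in multilayer i: non-noisy

    return x, y, i

x, y, i = sample_inputs(random.Random(0))
```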

Players' Goal: find and output the same typical leaf

Recall: v is typical if the sub-path in multilayer i leading to v has ≥ 80% correct children

Bursting Noise Game


Typical leaves are rare (prob. 2^(-Ω(k)))

If the players know i, they can solve the game by exchanging O(k) bits.

A binary search finds i by exchanging 2^k bits. That's why we set the depth to 2^(2^k) multilayers. The bursting noise makes the game harder, and thus makes it easier to show a lower bound

CC: Sanity Check
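The binary-search sanity check can be sketched in a simplified model (illustrative, not from the talk): multilayers before the burst have equal bits, later ones independent bits, so comparing a batch of bits per probed multilayer locates the first disagreement in logarithmically many rounds.

```python
import random

def find_burst(x_bits, y_bits, num_ml, probes):
    """Binary-search for the first multilayer where the players' bits disagree;
    returns the guess and the number of bits 'exchanged'."""
    lo, hi, comm = 0, num_ml - 1, 0
    while lo < hi:
        mid = (lo + hi) // 2
        # Alice sends `probes` of her bits from multilayer mid; Bob compares
        # and answers with one bit.
        disagree = any(x_bits[mid][j] != y_bits[mid][j] for j in range(probes))
        comm += probes + 1
        if disagree:
            hi = mid       # the noise starts at or before mid
        else:
            lo = mid + 1   # mid looks non-noisy; the noise starts later
    return lo, comm

rng = random.Random(1)
NUM_ML, WIDTH = 64, 40
i = rng.randrange(NUM_ML)                     # the hidden bursting multilayer
x_bits, y_bits = [], []
for m in range(NUM_ML):
    xs = [rng.randrange(2) for _ in range(WIDTH)]
    ys = xs[:] if m < i else [rng.randrange(2) for _ in range(WIDTH)]
    x_bits.append(xs)
    y_bits.append(ys)

guess, comm = find_burst(x_bits, y_bits, NUM_ML, WIDTH)
print(guess == i, comm)  # log2(NUM_ML) rounds of (WIDTH + 1) bits each
```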


Protocol with Low IC

Starting from the root, at every vertex v:
- The player who owns v sends his bit w.p. 90%, and sends its negation w.p. 10%
- Both players move to the child indicated by the sent bit
Output the leaf reached.

Correctness: by Chernoff, w.h.p. a typical leaf is reached.

The Protocol
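The Chernoff argument can be checked numerically (an illustrative sketch, not from the talk): each step of the walk follows the correct child w.p. 90%, and the probability that at least 80% of k steps are correct tends to 1 as k grows.

```python
import random

def p_typical(k, trials, rng):
    """Estimate Pr[a k-step walk, each step correct w.p. 0.9,
    takes at least 80% correct steps]."""
    good = 0
    for _ in range(trials):
        correct = sum(rng.random() < 0.9 for _ in range(k))
        good += correct >= 0.8 * k
    return good / trials

rng = random.Random(0)
for k in (10, 50, 200):
    print(k, p_typical(k, 20000, rng))  # increases toward 1 as k grows
```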

[Figure: walking down the tree; at a vertex with bit 0, "0" is sent w.p. 90% and its negation w.p. 10%]

Why 90% and not 100%?

If the players always send their true bit, i is revealed, and thus the IC is large. The 10% noise "hides" i.


At a non-noisy vertex, a player learns very little information, as both input bits are the same. At a noisy vertex, he learns 1 bit. W.h.p., a typical leaf was reached and the players only reached noisy vertices in multilayer i. The bursting noise is the "maximal amount of noise" tolerated by this protocol.


IC: Proof Intuition
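A back-of-envelope version of this accounting (illustrative, not from the talk): each noisy vertex leaks about 1 − H(0.1) bits, a non-noisy vertex leaks nothing, so a path with only ~k noisy vertices has information cost O(k) even though its length, i.e., the communication, is huge.

```python
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

leak_noisy = 1 - h(0.1)  # bits learned per noisy vertex: a 10%-flipped fresh bit
leak_non_noisy = 0.0     # the receiver already knows the bit when x_v = y_v

k = 10           # noisy vertices on the path (the burst)
depth = 10**6    # total path length = bits actually communicated
info = k * leak_noisy + (depth - k) * leak_non_noisy
print(info, depth)  # information stays ~0.53*k while communication is the depth
```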

Thank You!