Transcript of "Exponential Separation of Information and Communication", Gillat Kol (IAS), joint work with Anat Ganor (Weizmann) and Ran Raz (Weizmann + IAS).

Page 1:

Gillat Kol (IAS)

joint work with Anat Ganor (Weizmann)

Ran Raz (Weizmann + IAS)

Exponential Separation of Information and Communication

Page 2:

Information theory was developed by Claude Shannon to study one-way data transmission ("A Mathematical Theory of Communication", 1948).

It has had a profound impact on many fields of science. In particular, it is an incredibly useful tool in TCS.

Recently, computational aspects of information theory have been studied as a goal in their own right.

Information Theory

Page 3:

Interactive protocols performing a computation are central in TCS (interactive proofs, communication complexity, cryptography, distributed computing, …)

Interactive information theory extends classical information theory to the interactive setting, where information flows in several directions:
Interactive coding (cf. noisy coding)
Interactive compression (cf. data compression)
…

Interactive Information Theory

this talk

Page 4:

Data Compression

Alice has a string x, chosen according to a publicly known distribution μ. She wants to send x to Bob. How many bits does Alice need to send, so that Bob can retrieve x w.h.p.?

Answer [Shannon'48, Huffman'52]: ≈ H(x) bits, where H is the entropy function ("unpredictability").
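As a quick illustration of the Shannon/Huffman answer (my own sketch, not part of the talk), the snippet below computes H(x) for a toy distribution and compares it with the average code length of a Huffman code; for dyadic probabilities the two coincide exactly.

```python
# Illustration (not from the slides): Shannon entropy vs. the average length
# of a Huffman code, for a toy distribution over symbols.
import heapq
from math import log2

def entropy(p):
    """Shannon entropy H(p) in bits."""
    return -sum(q * log2(q) for q in p.values() if q > 0)

def huffman_lengths(p):
    """Code length (in bits) of each symbol under an optimal Huffman code."""
    # Heap items: (probability, tie-breaker, {symbol: depth_so_far}).
    heap = [(q, i, {s: 0}) for i, (s, q) in enumerate(p.items())]
    heapq.heapify(heap)
    counter = len(heap)
    while len(heap) > 1:
        q1, _, d1 = heapq.heappop(heap)
        q2, _, d2 = heapq.heappop(heap)
        merged = {s: d + 1 for s, d in {**d1, **d2}.items()}
        heapq.heappush(heap, (q1 + q2, counter, merged))
        counter += 1
    return heap[0][2]

p = {"a": 0.5, "b": 0.25, "c": 0.125, "d": 0.125}
lengths = huffman_lengths(p)
avg_len = sum(p[s] * lengths[s] for s in p)
print(f"H(x)               = {entropy(p):.3f} bits")
print(f"avg Huffman length = {avg_len:.3f} bits")  # equals H(x) for dyadic p
```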

Page 5:

Data Compression Theorem [S'48, H'52]: Any message can be compressed to its "information content".

Interactive Compression Problem [BBCR'09]: Assume Alice and Bob engage in an interactive communication protocol (i.e., a conversation). Can the protocol's transcript be compressed to its "information content"?

Page 6:

Alice has input x, Bob has input y. They want to compute f(x, y) (f is public). How many bits do they need to exchange? Applications to circuit complexity, streaming algorithms, data structures, distributed computing, property testing, …

Communication Complexity [Yao'79]

Protocol (adaptive!): Alice sends m1(x), Bob sends m2(y, m1), Alice sends m3(x, m1, m2), …; at the end, f(x, y) is output.

Examples: EQ, DISJ, …
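To make "protocol" concrete, here is a standard public-coin protocol for EQ (a textbook example, not taken from the slides): the players compare a few random parities of their strings, so a constant number of exchanged bits suffices for constant error, far less than sending x itself.

```python
# Illustrative public-coin protocol for EQ(x, y) = [x == y] (standard textbook
# protocol, not from the talk): compare random parities <x, r> and <y, r>.
import random

def inner_product_mod2(s, r):
    return sum(a & b for a, b in zip(s, r)) % 2

def eq_protocol(x, y, reps=20, rng=random.Random(0)):
    """Returns True iff all random parity checks agree.
    Communication: Alice sends `reps` bits (plus Bob's one-bit answer)."""
    n = len(x)
    for _ in range(reps):
        r = [rng.randint(0, 1) for _ in range(n)]   # public randomness
        if inner_product_mod2(x, r) != inner_product_mod2(y, r):
            return False  # definitely x != y
    return True  # x == y, or error w.p. 2^-reps when x != y

x = [random.randint(0, 1) for _ in range(1000)]
y = list(x)
z = list(x)
z[7] ^= 1
print(eq_protocol(x, y))  # True  (always correct when x == y)
print(eq_protocol(x, z))  # False (w.h.p. when x != y)
```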

Page 7:

(x, y) is chosen according to a publicly known joint distribution μ. The players may use private and public randomness. They need to compute f(x, y) with good probability over (x, y) ~ μ and the randomness.
Communication Complexity of a protocol π, CC(π): the max number of bits exchanged, over (x, y) ~ μ and the randomness.
Communication Complexity of a function f, CC_μ(f): the minimum of CC(π) over all protocols π computing f.

Distributional CC

Page 8:

Interactive Compression Problem [BBCR'09]: Can the protocol's transcript be compressed to its "information content"?

But how do we measure the information content of an interactive protocol?

Answer: Information Cost! [CSWY‘01,BYJKS‘04,BBCR‘09,…]

Seems to be the "right" analog of entropy:
– Extends entropy
– Has desirable properties, e.g., additivity, and it equals amortized communication

Page 9:

Information Cost

The amount of information the players learn about each other's input from the interaction.

X, Y are random variables distributed according to μ; Π = Π(X, Y) is π's transcript.

IC_μ(π) = I(Π; Y | X) + I(Π; X | Y)
I(Π; Y | X): what Alice learns about Y from Π
I(Π; X | Y): what Bob learns about X from Π
I = mutual information

Page 10:

Information Cost

The amount of information the players learn about each other's input from the interaction:

IC_μ(π) = I(Π; Y | X) + I(Π; X | Y)
         = (what Alice learns about Y from Π) + (what Bob learns about X from Π)
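As a toy numerical example of this definition (mine, not from the slides), the snippet below computes IC_μ(π) for the simplest protocol imaginable, in which Alice just sends her one input bit, under two distributions: independent inputs and perfectly correlated inputs.

```python
# Toy computation of internal information cost (illustrative, not from the talk).
# Protocol pi: Alice sends her one-bit input x; the transcript is T = x.
# IC_mu(pi) = I(T; Y | X) + I(T; X | Y).
from collections import defaultdict
from math import log2

def cond_mutual_info(triples):
    """I(A; B | C) for a distribution given as {(a, b, c): prob}."""
    pc, pac, pbc = defaultdict(float), defaultdict(float), defaultdict(float)
    for (a, b, c), p in triples.items():
        pc[c] += p
        pac[(a, c)] += p
        pbc[(b, c)] += p
    return sum(p * log2(p * pc[c] / (pac[(a, c)] * pbc[(b, c)]))
               for (a, b, c), p in triples.items() if p > 0)

def ic(mu):
    """mu: {(x, y): prob}.  Transcript of the toy protocol is T = x."""
    joint = {(x, x, y): p for (x, y), p in mu.items()}          # (T, X, Y)
    i_t_x_given_y = cond_mutual_info(joint)                     # I(T; X | Y)
    i_t_y_given_x = cond_mutual_info(                           # I(T; Y | X)
        {(t, y, x): p for (t, x, y), p in joint.items()})
    return i_t_x_given_y + i_t_y_given_x

independent = {(x, y): 0.25 for x in (0, 1) for y in (0, 1)}
correlated  = {(b, b): 0.5 for b in (0, 1)}
print(ic(independent))  # 1.0 bit  (Bob learns Alice's bit in full)
print(ic(correlated))   # 0.0 bits (Bob already knows Alice's bit)
```

The two cases foreshadow the "noisy" and "non-noisy" vertices used later in the talk.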

Page 11:

Communication vs Information?

Information: the amount of information revealed
Communication: the number of bits exchanged

Page 12:

Communication vs Information?

Easy direction "CC ≥ IC": a bit sent by Alice cannot give Bob more than 1 bit of information about x.

⇒ ∀π, μ: CC(π) ≥ IC_μ(π)
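The step behind this inequality is the standard chain-rule argument (my phrasing): the t-th transcript bit is a function of the sender's input and the previous messages, so it contributes nothing to the sum conditioned on the sender's input and at most one bit to the other sum.

```latex
% Sketch of CC(pi) >= IC_mu(pi) via the chain rule (standard argument).
\begin{align*}
IC_\mu(\pi) &= I(\Pi; X \mid Y) + I(\Pi; Y \mid X)\\
            &= \sum_{t=1}^{CC(\pi)}
               \Big[\, I(\Pi_t; X \mid Y, \Pi_{<t}) + I(\Pi_t; Y \mid X, \Pi_{<t}) \,\Big]
               \qquad \text{(chain rule)}\\
            &\le \sum_{t=1}^{CC(\pi)} 1 \;=\; CC(\pi),
\end{align*}
% since for each $t$ the term conditioned on the sender's input is $0$,
% and the other term is at most $H(\Pi_t) \le 1$.
```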

Page 13:

Communication vs Information?

Other direction: can CC be much larger than IC?

Interactive Compression Problem (more formal): Given a protocol π, can π be simulated by a protocol π' s.t. CC(π') ≈ IC_μ(π)? [BBCR'09, BR'10, BMY'14, …]
[Bra'12]: every π can be simulated by a π' with CC(π') ≤ 2^{O(IC_μ(π))}.

Page 14:

Data compression is a special case. In data compression, Alice knows the whole message, and can thus compress it all at once. In interactive compression, no player knows the whole conversation before it takes place. One can compress round-by-round, but a round that conveys only a tiny amount of information still requires at least 1 bit of communication.

Why is the Interactive Case More Challenging?
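A one-line version of this obstacle (my arithmetic, just restating the slide's point):

```latex
% T rounds, each conveying only \varepsilon bits of information:
\[
  IC_\mu(\pi) \;\approx\; \varepsilon\,T
  \qquad\text{while naive round-by-round compression sends}\qquad
  \ge 1 \text{ bit per round} \;=\; T \;\gg\; \varepsilon\,T .
\]
```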

Page 15:

Conclude:

No separation between IC and CC was known!

[BW, KLLRX'12]: Almost all known techniques for lower bounding CC give the same lower bound for IC.

Communication vs Information?

Page 16:

Our Result: First Separation of IC and CC
There exist an (explicit, boolean) f and a distribution μ s.t.: IC_μ(f) ≤ O(k) but CC_μ(f) ≥ 2^{Ω(k)}. This is tight, by the 2^{O(IC)} simulation of [Bra'12].

Interactive protocols cannot always be compressed to their information content!

New method for proving CC lower bounds: Relative Discrepancy

Page 17:

Alice has x_1, …, x_n. Bob has y_1, …, y_n (each (x_i, y_i) ~ μ, independently). They want to compute f(x_i, y_i) w.h.p. on each copy.
Strong Direct Sum Problem: Does computing n copies simultaneously require n times the communication needed to solve a single copy?
Equivalent to compression! […, BR'10]

Direct Sum [80's]

Corollary of Our Result: Strong Direct Sum doesn’t hold!

Initial motivation for defining IC

Page 18:

Example Separating IC and CC: The Bursting Noise Game
(a search problem + a distribution μ)

Page 19:

Underlying Tree

Complete binary tree, partitioned into multilayers (each multilayer = a block of consecutive layers); the depth is a large number of multilayers.

Alice gets x, Bob gets y. Each input contains a bit for every vertex in the tree. That is, x = (x_v)_v and y = (y_v)_v, where the x_v, y_v are bits.

Input size is triple exponential in k!

[Figure: the tree, with a multilayer c and a vertex v marked]

Page 20:

Underlying Tree

Complete binary tree, partitioned into multilayers; the depth is a large number of multilayers.

Alice gets x, Bob gets y. Each input contains a bit for every vertex in the tree.

μ is not a product distribution: x and y are correlated!

[Figure: the tree, with a multilayer c and a vertex v marked]

Page 21:

Typical Vertices

Alice owns the odd layers, Bob owns the even layers. The player who owns v dictates the correct child of v: if Alice owns v and x_v = 0, the left child is correct, otherwise the right child is.

[Figure: a vertex v with x_v = 0 and y_v = 1]

Page 22:

Typical Vertices

Alice owns the odd layers, Bob owns the even layers. The player who owns v dictates the correct child of v: if Alice owns v and x_v = 0, the left child is correct, otherwise the right child is.

v is typical (w.r.t. x, y) if it sits just below multilayer i and the sub-path in multilayer i leading to v has ≥ 80% correct children. The leaves in the subtrees of typical vertices are the typical leaves.

[Figure: multilayer i, a sub-path with ≥ 80% correct children, the typical vertices, and the typical leaves below them]
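A small sketch of the typicality check (my encoding; the 80% threshold and the "owner's bit 0 means left child" rule are from the slides, while the symmetric rule for Bob-owned vertices and all names are my assumptions):

```python
# Sketch (my encoding, not the paper's): check the "typical" condition for the
# sub-path inside one multilayer.  A step is "correct" if it goes to the child
# dictated by the owner's bit: owner's bit 0 -> left child, otherwise right.
def step_is_correct(owner, x_bit, y_bit, went_left):
    owner_bit = x_bit if owner == "Alice" else y_bit
    return went_left == (owner_bit == 0)

def is_typical(steps, threshold=0.8):
    """steps: list of (owner, x_bit, y_bit, went_left) along the sub-path
    inside the multilayer leading to v."""
    correct = sum(step_is_correct(*s) for s in steps)
    return correct >= threshold * len(steps)

# Example: a sub-path of 10 steps, 9 of them following the owner's bit.
path = [("Alice", 0, 1, True)] * 9 + [("Bob", 0, 1, True)]
print(is_typical(path))  # True: 9/10 >= 80% correct
```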

Page 23:

The Distribution μ

Types of vertices:
Non-noisy v: choose x_v = y_v at random (the two bits are equal)
Noisy v: choose x_v, y_v independently at random

[Figure: multilayer i and the typical leaves]
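A minimal sampler for the two vertex types (my code; reading "at random" as uniform bits):

```python
# Sample (x_v, y_v) for a single vertex, per the slide's two types
# (my code; uniform bit values are my assumption of "at random").
import random

def sample_vertex(noisy, rng=random):
    if noisy:
        return rng.randint(0, 1), rng.randint(0, 1)  # independent bits
    b = rng.randint(0, 1)
    return b, b                                      # equal bits

print(sample_vertex(noisy=False))  # e.g. (1, 1)
print(sample_vertex(noisy=True))   # e.g. (0, 1)
```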

Page 24:

The Distribution μ

Randomly select a multilayer i.

[Figure: the tree with a randomly selected multilayer i; regions labeled non-noisy and noisy (iid)]

Page 25:

The Distribution μ

Randomly select a multilayer i.
Multilayers above i: set all vertices to non-noisy.
Multilayers below i: set all vertices to noisy (iid).

[Figure: the tree with multilayer i marked, a non-noisy region, and a noisy (iid) region]

Page 26:

The Distribution μ

Randomly select a multilayer i.
Multilayers above i: set all vertices to non-noisy.
Multilayers below i: set all vertices to noisy (iid).
Multilayer i: bursting noise. Set the non-typical vertices to noisy and the typical vertices to non-noisy.

[Figure: the bursting (noisy) multilayer i between the non-noisy and noisy (iid) regions]

Page 27:

The Distribution μ

Randomly select a multilayer i.
Multilayers above i: set all vertices to non-noisy.
Multilayers below i: set all vertices to noisy (iid).
Multilayer i: bursting noise. Set the non-typical vertices to noisy and the typical vertices to non-noisy.

[Figure: the bursting (noisy) multilayer i, the non-noisy and noisy (iid) regions, and the typical leaves]

Page 28:

Bursting Noise Game

Players' Goal: find and output the same typical leaf.

Recall: v is typical if the sub-path in multilayer i leading to v has ≥ 80% correct children.

[Figure: the bursting multilayer i, the non-noisy and noisy (iid) regions, and the typical leaves]

Page 29:

Sanity Check

Typical leaves are rare (small probability).

If the players know i, they can solve the game by exchanging few bits.

A binary search finds i, at a cost of roughly log(#multilayers) bits; that's why the number of multilayers is taken to be so large. The bursting noise makes the game harder, and therefore easier to show a lower bound for.

[Figure: the bursting multilayer i, the non-noisy and noisy (iid) regions, and the typical leaves]
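The binary-search remark suggests a procedure along the following lines. This sketch is entirely mine and rests on my reading of the earlier slides: multilayers above i are non-noisy (x_v = y_v), multilayers below i are noisy (independent bits), and for simplicity I treat the burst multilayer itself as fully noisy. Alice reveals her bits at a few vertices of a probed multilayer, Bob checks agreement against his own bits, and the boundary is located with roughly log(#multilayers) probes.

```python
# Hedged sketch (mine, not the paper's protocol): locate the bursting multilayer
# i by binary search, assuming multilayers above i are non-noisy (x_v == y_v)
# and multilayers at/below i have independent bits.
import random

def make_inputs(num_multilayers, verts_per_multilayer, i, rng):
    x, y = [], []
    for m in range(num_multilayers):
        xs, ys = [], []
        for _ in range(verts_per_multilayer):
            if m < i:                      # above the burst: equal bits
                b = rng.randint(0, 1)
                xs.append(b)
                ys.append(b)
            else:                          # at/below the burst: independent bits
                xs.append(rng.randint(0, 1))
                ys.append(rng.randint(0, 1))
        x.append(xs)
        y.append(ys)
    return x, y

def find_burst(x, y, tests_per_probe=20):
    """Binary search for the first multilayer where the bits may disagree.
    Communication: ~log2(#multilayers) * tests_per_probe bits from Alice."""
    lo, hi = 0, len(x)            # invariant: the burst index is in [lo, hi]
    while lo < hi:
        mid = (lo + hi) // 2
        # Alice sends her bits at a few vertices of multilayer `mid`;
        # Bob checks agreement with his own bits.
        agree = all(x[mid][t] == y[mid][t] for t in range(tests_per_probe))
        if agree:
            lo = mid + 1          # looks non-noisy: burst is strictly below
        else:
            hi = mid              # disagreement found: burst is at or above
    return lo

rng = random.Random(1)
i = 137
x, y = make_inputs(num_multilayers=1024, verts_per_multilayer=64, i=i, rng=rng)
print(find_burst(x, y), "vs true burst", i)  # matches w.h.p.
```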

Page 30:

Protocol with Low IC

Page 31:

The Protocol (up to a bug fix)

Starting from the root, at every vertex v:
– The player who owns v sends his bit w.p. 90%, and sends its negation w.p. 10%.
– Both players move to the child indicated by the sent bit.
Output the leaf reached.

Correctness: by Chernoff, w.h.p. a typical leaf is reached.

[Figure: at a vertex, the owner's bit is sent w.p. 90% and its negation w.p. 10%; the bursting multilayer i, the non-noisy and noisy (iid) regions, and the typical leaves]
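To see why Chernoff applies (my reasoning, consistent with the slide): within the bursting multilayer, a step follows the correct child exactly when the owner's bit is sent un-flipped, i.e., independently with probability 90%, so the fraction of correct steps concentrates above the 80% threshold. The simulation below uses arbitrary sub-path lengths of my choosing.

```python
# Simulation (mine): within one multilayer, each step of the 90/10 walk follows
# the correct child independently w.p. 0.9.  Estimate how often the walk ends
# at a typical vertex, i.e. with >= 80% correct steps.
import random

def walk_is_typical(path_len, p_truthful=0.9, threshold=0.8, rng=random):
    correct = sum(rng.random() < p_truthful for _ in range(path_len))
    return correct >= threshold * path_len

def failure_prob(path_len, trials=20_000, rng=random.Random(0)):
    fails = sum(not walk_is_typical(path_len, rng=rng) for _ in range(trials))
    return fails / trials

for L in (20, 50, 100, 200):   # sub-path length inside the multilayer (arbitrary)
    print(f"len={L:4d}  Pr[not typical] ~ {failure_prob(L):.4f}")
```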

Page 32:

Why 90% and not 100%?

If the players always sent their true bits, the bits along the path would be fully revealed, and the information cost would be high.

The 10% flips "hide" the players' bits.

[Figure: the bursting multilayer i, the non-noisy and noisy (iid) regions, and the typical leaves]

Page 33:

Low IC: Proof Intuition

At a non-noisy vertex, a player learns very little information, as both input bits are the same. At a noisy vertex, he learns about 1 bit. W.h.p. a typical leaf was reached, and the players only reached noisy vertices in multilayer i. The bursting noise is the "maximal amount of noise" tolerated by this protocol.

[Figure: the bursting multilayer i, the non-noisy and noisy (iid) regions, and the typical leaves]
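Putting numbers on this intuition (my calculation, consistent with the slide's "about 1 bit"): at a noisy vertex the receiver learns I(sent bit; owner's bit) = 1 − H(0.1) ≈ 0.53 bits about the other player's input, while at a non-noisy vertex he learns nothing new, since he already knows the owner's bit equals his own.

```python
# What the receiver learns about the owner's bit from one 90/10 announcement
# (my calculation; "noisy" = independent bits, "non-noisy" = equal bits).
from math import log2

def h(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * log2(p) - (1 - p) * log2(1 - p)

flip = 0.10                # probability the owner sends the negated bit

# Noisy vertex: the receiver's prior on the owner's bit is uniform, and the
# announcement is that bit through a BSC(flip)  =>  gain = 1 - H(flip).
noisy_gain = 1 - h(flip)

# Non-noisy vertex: the receiver already knows the owner's bit (it equals his
# own), so the announcement carries no new information about the input.
non_noisy_gain = 0.0

print(f"noisy vertex:     ~{noisy_gain:.2f} bits learned")
print(f"non-noisy vertex:  {non_noisy_gain:.2f} bits learned")
```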

Page 34:

Thank You!