Ulam's Game and Universal Communications Using Feedback. Ofer Shayevitz, June 2006.
Ulam's Game and Universal Communications Using Feedback
Ofer Shayevitz
June 2006
Introduction to Ulam's Game

Are you familiar with this game?
How many yes/no questions are needed to separate 1000 objects?
M objects → log2(M) questions (rounded up)
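As a minimal sketch of the counting above (the function name is ours, not from the talk): with no lies, each yes/no question can at best halve the set of candidates, so separating M objects takes ⌈log2(M)⌉ questions.

```python
import math

def questions_needed(m):
    """Yes/no questions needed to separate m objects when nobody lies."""
    return math.ceil(math.log2(m))

# 2**9 = 512 < 1000 <= 1024 = 2**10, so 1000 objects need 10 questions
assert questions_needed(1000) == 10
```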
What Happens When We Lie?

Separate two objects, one lie allowed: precisely three questions are required!
Separate M objects, one lie allowed: 2 log2(M) + 1 questions are sufficient! But we can do better...
It was shown [Pelc'87] that the minimal number of questions is the least positive integer n satisfying
M(n + 1) ≤ 2^n             if M is even
M(n + 1) + (n − 1) ≤ 2^n   if M is odd

M objects, L lies – very difficult!
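A short sketch of Pelc's condition for the one-lie case, as reconstructed above (the even/odd split and the function name are our reading of the slide):

```python
def pelc_min_questions(m):
    """Least n with m*(n+1) <= 2**n (m even), or
       m*(n+1) + (n-1) <= 2**n (m odd): one-lie search over m objects."""
    n = 1
    while True:
        lhs = m * (n + 1) + (0 if m % 2 == 0 else n - 1)
        if lhs <= 2 ** n:
            return n
        n += 1

assert pelc_min_questions(2) == 3        # two objects, one lie: three questions
```

For Ulam's original question (a million objects, one lie) this condition gives 25 questions.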
Ulam's Game as a Problem of Reliable Communications

Alice (Transmitter)
Bob (Receiver)
Charlie (Adversary)
Forward Channel, Feedback Channel
Communication Rate Defined

Alice transmits one of M possible messages by saying yes/no = 1 bit; M messages → log2(M) bits.
The channel can be used n times (seconds).
Charlie can lie a fraction p of the time: no more than np lies (errors).
Define the communication rate R = log2(M) / n bits/sec.
Channel Capacity Defined

An (M, n) transmission scheme: an agreed procedure of questions/answers between Alice and Bob.
A reliable scheme: after n seconds the message is correctly decoded by Bob.
If for any n there is an (M, n) reliable scheme with rate R, we say R is achievable.
Capacity C(p): the maximal achievable rate. C(0) = ?
R = lim_{n→∞} log2(M) / n
Capacity Behavior

Claim: two messages can always be correctly decoded for p < ½.
Proof: the message is S ∈ {1, 2}. Alice says:
"Yes" n times for S = 1; "No" n times for S = 2.
How will Bob decode? Using a majority rule; since fewer than n/2 answers are lies, this is always correct.
Rate for two messages: R = log2(2) / n = 1/n → 0.
Corollary: can transmit with rate zero for p < ½ (even without feedback...).
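The two-message repetition scheme above can be sketched in a few lines (a minimal simulation; function names are illustrative, and Charlie is given his full budget of ⌊pn⌋ lies):

```python
def encode(s, n):
    """Alice's repetition scheme: all "yes" (1) for S=1, all "no" (0) for S=2."""
    return [1 if s == 1 else 0] * n

def majority_decode(received):
    """Bob's majority rule: fewer than n/2 lies cannot flip the outcome."""
    return 1 if sum(received) > len(received) / 2 else 2

n = 9
word = encode(1, n)
lies = 3                      # Charlie's budget: floor(p*n) with p = 1/3 < 1/2
corrupted = [1 - b for b in word[:lies]] + word[lies:]
assert majority_decode(corrupted) == 1
```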
Capacity Behavior

Claim: C(p) = 0 for p ≥ ⅓.
Proof: we show no reliable three-message scheme exists, so rate > 0 is not achievable.
Assume p = ⅓ and n = 3E + 1 seconds. The message is S ∈ {1, 2, 3}.
General strategy: ask whether S = 1, 2 or 3.
Bob counts "negative votes" against the possible messages; S collects exactly as many votes as the number of lies.
Optimal decision: Bob chooses the message with the least votes (why?).
Success: only S has E (~ ⅓n) votes or less (why?).

Capacity Behavior – Cont.

Charlie's strategy: cause two messages to have E votes or less.
First, vote against the single message. When a message accumulates E + 1 votes it is "out of the race"; if not, all messages have E votes or less...
Now, always vote against the message with the least votes.
Result: Charlie always votes against only one competitive message.

Capacity Behavior – Cont.

Total number of votes against the competitive messages:
N_v ≤ n − (E + 1) = 3E + 1 − (E + 1) = 2E
Before the third message was "out", both competitive messages had no more than E votes each.
After that, they are "balanced" and their sum cannot exceed 2E.
Conclusion: both messages end with no more than E votes each, so Bob cannot separate them!
QED
Capacity Bounds [Berlekamp'64]

The binary entropy function: h_b(p) = −p log2(p) − (1 − p) log2(1 − p)
[Figure: Berlekamp's bounds on C(p).]

Our Result

[Figure: the rate our scheme attains as a function of p.]
When the fraction of lies is unknown in advance, capacity is zero classically. But we can get a positive rate!
Result's Properties

No need to know the fraction of lies (errors) in advance.
Constructive: a specific transmission scheme is introduced.
Variable rate: the better the channel, the higher the rate.
Attains the optimal rate (not elaborated).
Penalty: a negligible error probability, which goes to zero with increasing n.
Key idea: randomization to mislead Charlie.
Taking a Hard Turn...
Message Point Representation

A message is a bit-stream b1, b2, b3, ...
It can also be represented by a point. Start with the unit interval [0, 1).
If b1 = 0 take [0, ½), otherwise take [½, 1).
Assume b1 = 0: if b2 = 0 take [0, ¼), otherwise take [¼, ½).
The finite bit-stream b1, b2, ..., bk is represented by a binary interval of length 2^−k.
The infinite bit-stream is represented by a message point ω = 0.b1b2b3... (in binary).
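The bit-stream-to-interval map above can be sketched directly (a small illustration; the function name is ours):

```python
def binary_interval(bits):
    """Binary interval [low, low + 2**-k) for the finite bit-stream b1..bk."""
    low, length = 0.0, 1.0
    for b in bits:
        length /= 2          # each bit halves the interval
        low += b * length    # b = 1 selects the right half
    return low, length

# [0,1) --b1=0--> [0, 1/2) --b2=1--> [1/4, 1/2)
assert binary_interval([0, 1]) == (0.25, 0.25)
```

As k grows, `low` converges to the message point ω = 0.b1b2b3...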
Transmission of a Message Point

First assume no lies (errors). The message point can be any point in [0, 1).
If ω < ½, Alice transmits a zero; otherwise, she transmits a one.
Now Bob knows ω resides in, say, [0, ½).
If ω is in [0, ¼), Alice transmits another zero; if ω is in [¼, ½), she transmits a one.
In fact, Alice simply transmits the message bits...
Now with Lies...

Let p be the precise fraction of lies.
Assumption I: we know p (and also p < ½).
If ω < ½, Alice transmits a zero; otherwise, she transmits a one.
Bob thinks ω is "more likely" to be in [0, ½), but [½, 1) is also possible...
How can that notion be quantified? What should Alice transmit next?
Message Point Density

We define a density function f_k over the unit interval.
The density function describes our level of confidence (at time k) in the various possible message point positions.
We require ∫₀¹ f_k(θ) dθ = 1 for all k.
Alice steers Bob in the direction of ω; Bob gradually zooms in on ω.
Based on a scheme for a different setting by [Horstein'63].
Start with a uniform density f_0; a_0 is the median point of f_0.
f_1 is the density given the received bit; a_1 is the median point of f_1.
f_2 is the density given the two received bits; a_2 is the median point of f_2.
f_3 is the density given the three received bits; a_3 is the median point of f_3.
Hopefully, after a long time the density concentrates around ω...
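The median-update iteration can be sketched as follows (our own minimal representation of a piecewise-constant density as `(lo, hi, value)` triples; not the talk's notation):

```python
def median(density):
    """Median of a piecewise-constant density given as [(lo, hi, value), ...]."""
    mass = 0.0
    for lo, hi, v in density:
        seg = (hi - lo) * v
        if mass + seg >= 0.5:
            return lo + (0.5 - mass) / v
        mass += seg
    return density[-1][1]

def update(density, a, bit, p):
    """Bob's update on hearing `bit` (0: omega < a, 1: omega >= a), where the
       answer lies with probability p. When a is the median, each half carries
       mass 1/2, so scaling by 2(1-p) and 2p keeps the total mass equal to 1."""
    new = []
    for lo, hi, v in density:
        for (l, h), side in (((lo, min(hi, a)), 0), ((max(lo, a), hi), 1)):
            if h > l:
                new.append((l, h, v * (2 * (1 - p) if side == bit else 2 * p)))
    return new

f0 = [(0.0, 1.0, 1.0)]                # uniform density, a_0 = 1/2
f1 = update(f0, median(f0), 0, 0.25)  # received bit says omega < 1/2
assert f1 == [(0.0, 0.5, 1.5), (0.5, 1.0, 0.5)]
```

Iterating `update` at the successive medians a_k is exactly the zooming-in described above: the indicated side gains density by a factor 2(1 − p), the other side shrinks by 2p.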
Things to be noted...

After k iterations, f_k is constant within each of k + 1 intervals.
ω lies in one of them, the message interval.
f_k(ω) is multiplied by 2p if an error occurred at time k, and by 2(1 − p) otherwise.
There are exactly np errors, therefore:
f_n(ω) = (2p)^(np) · (2(1 − p))^(n(1−p)) = 2^n · p^(np) · (1 − p)^(n(1−p))
Another Assumption

We assumed we know p (Assumption I).
Assumption II: Bob knows the message interval when transmission ends.
These assumptions will be removed later.
If the message interval size is 2^−L, then (since f_n integrates to 1):
f_n(ω) ≤ 2^L  ⇒  L ≥ log2 f_n(ω)
L ≥ log2( 2^n p^(np) (1 − p)^(n(1−p)) ) = n (1 + p log2(p) + (1 − p) log2(1 − p)) = n (1 − h_b(p))
Transmission Rate

A message interval of size 2^−L means L bits can be decoded.
The bit rate is at least
R = L/n ≥ 1 − h_b(p) − 1/n
which tends to 1 − h_b(p), as required.
Assumption I - Removed

p is unknown, but Alice knows p at the end!
Idea: use an estimate p̂_k of p, based on what Alice has observed so far.
Define a noise sequence:
z_k = 1 if Charlie lied at time k, 0 otherwise.
A reasonable estimate is the noise sequence's empirical probability, with a bias needed for uniform convergence:
p̂_k = ( Σ_{j=1}^{k−1} z_j + ½ ) / k
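A one-line sketch of the biased estimate above (the function name is ours):

```python
def kt_estimate(noise_prefix):
    """Add-1/2 (KT) estimate of the lie rate after observing z_1..z_{k-1}."""
    k = len(noise_prefix) + 1
    return (sum(noise_prefix) + 0.5) / k

assert kt_estimate([]) == 0.5           # no observations yet: agnostic
assert kt_estimate([1, 0, 1]) == 0.625  # (2 + 1/2) / 4
```

The ½ bias keeps the estimate strictly inside (0, 1), which is what makes the product bound on the next slide work uniformly over noise sequences.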
This probability estimate is the KT estimate [Krichevsky-Trofimov'81].
Using the KT estimate we get
f_n(ω) = 2^n · ∏_{k=1}^{n} p̂_k^(z_k) (1 − p̂_k)^(1−z_k)
By the KT estimate's properties we get
L ≥ log2 f_n(ω) ≥ n (1 − h_b(p)) − ½ log2(n) − 1
which results in rate
R = L/n ≥ 1 − h_b(p) − (log2(n) + 2) / (2n)
So asymptotically, we lose nothing!
Assumption I* Added...

We made an absurd assumption here. Did you notice?
Bob (the receiver) must know p̂_k as well! That is equivalent to knowing the noise sequence...
Assumption I*: p̂_k can be updated once per B seconds (still needs explaining...).
B = B(n) is called the block size, and may depend on n.
It can be shown that
R = L/n ≥ 1 − h_b(p) − K · B(n) log2(n) / n
So we require
B(n) log2(n) / n → 0
Update Information (UI)

Assume B(n) = √n seconds. UI elements:
Number of ones in the noise sequence in the last block: √n + 1 options → ~½ log2(n) bits.
Current message interval: n options → log2(n) bits.
Must provide Bob with the UI once per block.
The UI is about (3/2) log2(n) bits per √n seconds.
Therefore, the UI rate is (key point!!)
R_UI = 3 log2(n) / (2√n) → 0
If Alice can reliably convey the UI to Bob, then we are done!
Reliable UI – Is That Possible?

Old problem: Charlie may corrupt the UI... Is this different from the original problem?
Yes: the UI rate approaches zero! And remember, rate zero can be attained for p < ½!
Solution's outline:
Random positions per block are agreed upon via feedback.
Bob estimates whether p < ½ or p > ½ in each block: Alice transmits "all zeros" over random positions, and Bob finds the fraction of ones received.
Alice transmits the UI over random positions in each block, repeating each UI bit several times; Bob decodes each bit by a majority/minority rule.
"Bad blocks" (p ~ ½) are thrown away.
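The majority/minority idea in the outline above can be sketched deterministically (a toy illustration with our own names; real positions are random and Charlie is budget-limited rather than fixed):

```python
def repeat_bit(bit, reps):
    return [bit] * reps

def corrupt(word, flips):
    """Charlie spends `flips` lies on this word (here: all up front)."""
    return [1 - b for b in word[:flips]] + word[flips:]

def decode(received, charlie_mostly_lies):
    majority = 1 if sum(received) > len(received) / 2 else 0
    # minority rule when the probe says Charlie lies more than half the time
    return 1 - majority if charlie_mostly_lies else majority

reps = 9
# Probe: Alice sends all zeros so Bob can estimate Charlie's lie rate
probe = corrupt(repeat_bit(0, reps), 7)        # Charlie lies 7/9 > 1/2 here
charlie_mostly_lies = sum(probe) / reps > 0.5
# A UI bit then survives via the minority rule
assert decode(corrupt(repeat_bit(1, reps), 7), charlie_mostly_lies) == 1
```

The point of the probe: a consistent liar is as informative as a truth-teller, and only p ≈ ½ (the "bad blocks") defeats both rules.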
Reliable UI – Cont.

Penalty: a bad estimate causes an error!
It can be shown that the error probability tends to zero: P_e ≤ e^(−εn) → 0.
Throwing away "bad blocks" makes the rate random; the probability of throwing away a good block is small.
A rate approaching 1 − h_b(p) is attained with probability 1 − e^(−εn) → 1.
Summary

Ulam's game introduced.
Analogy to communications with an adversary and feedback.
Classical results presented.
Can do much better with randomization!
Higher rate, adaptive to the channel's (Charlie's) behavior.
Penalty: a vanishing error probability.
Further Results

Much higher rates are possible by using structure in the noise sequence (Charlie's strategy).
Example: assume Charlie lies and tells the truth alternately, so p = ½ and 1 − h_b(½) = 0; our scheme attains rate zero.
But Alice can notice this "stupid" strategy! Alice can lie on purpose to "cancel" Charlie's lies.
Related to universal prediction and universal compression (Lempel-Ziv) of individual sequences.
Generalizations to multiple-choice questions.

Thank You!