Isolation Technique April 16, 2001 Jason Ku Tao Li.


Outline

1) Show that we can reduce NP, with high probability, to US. That is: NP randomized reduces to detecting unique solutions.

2) PH ⊆ P^PP

Isolation Lemma

• Definitions

• Isolation Lemma

• Example of using Isolation Lemma

Definition of weight functions

A weight function W maps a finite set U to ℕ⁺ (the positive integers).

For a set S ⊆ U, W(S) = Σ_{x ∈ S} W(x)

Let F be a family of non-empty subsets of U. A weight function W is “good for” F if there is a unique minimum-weight set in F, and “bad for” F otherwise.

Ex: let U = {u1, u2, u3}, let F = {{u1}, {u2}, {u3}, {u1, u2}}

define W1(u1)=1 W2(u1)=1

W1(u2)=2 W2(u2)=1

W1(u3)=3 W2(u3)=2

W1 is good for F (the unique minimum-weight set is {u1}, with weight 1), while W2 is bad for F ({u1} and {u2} tie at weight 1).
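This example can be checked mechanically. Below is a small sketch (not part of the original notes), encoding each ui as the integer i:

```python
def min_weight_sets(F, W):
    """Return the sets in F that achieve the minimum total weight under W."""
    totals = {S: sum(W[x] for x in S) for S in F}
    m = min(totals.values())
    return [S for S, t in totals.items() if t == m]

# F = {{u1}, {u2}, {u3}, {u1,u2}}, with ui encoded as i
F = [frozenset({1}), frozenset({2}), frozenset({3}), frozenset({1, 2})]
W1 = {1: 1, 2: 2, 3: 3}   # good: unique minimum-weight set {u1}
W2 = {1: 1, 2: 1, 3: 2}   # bad: {u1} and {u2} tie at weight 1

print(len(min_weight_sets(F, W1)))  # 1
print(len(min_weight_sets(F, W2)))  # 2
```

A weight function is good exactly when the returned list has length 1.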

Isolation Lemma

Let U be a finite set

Let F1, F2, …Fm be families of non-empty subsets of U

Let D = ||U||

Let R > mD

Let Z be the set of weight functions s.t. the weight of any individual element of U is at most R (so ||Z|| = R^D)

Let ε, 0 < ε < 1, be s.t. ε > mD/R

Then, more than (1 − ε)||Z|| of the weight functions in Z are good for all of F1, F2, …, Fm.

Proof of Isolation Lemma

Proof sketch:

By definition, a weight function W is bad if some Fi contains at least 2 different minimum-weight sets.

Let S1 and S2 be 2 different sets with the same minimum weight; then ∃x ∈ S1 s.t. x ∉ S2. Call x ambiguous.

If we know the weights of all other elements in U, either x is unambiguous, or there is one specific weight for x that makes x ambiguous.

Let's Count

So, for an x ∈ U, there are at most R^{D−1} weight functions s.t. x is ambiguous.

There are R^D weight functions, m choices of Fi, and D choices of x. Thus the fraction of weight functions that are bad for some Fi is at most mD·R^{D−1}/R^D = mD/R < ε. So the fraction of weight functions good for all Fi is more than 1 − ε.

Example of the Isolation Lemma

Let U = {u1, u2, u3}, so D = 3.

Let F1 = {{u1}, {u1, u3}, {u1, u2, u3}, {u2}}, so m = 1. Take R = 4 > mD = 3; then ||Z|| = R^D = 64.

Then at least (1 − 3/4)·64 = 16 weight functions are good for F1. Four of them:

W1(u1)=1   W2(u1)=2   W3(u1)=1   W4(u1)=1
W1(u2)=2   W2(u2)=3   W3(u2)=2   W4(u2)=3
W1(u3)=3   W2(u3)=4   W3(u3)=2   W4(u3)=3
(6 variations) (6 variations) (3 variations) (3 variations)

That is already 18 variations, and more.
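The bound can also be verified exhaustively for this example. The brute-force sketch below (again encoding ui as i; not part of the original notes) enumerates all 64 weight functions and counts the good ones:

```python
from itertools import product

# F1 = {{u1}, {u1,u3}, {u1,u2,u3}, {u2}}; D = 3, m = 1, R = 4, ||Z|| = 4^3 = 64
F1 = [frozenset({1}), frozenset({1, 3}), frozenset({1, 2, 3}), frozenset({2})]

good = 0
for w1, w2, w3 in product(range(1, 5), repeat=3):   # all 64 weight functions
    W = {1: w1, 2: w2, 3: w3}
    totals = [sum(W[x] for x in S) for S in F1]
    if totals.count(min(totals)) == 1:              # unique minimum-weight set
        good += 1

print(good)  # 48, comfortably above the guaranteed 16
```

Here a weight function is good exactly when w1 ≠ w2 (only {u1} and {u2} can compete for the minimum), giving 64 − 16 = 48 good functions.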

Definition of US

US = {L | (∃ NPTM M)(∀x)[x ∈ L ⟺ #acc_M(x) = 1]}

NP randomized reduces to US

NP ⊆ RP^US

Proof Map:

1) Definitions I, II

2) Apply Isolation Lemma to get a probability

3) Construct an oracle B ∈ US

4) Construct a machine N that uses oracle B

5) Show N^B is an RP machine

6) Show x ∈ L ⟺ N^B accepts x with high probability, so L ∈ RP^US

Definitions I

Let A = {<x, y> | the NPTM M_L on input x accepts along path y}

For L ∈ NP there is a polynomial p s.t. for all x ∈ Σ*: x ∈ L ⟺ there is at least 1 y, |y| = p(|x|), s.t. <x, y> ∈ A

Encode y as follows:

y = y1y2…y_p(n) ↦ {i | 1 ≤ i ≤ p(n) and yi = 1}

Ex: y = 1001 ↦ {1, 4} (1 = take right branch, 0 = take left branch)
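A one-liner (an illustrative sketch, not from the notes) makes the encoding concrete:

```python
def encode_path(y):
    """Encode a path string y = y1 y2 ... yn as the set {i | yi = 1} (1-indexed)."""
    return {i for i, bit in enumerate(y, start=1) if bit == "1"}

print(encode_path("1001"))  # {1, 4}
```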

Definitions II

Let U(x) = {1, 2, …, p(|x|)}

D = ||U|| = p(|x|)

Let F(x) = {y | <x, y> ∈ A} (the collection of accepting paths)

m = 1

Let Z(x) = weight functions that assign weights no greater than 4p(|x|)

R = 4p(|x|)

Applying Isolation Lemma

By the Isolation Lemma:

if x ∈ L, at least 3/4 of the weight functions in Z(x) assign F(x) a unique minimum-weight set (here mD/R = p(|x|)/4p(|x|) = 1/4)

if x ∉ L, then F(x) is empty (no accepting paths), so no weight function assigns F(x) a unique minimum-weight set

Construct an oracle B ∈ US

Let B = {<x, W, j> | W ∈ Z(x), 1 ≤ j ≤ 4p²(|x|), and ∃ a unique y ∈ F(x) s.t. W(y) = j}

NPTM M_B on input u:

1) if u is not of the form <x, W, j>, reject

2) else, using p(|x|) nondeterministic moves, guess y and accept iff <x, y> ∈ A and W(y) = j.

Why B ∈ US

If u ∈ B, there is a unique path y ∈ F(x) s.t. W(y) = j. Thus machine M_B has exactly one accepting path.

If u ∉ B, there are either zero or more than 1 y ∈ F(x) s.t. W(y) = j. Thus machine M_B has either zero or more than 1 accepting path.
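The two cases can be seen on a toy instance. The sketch below (hypothetical F(x) and weights, not from the notes) just counts the accepting paths of M_B directly:

```python
def num_accepting_paths(F, W, j):
    """#acc of M_B on <x, W, j>: the number of y in F(x) with W(y) = j."""
    return sum(1 for y in F if sum(W[i] for i in y) == j)

# Toy F(x): three accepting paths in their set encoding
F = [frozenset({1}), frozenset({2}), frozenset({1, 3})]

W_good = {1: 1, 2: 3, 3: 4}   # path weights 1, 3, 5 -- all distinct
W_tied = {1: 3, 2: 3, 3: 4}   # path weights 3, 3, 7 -- two paths tie at 3

print([num_accepting_paths(F, W_good, j) for j in (1, 3, 5)])  # [1, 1, 1]
print(num_accepting_paths(F, W_tied, 3))                       # 2
```

Under W_good each of <x, W_good, 1>, <x, W_good, 3>, <x, W_good, 5> is in B (exactly one accepting path); under W_tied, <x, W_tied, 3> has two accepting paths and so is not in B.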

Construct an RP machine with oracle B

NPTM N on input x:

1) Pick a random weight function W ∈ Z(x) (each element gets a weight in {1, …, 4p(|x|)}).

2) For each j, 1 ≤ j ≤ 4p²(|x|), ask oracle B whether <x, W, j> ∈ B. If yes for some j, accept; if no for every j, reject.

N^B is an RP machine, and x ∈ L ⟹ N^B accepts x with high probability

For every x ∈ Σ*,

- If x ∈ L, then with probability at least 3/4 (over the choice of W) some query <x, W, j> is in B, so N accepts with probability at least 3/4.

- If x ∉ L, then no <x, W, j> is in B, so N rejects with probability 1.

So,

- x ∈ L ⟹ with high probability N accepts x

- Since x ∈ L implies N accepts with probability at least 3/4 > 1/2, and x ∉ L implies N rejects with probability 1, L ∈ RP^US
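Because everything is finite, N's acceptance probability can be computed exactly on a toy instance (weights in {1, …, 4} over a 3-element universe; a sketch with toy parameters, not those of the actual reduction):

```python
from itertools import product

def N_accepts(F, W, max_j):
    """N accepts iff some j in 1..max_j puts <x, W, j> in the oracle B,
    i.e. exactly one y in F has W(y) = j."""
    return any(sum(1 for y in F if sum(W[i] for i in y) == j) == 1
               for j in range(1, max_j + 1))

F_yes = [frozenset({1}), frozenset({1, 3}), frozenset({1, 2, 3}), frozenset({2})]
F_no = []   # x not in L: no accepting paths at all

for F in (F_yes, F_no):
    hits = sum(N_accepts(F, dict(zip((1, 2, 3), ws)), max_j=12)
               for ws in product(range(1, 5), repeat=3))
    print(hits / 64)   # acceptance probability over the random W
```

On this particular F_yes every one of the 64 weight functions happens to isolate some weight level, so the computed probability is 1.0 (the lemma only promises at least 3/4), and for F_no it is 0.0.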

Definition of ⊕P and #P

⊕P = {L | (∃ NPTM M)(∀x)[x ∈ L ⟺ #acc_M(x) is odd]}

#P = {f | (∃ NPTM M)(∀x)[f(x) = #acc_M(x)]}

Toda’s Theorem: PH ⊆ P^PP

Three major parts to prove it:

• (Valiant & Vazirani) NP ⊆ BPP^⊕P

• Theorem 4.5

• Lemma 4.13

Since BPP^⊕P ⊆ P^#P (Lemma 4.13) and P^#P = P^PP, we get BPP^⊕P ⊆ P^PP.

Σ_k^p ⊆ BPP^⊕P for k = 1, 2, …, i.e., PH ⊆ BPP^⊕P (Theorem 4.5).

(Valiant & Vazirani) NP ⊆ BPP^⊕P

Proof:

• Let A ∈ NP, A = L(M), where M runs in time p(|x|).

• Let B = {(x, w, k) : M has an odd number of accepting paths on input x having weight k under w}, where w : {1, …, p(|x|)} → {1, …, 4p(|x|)}.

B ∈ ⊕P

(Valiant & Vazirani) cont.

• For a BPP^⊕P algorithm, consider

On input x

Randomly pick w

for k := 1 to 4p²(|x|)

ask if (x,w,k) is in B

if so, then halt and accept

end-for

if control reaches here, then halt and reject

Note

• The Valiant & Vazirani theorem is relativizable.

In other words, we have

NP^A ⊆ BPP^(⊕P^A)

for every oracle A

Theorem 4.5: PH ⊆ BPP^⊕P

We prove it by induction

Three steps for induction step:

• Apply Valiant & Vazrani to the base machine

• Swap BPP and P in the middle

• Collapse: BPP^BPP ⊆ BPP, ⊕P^⊕P ⊆ ⊕P

Step 1 for Thm. 4.5

• Induction hypothesis: Σ_k^p ⊆ BPP^⊕P

• Since NP^A ⊆ BPP^(⊕P^A) for every oracle A,

Σ_{k+1}^p = NP^(Σ_k^p) ⊆ NP^(BPP^⊕P) ⊆ BPP^(⊕P^(BPP^⊕P))

Step 2: Swapping

• By Lemma 4.9,

⊕P^(BPP^A) ⊆ BPP^(⊕P^A)

Hence

Σ_{k+1}^p ⊆ BPP^(BPP^(⊕P^⊕P))

Step 3: Collapse

• Proposition 4.6: BPP^(BPP^A) = BPP^A

• Proposition 4.8: ⊕P^⊕P = ⊕P

• Hence

Σ_{k+1}^p ⊆ BPP^(BPP^(⊕P^⊕P)) = BPP^(⊕P^⊕P) = BPP^⊕P

Toda’s Theorem

• Proposition 4.15: P^PP = P^#P

• Toda’s Theorem: PP is hard for the polynomial hierarchy:

PH ⊆ P^PP = P^#P

Proof of BPP^⊕P ⊆ P^#P

• Let A ∈ BPP^⊕P, where A is accepted by M^B with B ∈ ⊕P, and let f be the #P function for B (x ∈ B ⟺ f(x) is odd). Let n^k bound the running time of M.

• Assume first that M makes only one query along any path. Then let g(x, y) be the #P function defined as the number of accepting paths of the following machine:

Proof cont. 1

On input x, y: run M(x) along path y; when a query “w ∈ B?” is made, flip a coin c ∈ {0, 1} and use c as the oracle answer, then continue simulating M(x); if the simulation accepts, then generate f(w) + (1 − c) paths and accept.

Proof Cont. 2

• g(x, y) is odd if and only if M^B(x) accepts along y

• For g(x,y), consider a #P function g’(x,y) such that :

if g(x, y) is odd, then g'(x, y) ≡ 1 (mod 2^(n^k))

if g(x, y) is even, then g'(x, y) ≡ 0 (mod 2^(n^k))

• Define h(x) = Σ_y g'(x, y)

Proof Cont.3

• The value h(x) mod 2^(n^k) equals the number of y’s such that M^B(x) accepts along path y

• Our P^#P algorithm: on input x, use the oracle to compute h(x) and decide whether

h(x) mod 2^(n^k) ≥ (3/4) · 2^(n^k)

• if so, x is accepted, and if not x is rejected

Proof Cont. 4

If M makes more than one query, modify g(x, y) as follows:

on input x,y

repeat

run M(x) along path y

when a query “w is in B?’’ is made

then flip a coin c in {0,1} and generate f(w)+(1-c) paths

use c as the oracle answer and continue simulating M(x)

until no more queries are asked;

if the simulation of M(x) along path y accepts with this sequence of guessed oracle queries

then accept

else reject

Proof Cont. 5

• Call the above machine N

• Claim: M^B accepts x along y if and only if #acc_N(x, y) = g(x, y) is odd

Fact 1

• Let k ∈ ℕ and f ∈ #P. Then there exists g ∈ #P such that

if f(x) is odd, then g(x) ≡ 1 (mod 2^(n^k))

if f(x) is even, then g(x) ≡ 0 (mod 2^(n^k))
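The notes don't spell out how such a g is built; the standard construction iterates a modulus-doubling polynomial. The sketch below numerically checks the key property of Toda's polynomial t(a) = 3a⁴ + 4a³: it preserves the residues 0 and −1 while squaring the modulus, so applying it about log₂(n^k) times starting from f(x) mod 2 reaches modulus 2^(n^k). (Toda's version uses the −1/0 residue convention rather than the 1/0 convention stated above.)

```python
def t(a):
    """Toda's amplifying polynomial: 3a^4 + 4a^3."""
    return 3 * a**4 + 4 * a**3

# If a = -1 (mod m) then t(a) = -1 (mod m^2); if a = 0 (mod m) then t(a) = 0 (mod m^2).
# Check this exhaustively for small moduli along the chain 2 -> 4 -> 16 -> 256.
for m in (2, 4, 16, 256):
    for a in range(2000):
        if a % m == m - 1:
            assert t(a) % (m * m) == m * m - 1
        elif a % m == 0:
            assert t(a) % (m * m) == 0

print("amplification property verified")
```

Since #P is closed under composition with such polynomials (sums and products of #P functions stay in #P), iterating t yields the g promised by Fact 1.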

Fact 2

• Let f(x, y) be a #P function. Then

g(x) = Σ_{y : |y| = |x|^k} f(x, y)

is also in #P.

• Let M be the machine such that #acc_M(x, y) = f(x, y). Consider the following machine M':

on input x

compute |x|^k

guess y of length |x|^k

run M(x, y)

• g(x) = #acc_{M'}(x)

Discussions

• UL/Poly = NL/Poly

• Is UL = NL?

• Is UP = NP?

• NP^PSPACE = UP^PSPACE = PSPACE

• There is an oracle relative to which NP ≠ UP.

Conclusions

We’ve shown, by use of the Isolation Lemma, that NP ⊆ RP^US ⊆ BPP^⊕P.

This was the base case of an inductive proof showing PH ⊆ BPP^⊕P.

From there we extended to Toda’s theorem: PH ⊆ P^PP = P^#P.