Estimation from Quantized Signals


Transcript of Estimation from Quantized Signals

Page 1: Estimation from Quantized Signals

Estimation from Quantized Signals

Cheng Chang

Page 2: Estimation from Quantized Signals

Outline of the talk

• Decentralized Estimation

• Model of Random Quantization

• Non-isotropic Decentralized Quantization

• Isotropic Decentralized Quantization

• Conclusions

Page 3: Estimation from Quantized Signals

Decentralized Estimation from Quantized Signals

Page 4: Estimation from Quantized Signals

Model of Random Quantization

What is a quantizer?

A nonlinear system whose purpose is to transform the input sample into one of a finite set of prescribed values. [Oppenheim and Schafer]

The input is a random variable in R^L; in this talk, it always has a FINITE support set.

Page 5: Estimation from Quantized Signals

Model of Random Quantization


Page 6: Estimation from Quantized Signals

Model of Random Quantization

Definition of random quantization: a map from a subspace (the support set of the input) of R^L to the M-dimensional probability simplex.

M is the size of the output set.

Estimation is needed in the fusion center.

Deterministic quantization and non-subtractive dithering are special cases of random quantization.
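As a concrete illustration of this model for L = 1: a random quantizer maps each input x to a probability vector over the M output levels and then samples one level. The linear-interpolation rule below is a minimal example of my own, not from the talk; putting all mass on a single level recovers an ordinary deterministic quantizer.

```python
import numpy as np

def random_quantizer(x, levels, rng):
    """Random quantizer: map input x to a probability vector over the
    M output levels, then sample one level from that distribution.
    The probabilities interpolate linearly between the two nearest
    levels (a simple randomized-rounding rule)."""
    levels = np.asarray(levels, dtype=float)
    M = len(levels)
    probs = np.zeros(M)
    if x <= levels[0]:
        probs[0] = 1.0
    elif x >= levels[-1]:
        probs[-1] = 1.0
    else:
        i = np.searchsorted(levels, x) - 1   # levels[i] <= x < levels[i+1]
        t = (x - levels[i]) / (levels[i + 1] - levels[i])
        probs[i], probs[i + 1] = 1.0 - t, t
    return rng.choice(M, p=probs), probs

rng = np.random.default_rng(0)
out, probs = random_quantizer(0.3, [0.0, 0.5, 1.0], rng)
```

With levels (0, 0.5, 1) and x = 0.3, this rule yields probs = (0.4, 0.6, 0), so the reconstructed level is unbiased: 0.4·0 + 0.6·0.5 = 0.3.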

Page 7: Estimation from Quantized Signals

Model of Random Quantization

L=1, M=3

Page 8: Estimation from Quantized Signals

Model of Random Quantization

(N,M) quantizer-network: N independent (not necessarily identical) quantizers, each with M quantization levels.

• Lemma 1: An optimal (1,M) quantizer-network exists, and it is deterministic.

• How to find it is another story, outside this talk's scope.

• Lemma 2: For any (N,M) quantizer-network, there is an equivalent (same input, same output) (1, M^N) quantizer-network. – An (N,M) network cannot do better than the optimal (1, M^N) quantizer.
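Lemma 2's equivalence rests on the observation that a tuple of N outputs, each from an M-letter alphabet, can be encoded losslessly as one symbol from an alphabet of size M^N. A minimal sketch (function names are mine):

```python
def fuse(outputs, M):
    """Encode the tuple of N sensor outputs (each in {0,...,M-1})
    as a single symbol in {0,...,M**N - 1} via base-M expansion."""
    idx = 0
    for o in outputs:
        idx = idx * M + o
    return idx

def unfuse(idx, N, M):
    """Invert fuse(): recover the N individual sensor outputs."""
    outputs = []
    for _ in range(N):
        outputs.append(idx % M)
        idx //= M
    return outputs[::-1]

# N = 3 binary sensors (M = 2): 2**3 = 8 joint symbols
assert fuse([1, 0, 1], 2) == 5
assert unfuse(5, 3, 2) == [1, 0, 1]
```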

Page 9: Estimation from Quantized Signals

Non-Isotropic Quantization

• Def: Sensors can differ from one another, and each sensor sends its ID to the fusion center.

• Theorem 1: There exists an (N,M) non-isotropic quantizer-network that does as well as the optimal (deterministic) (1, M^N) quantizer. – Proof: There is a bijective map from the set of deterministic non-isotropic (N,M) quantizer-networks to the set of deterministic (1, M^N) quantizers.

• The ith sensor sends the ith base-M digit of the output of the (1, M^N) quantizer.
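The digit-splitting construction can be sketched as follows, using a toy uniform quantizer with M^N cells on [0, 1) as a stand-in for the optimal one (the uniform quantizer and the names are my assumptions, not the talk's): sensor i computes the ith base-M digit of the cell index directly from x, and the fusion center reassembles the index.

```python
def digit_quantizer(x, i, N, M, lo=0.0, hi=1.0):
    """Sensor i's deterministic M-level map: the i-th base-M digit of
    the cell index of a uniform quantizer with M**N cells on [lo, hi).
    Each sensor needs only x; together the N digits identify the cell."""
    cells = M ** N
    idx = min(int((x - lo) / (hi - lo) * cells), cells - 1)
    return (idx // M ** (N - 1 - i)) % M

def fusion(digits, M):
    """Fusion center: reassemble the cell index from the N digits."""
    idx = 0
    for d in digits:
        idx = idx * M + d
    return idx

N, M = 3, 2
x = 0.70                                  # uniform 8-cell quantizer: cell 5
digits = [digit_quantizer(x, i, N, M) for i in range(N)]
assert fusion(digits, M) == 5             # = int(0.70 * 8)
```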

Page 10: Estimation from Quantized Signals

Non-Isotropic Quantization

• Example: N=3, M=2, L=1

Page 11: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Def: Every single sensor is doing exactly the same thing. No ID is needed. – Every sensor has the same map F_M from the parameter space to the probability simplex.

• (N, M, F_M) IQN

– Sensors all use the same quantization map F_M

Page 12: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Example: N=3, M=2. (1 1 0) = (1 0 1) = (0 1 1); (0 0 0); (1 1 1); (1 0 0) = (0 1 0) = (0 0 1). 4 possible outputs instead of 8 (non-isotropic).

• Let K(N,M) be the number of possible outputs of an (N,M) IQN.

Page 13: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Lemma 3: K(N,M) = C(N+M-1, M-1).

– Proof: K(N,M) equals the number of non-negative integer solutions of a_1 + a_2 + … + a_M = N.

• An (N,M) IQN cannot work better than the optimal (1, K(N,M)) quantizer. (Lemma 2)
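Lemma 3's count is the stars-and-bars formula; a quick sketch that also brute-force checks it against the definition (only the multiset of sensor outputs is distinguishable):

```python
from math import comb
from itertools import product

def K(N, M):
    """Number of distinct outputs of an (N, M) IQN: only the multiset
    of sensor outputs matters, so count by stars and bars."""
    return comb(N + M - 1, M - 1)

def K_bruteforce(N, M):
    # Count distinct sorted N-tuples over an M-letter alphabet.
    return len({tuple(sorted(t)) for t in product(range(M), repeat=N)})

assert K(3, 2) == K_bruteforce(3, 2) == 4   # the slide's N=3, M=2 example
assert all(K(n, m) == K_bruteforce(n, m)
           for n in range(1, 6) for m in range(2, 5))
```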

Page 14: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• A map F_M is asymptotically better than a map G_M iff there exists V s.t. (N, M, F_M) is better than (N, M, G_M) for all N > V.

• Criteria for better: MSE,…

Page 15: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Lemma 4 (Sanov's theorem): Let X_1, X_2, …, X_N be i.i.d. ~ Q(x), and let E be a set of probability distributions. Then

Q^N(E) ≤ (N+1)^M 2^{-N D(P*||Q)}, where P* = argmin_{P ∈ E} D(P||Q).

• Crucial KL distance: 1/N
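A numerical sanity check of the Sanov bound on a small binary example of my own (Q = Bernoulli(0.5), E the set of distributions with P(1) ≥ 0.8, so the alphabet size is M = 2 and the exact probability is a binomial tail):

```python
from math import comb, log2, ceil

def kl_bern(p, q):
    """D(Bern(p) || Bern(q)) in bits."""
    return sum(pi * log2(pi / qi)
               for pi, qi in ((p, q), (1 - p, 1 - q)) if pi > 0)

q, a = 0.5, 0.8          # Q = Bern(0.5); E = {P : P(1) >= 0.8}
p_star = a               # KL projection of Q onto E sits on the boundary

for N in (10, 20, 50):
    # Exact probability that the empirical distribution lands in E
    exact = sum(comb(N, k) * q**k * (1 - q)**(N - k)
                for k in range(ceil(a * N), N + 1))
    # Sanov upper bound with alphabet size M = 2
    sanov = (N + 1) ** 2 * 2 ** (-N * kl_bern(p_star, q))
    assert exact <= sanov
```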

Page 16: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Let H(M) = {measurable functions from R to the M-dimensional probability simplex with only finitely many discontinuity points}.

• Theorem 2: For L=1 and M>2, for any F_M in H(M) there exists G_M in H(M) which is asymptotically better than F_M.

• Proof: Lemma 4, plus the fact that the "topologies" induced by the Euclidean metric and the KL (Kullback-Leibler) distance are the same.

Page 17: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Reason: H(M) is not complete.

• A stronger statement may exist.

• Can be generalized to higher-dimensional cases (L>1). – "If L < M-1, and the map is not weird…"

» Need help from Evans Hall.

Page 18: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Theorem 3: Fix M. An (N,M) IQN can do at least as well as the optimal (1, B(M) N^((M-1)/2)) quantizer asymptotically with respect to N. – Proof: Construction: pack (M-1)-dimensional balls of volume N^(-(M-1)/2) into the M-dimensional probability simplex.

• The simplex has volume A(M).

• The "radius" of the balls is R(M) N^(-1/2).

Page 19: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

Page 20: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Crucial KL radius: N^(-1)

• Equivalent Euclidean radius: N^(-1/2)

• Taylor expansion of KL distance.
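The Taylor-expansion point can be checked numerically: to second order, D(q + εv || q) ≈ (ε²/2) Σ_i v_i²/q_i, so a KL radius of N^(-1) corresponds to a Euclidean radius of N^(-1/2). A small sketch (the particular q and perturbation direction v are arbitrary choices of mine):

```python
import math

def kl(p, q):
    """D(p || q) in nats for probability vectors."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

q = [0.2, 0.3, 0.5]
v = [0.1, -0.04, -0.06]          # zero-sum perturbation direction
chi2_half = 0.5 * sum(vi**2 / qi for vi, qi in zip(v, q))

for eps in (1e-1, 1e-2, 1e-3):
    p = [qi + eps * vi for qi, vi in zip(q, v)]
    ratio = kl(p, q) / (eps**2 * chi2_half)
    # ratio -> 1 as eps -> 0: KL distance scales as (Euclidean distance)^2
```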

Page 21: Estimation from Quantized Signals

Isotropic Quantizer Network (IQN)

• Conjecture: Fix M. An (N,M) IQN cannot do better than the optimal (1, D(M) N^((M-1)/2)) quantizer asymptotically with respect to N.

Page 22: Estimation from Quantized Signals

Conclusions

• Quantization: a map from a space to the probability simplex. (Is this new?)

• A non-isotropic (N,M) quantizer-network = a quantizer with M^N quantization levels. (Is it trivial?)

• An isotropic (N,M) quantizer-network can asymptotically work as well as a quantizer with N^((M-1)/2) quantization levels. (Converse?)

Page 23: Estimation from Quantized Signals

In the report

• Noisy case: each observation is corrupted by an i.i.d. r.v. – The reason why (N,M) is preferable to (1, M^N). – If N lg(M) is held constant, what is the best choice of N?

Page 24: Estimation from Quantized Signals

In the report

• A linear universal (unknown noise) isotropic decentralized estimation scheme (based on dithering):

E[(θ̂_NM − θ)²] ≤ T² / (N M²)
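A simulation sketch of such a scheme (my own stand-in: non-subtractive-dither-style randomized rounding to the two nearest levels, which makes each one-sensor message unbiased for θ; the fusion rule is a plain average, hence linear):

```python
import numpy as np

rng = np.random.default_rng(1)
theta, T, M, N = 0.37, 1.0, 4, 10000
delta = T / M                       # quantizer step on [0, T]

# Each sensor rounds theta down to the nearest level, then randomly
# rounds up with probability equal to the fractional remainder
# (dither-style), so E[message] = theta for every single sensor.
lo = np.floor(theta / delta) * delta
frac = (theta - lo) / delta
msgs = lo + delta * (rng.random(N) < frac)

theta_hat = msgs.mean()             # linear fusion rule: plain average
# Var(message) <= delta**2 / 4, so E[(theta_hat - theta)**2]
# <= T**2 / (4 * N * M**2), i.e. of order T**2 / (N * M**2).
```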

Page 25: Estimation from Quantized Signals

The End………………..

Thank you!

Page 26: Estimation from Quantized Signals

Q/A

• (quantization, "probability simplex"): 16 results on Google.

• Definition of triviality.

• I hope so… more in the report.