Estimation from Quantized Signals

Post on 09-Jan-2016


Transcript of Estimation from Quantized Signals

Estimation from Quantized Signals

Cheng Chang

Outline of the talk

• Decentralized Estimation

• Model of Random Quantization

• Non-isotropic Decentralized Quantization

• Isotropic Decentralized Quantization

• Conclusions

Decentralized Estimation from Quantized Signals

Model of Random Quantization

What is a quantizer?

A nonlinear system whose purpose is to transform the input sample into one of a finite set of prescribed values. [Oppenheim and Schafer]

The input is a random variable in R^L; in this talk it always has a FINITE support set.
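A minimal concrete instance of this definition, for L = 1: a deterministic uniform quantizer that maps a bounded scalar input to one of M prescribed values. The uniform level placement below is illustrative, not a construction from the talk.

```python
def uniform_quantizer(x, lo=0.0, hi=1.0, M=4):
    """Map x in [lo, hi] to one of M prescribed values (cell midpoints)."""
    step = (hi - lo) / M
    idx = min(int((x - lo) / step), M - 1)   # cell index, clipped to {0, ..., M-1}
    return lo + (idx + 0.5) * step           # the prescribed output value

print(uniform_quantizer(0.3, M=4))  # 0.375: cell 1 of 4, midpoint
```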


Model of Random Quantization

Definition of random quantization: a map from a subspace of R^L (the support set of the input) to the M-dimensional probability simplex.

M is the size of the output set.

Estimation is needed in the fusion center.

Deterministic quantization and non-subtractive dithering are special cases of random quantization.
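The simplex view can be sketched directly: a deterministic quantizer sends each input to a vertex of the probability simplex (a one-hot distribution), while a genuinely random quantizer may land in the interior. The particular maps below are illustrative assumptions, not constructions from the talk.

```python
import random

def deterministic_map(x, M=3):
    """Deterministic quantizer: a vertex of the M-dimensional probability simplex."""
    idx = min(int(x * M), M - 1)              # x assumed in [0, 1]
    return [1.0 if i == idx else 0.0 for i in range(M)]

def random_map(x, M=3):
    """Random quantizer: an interior point of the simplex.
    Here the output distribution interpolates between neighbouring cells."""
    idx = min(int(x * M), M - 1)
    t = x * M - idx                           # position within the cell
    p = [0.0] * M
    p[idx] = 1.0 - 0.5 * t
    p[min(idx + 1, M - 1)] += 0.5 * t
    return p

def emit(p):
    """Draw one of the M output symbols according to the simplex point p."""
    return random.choices(range(len(p)), weights=p)[0]

p = random_map(0.5, M=3)
assert abs(sum(p) - 1.0) < 1e-12              # a valid point of the simplex
```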

Model of Random Quantization

Example: L = 1, M = 3 (figure omitted).

Model of Random Quantization

(N,M) quantizer-network: N independent (not necessarily identical) quantizers, each with M quantization levels.

• Lemma 1: The optimal (1,M) quantizer-network exists and is deterministic.

• How to find it is another story, outside the scope of this talk.

• Lemma 2: For any (N,M) quantizer-network there is an equivalent (same input, same output) (1, M^N) quantizer-network.

– An (N,M) network cannot do better than the optimal (1, M^N) quantizer.

Non-Isotropic Quantization

• Def: Sensors may be different things, and each sensor sends its ID to the fusion center.

• Theorem 1: There exists an (N,M) non-isotropic quantizer-network that does as well as the optimal (deterministic) (1, M^N) quantizer.

– Proof: There is a bijective map from the set of deterministic non-isotropic (N,M) quantizer-networks to the set of deterministic (1, M^N) quantizers.

• The i-th sensor sends the i-th base-M digit of the output of the (1, M^N) quantizer.
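The bijection behind Lemma 2 and Theorem 1 is just base-M positional notation: N symbols from {0, …, M−1} are exactly one index in {0, …, M^N − 1}. A small sketch of both directions (function names are mine):

```python
def fuse(symbols, M):
    """Fusion center: combine N base-M sensor symbols into one index in {0, ..., M**N - 1}."""
    idx = 0
    for s in symbols:
        idx = idx * M + s
    return idx

def split(idx, N, M):
    """Theorem 1 construction: sensor i sends the i-th base-M digit of the (1, M**N) output."""
    digits = []
    for _ in range(N):
        digits.append(idx % M)
        idx //= M
    return digits[::-1]

# Round trip: the (N, M) network carries exactly the information of a (1, M**N) quantizer.
assert fuse(split(11, N=4, M=2), M=2) == 11
```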

Non-Isotropic Quantization

• Example: N=3, M=2, L=1

Isotropic Quantizer Network (IQN)

• Def: Every single sensor does exactly the same thing; no ID is needed.

– Every sensor has the same map F_M from the parameter space to the probability simplex.

• (N, M, F_M) IQN

– All sensors use the same quantization map F_M.

Isotropic Quantizer Network (IQN)

• Example: N=3, M=2. (1 1 0) = (1 0 1) = (0 1 1); (0 0 0); (1 1 1); (1 0 0) = (0 1 0) = (0 0 1): 4 possible outputs instead of 8 (non-isotropic).

• Let K(N,M) be the number of possible outputs of an (N,M) IQN.

Isotropic Quantizer Network (IQN)

• Lemma 3: K(N,M) = C(N+M−1, M−1)

– Proof: K(N,M) is the number of non-negative integer solutions of a_1 + a_2 + … + a_M = N.

• An (N,M) IQN cannot do better than the optimal (1, K(N,M)) quantizer. (Lemma 2)
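Lemma 3 is the stars-and-bars count: an IQN's output is determined only by how many sensors emitted each of the M symbols, i.e. by a non-negative integer solution of a_1 + … + a_M = N, giving K(N,M) = C(N+M−1, M−1). A quick check against brute-force enumeration:

```python
from itertools import product
from math import comb

def K(N, M):
    """Number of distinguishable outputs of an (N, M) isotropic quantizer-network."""
    return comb(N + M - 1, M - 1)

def K_brute(N, M):
    """Count distinct unordered outputs by enumerating all M**N ordered ones."""
    return len({tuple(sorted(out)) for out in product(range(M), repeat=N)})

assert K(3, 2) == K_brute(3, 2) == 4          # the N=3, M=2 example from the slides
assert K(5, 3) == K_brute(5, 3) == comb(7, 2)
```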

Isotropic Quantizer Network (IQN)

• A map F_M is asymptotically better than a map G_M iff there exists V such that (N, M, F_M) is better than (N, M, G_M) for all N > V.

• Criterion for "better": MSE, etc.

Isotropic Quantizer Network (IQN)

• Lemma 4 (Sanov's theorem): Let X_1, X_2, …, X_N be i.i.d. ~ Q(x) over an alphabet of size M, and let E be a set of probability distributions. Then

Q^N(E) ≤ (N+1)^M · 2^{−N·D(P*||Q)}, where P* = argmin_{P∈E} D(P||Q).

• Crucial KL distance: ~1/N
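Sanov's bound can be checked numerically in the simplest binary case: for a biased coin Q = Ber(0.3) and the set E = {P : P(1) ≥ 0.6}, the exact tail probability decays with exponent close to D(P*||Q), where P* = Ber(0.6). The specific numbers below are illustrative.

```python
from math import log2, comb

def D(p, q):
    """Binary KL divergence D(Ber(p) || Ber(q)) in bits."""
    def term(a, b):
        return 0.0 if a == 0 else a * log2(a / b)
    return term(p, q) + term(1 - p, 1 - q)

def tail_exact(N, q, t):
    """Q^N(E): exact probability that the empirical frequency of ones is >= t."""
    return sum(comb(N, k) * q**k * (1 - q)**(N - k)
               for k in range(N + 1) if k / N >= t)

q, t, N = 0.3, 0.6, 200
exponent = -log2(tail_exact(N, q, t)) / N    # empirical decay exponent
assert abs(exponent - D(t, q)) < 0.1         # close to D(P* || Q) with P* = Ber(t)
```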

Isotropic Quantizer Network (IQN)

• Let H(M) = {measurable functions from R to the M-dimensional probability simplex with only finitely many points of discontinuity}.

• Theorem 2: For L=1 and M>2, for any F_M in H(M) there exists G_M in H(M) that is asymptotically better than F_M.

• Proof: Lemma 4, together with the fact that the "topologies" induced by the Euclidean metric and the KL (Kullback–Leibler) distance are the same.

Isotropic Quantizer Network (IQN)

• Reason: H(M) is not complete.

• A stronger statement may exist.

• Can be generalized to higher-dimensional cases (L>1).

– "If L < M−1, and the map is not weird…."

» Need help from Evans Hall.

Isotropic Quantizer Network (IQN)

• Theorem 3: Fix M. An (N,M) IQN can do at least as well as the optimal (1, B(M)·N^{(M−1)/2}) quantizer asymptotically with respect to N.

– Proof: Construction: pack (M−1)-dimensional balls of volume ~N^{−(M−1)/2} into the M-dimensional probability simplex.

• The M-dimensional simplex has volume A(M).

• The "radius" of the balls is R(M)·N^{−1/2}.

Isotropic Quantizer Network (IQN)


• Crucial KL radius: ~N^{−1}

• Equivalent Euclidean radius: ~N^{−1/2}

• Taylor expansion of the KL distance.
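The N^{−1} KL radius versus N^{−1/2} Euclidean radius correspondence follows from the second-order Taylor expansion of the KL divergence around Q, D(P||Q) ≈ (1/(2 ln 2)) Σ_i (p_i − q_i)²/q_i, so KL distance scales like squared Euclidean distance. A quick numeric check (the perturbation below is arbitrary):

```python
from math import log, log2

def kl(p, q):
    """KL divergence in bits between two distributions on M symbols."""
    return sum(pi * log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def chi2_half(p, q):
    """Second-order Taylor term: (1 / (2 ln 2)) * sum_i (p_i - q_i)^2 / q_i."""
    return sum((pi - qi) ** 2 / qi for pi, qi in zip(p, q)) / (2 * log(2))

q = [0.5, 0.3, 0.2]
eps = [0.01, -0.004, -0.006]                  # small perturbation summing to 0
p = [qi + ei for qi, ei in zip(q, eps)]
assert abs(kl(p, q) - chi2_half(p, q)) < 1e-5   # KL ~ squared Euclidean distance
```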

Isotropic Quantizer Network (IQN)

• Conjecture: Fix M. An (N,M) IQN cannot do better than the optimal (1, D(M)·N^{(M−1)/2}) quantizer asymptotically with respect to N.

Conclusions

• Quantization: a map from a space to the probability simplex. (Is this new?)

• Non-isotropic (N,M) quantizer-network = quantizer with M^N quantization levels. (Is it trivial?)

• Isotropic (N,M) quantizer-network can work as well as a quantizer with ~N^{(M−1)/2} quantization levels asymptotically. (Converse?)

In the report

• Noisy case: each observation is corrupted by an i.i.d. random variable.

– The reason why (N,M) is preferable to (1, M^N).

– If N·lg(M) is held constant, what is the best choice of N?

In the report

• A linear universal (unknown noise) isotropic decentralized estimation scheme (based on dithering), with mean-square error E[(θ̂ − θ)^T (θ̂ − θ)] on the order of 1/(M^2·N).
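The talk does not spell out the dithering scheme here, so the following is only a plausible sketch of a dither-based linear estimator: each sensor adds independent uniform dither of one quantization-step width before an M-level uniform quantizer, and the fusion center simply averages the N received levels. The dither makes each quantized sample an unbiased estimate of θ, so the averaged MSE falls like 1/N.

```python
import random

def sensor(theta, M, rng):
    """One sensor: add uniform dither of one step width, then quantize to M midpoints on [0, 1]."""
    step = 1.0 / M
    dithered = theta + rng.uniform(-step / 2, step / 2)
    idx = min(max(int(dithered / step), 0), M - 1)
    return (idx + 0.5) * step

def estimate(theta, N, M, rng):
    """Fusion center: a linear (averaging) estimate from N dithered quantized observations."""
    return sum(sensor(theta, M, rng) for _ in range(N)) / N

rng = random.Random(0)
theta = 0.437
mse = sum((estimate(theta, N=400, M=4, rng=rng) - theta) ** 2
          for _ in range(200)) / 200
assert mse < 1e-4   # far below a single sensor's squared step size (1/M)**2
```

With one-step uniform dither the quantized sample is unbiased for θ away from the boundary of [0, 1], which is what lets plain averaging work without knowing the noise.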

The End.

Thank you!

Q/A

• A Google search for (quantization, "probability simplex") returns 16 entries.

• Definition of triviality.

• I hope so… more in report