Lecture 3: Compressive Classification · Richard Baraniuk, Rice University · dsp.rice.edu/cs
Richard Baraniuk
Rice University · dsp.rice.edu/cs
Lecture 3: Compressive Classification
Compressive Sampling
Signal Sparsity
wideband signal samples
large Gabor (time-frequency) coefficients
Fourier matrix
Compressive Sampling
• Random measurements
measurements y = Φx; signal x
sparse in basis Ψ
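A minimal numerical sketch of this slide (not from the lecture; the dimensions and the identity sparsity basis below are illustrative assumptions):

```python
# Compressive sampling sketch: M random measurements y = Phi @ x of a
# length-N signal that is K-sparse (here, sparse in the identity basis).
import numpy as np

rng = np.random.default_rng(0)
N, M, K = 1024, 128, 10                  # ambient dim, measurements, sparsity

# Build a K-sparse signal: K nonzero coefficients at random locations.
x = np.zeros(N)
support = rng.choice(N, size=K, replace=False)
x[support] = rng.standard_normal(K)

# Random Gaussian measurement matrix (scaled so norms are preserved on
# average), giving the M << N compressive measurements.
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
y = Phi @ x
```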
Compressive Sampling
• Universality
Why Does It Work? (1)
• Random projection not full rank, but stably embeds
– sparse/compressible signal models (CS)
– point clouds (JL)
into a lower-dimensional space with high probability
• Stable embedding: preserves structure
– distances between points, angles between vectors, …
provided M is large enough
[Figure: compressive-sampling case; the K-sparse model as a union of K-dimensional planes]
Why Does It Work? (2)
• Random projection not full rank, but stably embeds
– sparse/compressible signal models (CS)
– point clouds (JL)
into a lower-dimensional space with high probability
• Stable embedding: preserves structure
– distances between points, angles between vectors, …
provided M is large enough
[Figure: Johnson-Lindenstrauss case; a cloud of Q points]
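The stable-embedding claim can be checked empirically; a small sketch (illustrative sizes, and an i.i.d. Gaussian Φ standing in for an exact orthoprojector):

```python
# Johnson-Lindenstrauss check: a random projection of Q points from R^N
# to R^M approximately preserves all pairwise distances.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)
N, M, Q = 2000, 400, 20
points = rng.standard_normal((Q, N))             # the point cloud
Phi = rng.standard_normal((M, N)) / np.sqrt(M)   # random projection
proj = points @ Phi.T

# Ratio of projected to original distance for every pair of points.
ratios = np.array([
    np.linalg.norm(proj[i] - proj[j]) / np.linalg.norm(points[i] - points[j])
    for i, j in combinations(range(Q), 2)
])
max_distortion = np.abs(ratios - 1).max()        # small when M is large enough
```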
Information Scalability
• In many CS applications, full signal recovery may not be required
• If we can reconstruct a signal from compressive measurements, then we should be able to perform other kinds of statistical signal processing:
– detection
– classification
– estimation, …
Compressive Detection/Classification
Classification of Signals in Noise
• Observe one of P known signals in noise
• Probability density function of the noise; ex: zero-mean, white Gaussian noise (AWGN)
• Probability density function of the observed signal: the noise density with its mean shifted to the known signal
Multiclass Likelihood Ratio Test (LRT)
• Observe one of P known signals in noise
• Classify according to:
• AWGN: nearest-neighbor classification
• Sufficient statistic: the correlation of y with each template, t_i = ⟨y, s_i⟩ (with an energy correction when the templates have unequal norms)
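The nearest-neighbor rule can be sketched in a few lines (the templates and noise level below are made up for illustration):

```python
# Multiclass classification in AWGN: pick the known template s_i nearest
# to the observation y in Euclidean distance (the ML rule for white
# Gaussian noise with equal priors).
import numpy as np

rng = np.random.default_rng(2)
N, P = 256, 4
templates = rng.standard_normal((P, N))   # the P known signals s_1..s_P

def classify_nn(y, templates):
    """Index of the template closest to y (nearest-neighbor / ML rule)."""
    dists = np.linalg.norm(templates - y, axis=1)
    return int(np.argmin(dists))

true = 2
y = templates[true] + 0.3 * rng.standard_normal(N)   # signal + AWGN
est = classify_nn(y, templates)                      # recovers the class
```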
Compressive LRT
• Compressive observations: y = Φx
[Waagen et al 05, Davenport et al 06, Haupt et al 06]
• By the JL lemma, these distances are preserved
Compressive LRT
• Compressive observations: y = Φx
• If the signals are normalized, these angles are preserved as well
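A sketch of the compressive version (dimensions illustrative): the same nearest-neighbor rule, applied entirely in the M-dimensional measurement space using the projected templates.

```python
# Compressive LRT sketch: project the templates once, then classify the
# compressive observation y = Phi @ x by nearest projected template.
# JL-type distance preservation is what makes this work.
import numpy as np

rng = np.random.default_rng(3)
N, M, P = 1024, 64, 5
templates = rng.standard_normal((P, N))
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
proj_templates = templates @ Phi.T        # Phi @ s_i, computed offline

true = 1
x = templates[true] + 0.2 * rng.standard_normal(N)
y = Phi @ x                               # only M numbers are observed

dists = np.linalg.norm(proj_templates - y, axis=1)
est = int(np.argmin(dists))
```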
Performance of Compressive LRT
• ROC curve for the Neyman-Pearson detector (2 classes): detection probability vs. false-alarm probability
• From the JL lemma, for a random orthoprojector, ‖Φ(s₁ − s₀)‖ ≈ √(M/N) ‖s₁ − s₀‖
• Thus the detector operates at an effective SNR reduced by the factor M/N
• Penalty for compressive measurements: an M/N loss in SNR
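The M/N factor can be seen numerically; a quick sketch (sizes illustrative):

```python
# For an M x N orthoprojector (orthonormal rows), squared norms shrink
# by about M/N, which is the SNR penalty quoted on this slide.
import numpy as np

rng = np.random.default_rng(4)
N, M = 2000, 200
Q, _ = np.linalg.qr(rng.standard_normal((N, M)))   # orthonormal columns
Phi = Q.T                                 # orthoprojector: Phi @ Phi.T = I

x = rng.standard_normal(N)
ratio = np.linalg.norm(Phi @ x) ** 2 / np.linalg.norm(x) ** 2
# ratio concentrates sharply around M / N = 0.1
```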
Performance of Compressive LRT
[Figure: ROC curves; fewer measurements require better SNR for the same performance]
Summary
• If CS system is a random orthoprojector, then detection/classification problems in the signal space map into analogous problems in the measurement space
• SNR penalty for compressive measurements
• Note: Signal sparsity unexploited!
Matched Filtering
Matched Filter
• In many applications, signals are transformed with an unknown parameter; ex: translation
• Elegant solution: the matched filter. Compute the correlation ⟨y, s_i(θ)⟩ for all θ
• Simultaneously: estimates the parameter, classifies the signal
• Implementation: convolution of the measurement with the template reversed in time
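A brute-force sketch of the matched filter for unknown translation (circular shifts and made-up templates, for simplicity):

```python
# Matched filter sketch: correlate the observation against every
# (circular) shift of every template; the argmax jointly estimates the
# class and the translation parameter.
import numpy as np

rng = np.random.default_rng(5)
N, P = 128, 3
templates = rng.standard_normal((P, N))

def matched_filter(y, templates):
    """Return (class, shift) maximizing the correlation <y, shift(s_i)>."""
    best_corr, best_class, best_shift = -np.inf, None, None
    for i, s in enumerate(templates):
        for shift in range(len(y)):
            corr = float(np.dot(y, np.roll(s, shift)))
            if corr > best_corr:
                best_corr, best_class, best_shift = corr, i, shift
    return best_class, best_shift

true_class, true_shift = 1, 17
y = np.roll(templates[true_class], true_shift) + 0.1 * rng.standard_normal(N)
est_class, est_shift = matched_filter(y, templates)
```

In practice the inner loop over shifts is computed all at once as a convolution of the measurement with the time-reversed template (e.g., via the FFT), which is the slide's point.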
Compressive Matched Filter
• Challenge: Extend matched filter concept to compressive measurements
• GLRT: maximize the likelihood over the unknown parameter within each class, then choose the class with the largest maximized likelihood
• The GLRT approach extends to any case where each class can be parameterized with K parameters
• If the mapping from parameters to signal is well-behaved, then each class forms a manifold in R^N
Signal Manifolds
What is a Manifold?
“Manifolds are a bit like pornography: hard to define, but you know one when you see one.”
– S. Weinberger [Lee]
• Locally Euclidean topological space
• Roughly speaking:
– a collection of mappings of open sets of R^K glued together (“coordinate charts”)
– can be an abstract space, not a subset of Euclidean space; e.g., SO(3), the Grassmannian
• Typically for signal processing:
– a nonlinear K-dimensional “surface” in the signal space R^N
Examples
• Circle in R^N
– parameter: angle
• Chirp in R^N
– parameters: start frequency, end frequency
• Image appearance manifold
– parameters: position of object, camera, lighting, etc.
Object Rotation Manifold
K = 1: each image is a point in R^N
Up/Down Left/Right Manifold
[Tenenbaum, de Silva, Langford]
K=2
Manifold Learning from Training Data
• Translating disk; parameters: left/right and up/down shift (K = 2)
• Generate training data by sampling from the manifold
• “Learn” the structure of the manifold
[Figure: embeddings of the training images (points in R^4096) recovered by ISOMAP, HLLE, and Laplacian Eigenmaps]
Manifold Classification
Manifold Classification
• Now suppose the data is drawn from one of P possible manifolds:
• AWGN: nearest-manifold classification
[Figure: manifolds M1, M2, M3 in signal space]
Compressive Manifold Classification
• Compressive observations: y = Φx
• Good news: the structure of smooth manifolds is preserved by random projection, provided M is large enough
– distances, geodesic distances, angles, …
[Wakin et al 06, Haupt et al 07]
Aside: Random Projections of Smooth Manifolds
Stable Manifold Embedding
Theorem: Let F ⊂ R^N be a compact K-dimensional manifold with
– condition number 1/τ (curvature, self-avoiding)
– volume V
Let Φ be a random M×N orthoprojector with M on the order of K log(N V τ⁻¹ ε⁻¹) log(ρ⁻¹) / ε². [Wakin et al 06, Haupt et al 07]
Then with probability at least 1-ρ, the following statement holds: for every pair x, y ∈ F,
(1−ε) √(M/N) ≤ ‖Φx − Φy‖ / ‖x − y‖ ≤ (1+ε) √(M/N)
Stable Manifold Embedding
• The theorem tells us that random projections preserve a smooth manifold's
– dimensionality
– ambient distances
– geodesic distances
– local angles
– topology
– local neighborhoods
– volume
• There also exist extensions to some kinds of non-smooth manifolds
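The distance-preservation part of the theorem is easy to check on a concrete K = 1 manifold; a sketch (circularly shifted Gaussian pulses, illustrative sizes, and an i.i.d. Gaussian Φ standing in for an orthoprojector):

```python
# Sample a smooth 1-parameter manifold (circularly shifted Gaussian
# pulses in R^N), project it randomly to R^M, and check that pairwise
# distances between the samples are roughly preserved.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(6)
N, M = 512, 128
t = np.arange(N)

def pulse(shift):
    d = (t - shift + N // 2) % N - N // 2     # circular distance to center
    return np.exp(-d**2 / 50.0)               # Gaussian bump at `shift`

samples = np.stack([pulse(s) for s in range(0, N, 8)])   # 64 manifold points
Phi = rng.standard_normal((M, N)) / np.sqrt(M)
proj = samples @ Phi.T

ratios = np.array([
    np.linalg.norm(proj[i] - proj[j]) / np.linalg.norm(samples[i] - samples[j])
    for i, j in combinations(range(len(samples)), 2)
])
max_distortion = np.abs(ratios - 1).max()
mean_distortion = np.abs(ratios - 1).mean()
```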
Manifold Learning from Compressive Measurements
[Figure: manifold learning applied to compressive measurements; ISOMAP, HLLE, and Laplacian Eigenmaps recover the embeddings from projections in R^M (M = 15, 15, 20) of images in R^4096]
Multiple Manifold Embedding
Corollary: Let M1, …, MP ⊂ R^N be compact K-dimensional manifolds with
– condition number 1/τ (curvature, self-avoiding)
– volume V
– min dist(Mj, Mk) > τ (can be relaxed)
Let Φ be a random M×N orthoprojector with M as in the theorem above. Then with probability at least 1-ρ, the following statement holds: for every pair x, y ∈ ∪ Mj,
(1−ε) √(M/N) ≤ ‖Φx − Φy‖ / ‖x − y‖ ≤ (1+ε) √(M/N)
Compressive Manifold Classification
Compressive Manifold Classification
• Compressive observations: y = Φx
• Good news: the structure of smooth manifolds is preserved by random projection, provided M is large enough
– distances, geodesic distances, angles, …
[Wakin et al 06, Haupt et al 07]
Smashed Filter
• Compressive manifold classification with GLRT
– nearest-manifold classifier based on the projected manifolds ΦM1, ΦM2, ΦM3
[Figure: manifolds M1, M2, M3 in signal space and their projections ΦM1, ΦM2, ΦM3 in measurement space]
[Davenport et al 06, Healy and Rohde 07]
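A toy smashed filter (made-up templates, with a circular shift as the unknown parameter and illustrative sizes):

```python
# Smashed filter sketch: GLRT from compressive measurements. For each
# class, sweep the unknown shift, project the shifted template through
# Phi, and pick the (class, shift) whose projection is nearest to y.
import numpy as np

rng = np.random.default_rng(7)
N, M, P = 256, 32, 3
templates = rng.standard_normal((P, N))
Phi = rng.standard_normal((M, N)) / np.sqrt(M)

def smashed_filter(y, templates, Phi):
    best_dist, best_class, best_shift = np.inf, None, None
    for i, s in enumerate(templates):
        for shift in range(templates.shape[1]):
            z = Phi @ np.roll(s, shift)       # a point on the manifold Phi M_i
            d = float(np.linalg.norm(y - z))
            if d < best_dist:
                best_dist, best_class, best_shift = d, i, shift
    return best_class, best_shift

true_class, true_shift = 2, 40
signal = np.roll(templates[true_class], true_shift)
y = Phi @ (signal + 0.05 * rng.standard_normal(N))   # noisy compressive obs
est_class, est_shift = smashed_filter(y, templates, Phi)
```

Note the classifier never reconstructs the N-dimensional signal: both the class and the shift are recovered from M ≪ N measurements.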
Smashed Filter – Experiments
• 3 image classes: tank, school bus, SUV
• N = 65,536 pixels
• Imaged using a single-pixel CS camera with
– unknown shift
– unknown rotation
Smashed Filter – Unknown Position
• Object shifted at random (K = 2 manifold)
• Noise added to measurements
• Goal: identify the most likely position for each image class; identify the most likely class using a nearest-neighbor test
[Figure: average shift-estimate error and classification rate (%) vs. number of measurements M, for increasing noise levels]
Smashed Filter – Unknown Rotation
• Object rotated in 10° increments
• Goals: identify the most likely rotation for each image class; identify the most likely class using a nearest-neighbor test
• Perfect classification with as few as 6 measurements
• Good estimates of rotation with under 10 measurements
[Figure: average rotation-estimate error vs. number of measurements M]
Summary
• Compressive measurements are information scalable:
reconstruction > estimation > classification > detection
• Random projections preserve the structure of smooth manifolds (analogous to sparse signals)
• Smashed filter: dimension-reduced GLRT for parametrically transformed signals
– exploits compressive measurements and manifold structure
– broadly applicable: targets do not have to have a sparse representation in any basis
– effective for detection/classification
Open Issues
• Compressive classification does not exploit sparse signal structure to improve performance
• Non-smooth manifolds and local minima in GLRT
– one approach: multiscale random projections
• Experiments with real data
Some References
• R. G. Baraniuk, M. Davenport, R. DeVore, M. Wakin, “A simple proof of the restricted isometry property for random matrices,” to appear in Constructive Approximation, 2007.
• R. G. Baraniuk and M. Wakin, “Random projections of smooth manifolds,” 2006; see also ICASSP 2006.
• M. Davenport, M. Duarte, M. Wakin, J. Laska, D. Takhar, K. Kelly, R. G. Baraniuk, “The smashed filter for compressive classification and target recognition,” Proc. of Computational Imaging V at SPIE Electronic Imaging, San Jose, California, January 2007.
• M. Wakin, D. Donoho, H. Choi, R. G. Baraniuk. “The multiscale structure of non-differentiable image manifolds,” Proc. Wavelets XI at SPIE Optics and Photonics, 2005.
• J. Haupt, R. Castro, R. Nowak, G. Fudge, A. Yeh, “Compressive sampling for signal classification,” Proc. Asilomar Conference on Signals, Systems, and Computers, 2006.
• D. Healy and G. Rohde, “Fast global image registration using random projections,” 2007.
for more, see dsp.rice.edu/cs