CS4243 Computer Vision and Pattern Recognition
Pattern Recognition Basics
Dr. Ng Teck Khim

Page 1:

Pattern Recognition

• Features

• Principal Components Analysis (PCA)

• Bayes Decision Theory

• Gaussian (Normal) Density

Readings: Shapiro (chap.); Duda & Hart (chap.)

Topic 1: Principal Component Analysis

Page 2:

Pattern Recognition Basics

input image(s) → feature extraction → feature classification → pattern recognition results

Page 3:

Feature Extraction

• Linear features, e.g. lines, curves

• 2D features, e.g. texture

• Features of features

Page 4:

Feature Extraction

• A good feature should be invariant to changes in:
  • scale
  • rotation
  • translation
  • illumination

Page 5:

Feature Classification

• Linear classifier

• Non-linear classifier
  • e.g. neural network
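
As a toy illustration (not from the slides), a linear classifier labels a feature vector by which side of a separating hyperplane it falls on:

```python
import numpy as np

# Toy linear classifier: the weights w and bias b are assumed to have
# been learned already (e.g. by least squares or a perceptron).
def linear_classify(x, w, b):
    return 1 if np.dot(w, x) + b >= 0 else -1
```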

Page 6:

Principal Components Analysis (Karhunen-Loève Transformation)

Pattern Recognition

• Principal components analysis and its interpretation

• Applications

Page 7:

Suppose we are shown a set of points:

Page 8:

We want to answer any of the following questions:

• What is the trend (direction) of the cloud of points?

• Which is the direction of maximum variance (dispersion)?

• Which is the direction of minimum variance (dispersion)?

• Suppose I am allowed only a 1D description of this set of 2D points, i.e. one coordinate instead of the two coordinates (x, y). How should I represent the points so that the overall error is minimized? This is a data compression problem.

All these questions can be answered using Principal Components Analysis (PCA).

Page 9:

Principal Component Analysis (PCA)

Suppose we have $N$ feature vectors
$$\{x_1, x_2, x_3, \dots, x_N\}$$

We take the mean of the feature vectors:
$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i$$

Form the covariance matrix:
$$C = \frac{1}{N}\sum_{i=1}^{N} y_i\, y_i^T \qquad \text{where } y_i = x_i - \bar{x}$$

Find the eigenvalues and eigenvectors of the covariance matrix:
$$C\, e_i = \lambda_i\, e_i$$

where $e_i$ is a principal component with eigenvalue $\lambda_i$.

We did all these in Lab 4!
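
A minimal NumPy sketch of these four steps (my own illustration; Lab 4's actual code may differ), assuming the feature vectors are the columns of an array X:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(2, 100))          # 100 two-dimensional feature vectors x_i
x_bar = X.mean(axis=1, keepdims=True)  # mean of the feature vectors
Y = X - x_bar                          # y_i = x_i - x_bar
C = (Y @ Y.T) / X.shape[1]             # covariance matrix C = (1/N) sum y_i y_i^T
lam, E = np.linalg.eigh(C)             # eigenvalues lam_i, principal components e_i
```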

Page 10:

Shift the cloud of points so that the average position is at the origin

Page 11:

Principal components of the cloud of points

Page 12:

Another Example

Page 13:

Shift the cloud of points so that the average position is at the origin

Page 14:

Principal components of the cloud of points

Page 15:

One problem with PCA when applied directly to the covariance matrix of the images is that the dimension of the covariance matrix is huge! For example, for a 256x256 image, x will be of dimension 65536x1, so the covariance matrix will be of size 65536x65536. Computing the eigenvectors of such a huge matrix directly is clearly infeasible.

To solve this problem, Murakami and Kumar proposed a technique in the following paper:

Murakami, H. and Vijaya Kumar, B.V.K., "Efficient Calculation of Primary Images from a Set of Images", IEEE Trans. Pattern Analysis and Machine Intelligence, vol. PAMI-4, no. 5, Sep 1982.

We shall now look at this method. In the lab, you will implement this algorithm to obtain principal components from huge feature vectors, using a face recognition example.
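
To make the scale concrete (my own back-of-the-envelope arithmetic, not from the slides):

```python
# Storage alone for a 65536 x 65536 covariance matrix in double precision:
M_dim = 256 * 256                        # feature dimension for a 256x256 image
print(M_dim * M_dim * 8 / 2**30, "GiB")  # 32.0 GiB, before any eigendecomposition
```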

Page 16:

To begin, we note that our goal is to find the eigenvectors and eigenvalues of the covariance matrix
$$C = \frac{1}{N}\sum_{i=1}^{N}(x_i - \bar{x})(x_i - \bar{x})^T = \frac{1}{N}\sum_{i=1}^{N} y_i\, y_i^T$$

We will see later that instead of working directly on $C$, which is a huge matrix, we can find the eigenvectors and eigenvalues of an NxN matrix and then use them to deduce the eigenvectors and eigenvalues of $C$.

To see how this can be done, we first note that
$$(y_i\, y_i^T)\, e = y_i\,(y_i^T e) = (y_i^T e)\, y_i$$
since $y_i^T e$ is a scalar.

Page 17:

Therefore, if $e$ is an eigenvector of $C$ with eigenvalue $\lambda$, then
$$C\, e = \lambda\, e$$
$$\frac{1}{N}\sum_{i=1}^{N} y_i\, y_i^T\, e = \lambda\, e$$
$$\frac{1}{N}\sum_{i=1}^{N} (y_i^T e)\, y_i = \lambda\, e$$

Writing $a_i = y_i^T e$, this gives
$$e = \frac{1}{N\lambda}\sum_{i=1}^{N} a_i\, y_i$$

Let $b_i = \dfrac{a_i}{N\lambda}$, then
$$e = \sum_{i=1}^{N} b_i\, y_i$$

So if $e$ is an eigenvector of $C$, then $e$ can be expressed as a linear combination of the vectors $y_i$.

Page 18:

Therefore, substituting $e = \sum_{j=1}^{N} b_j\, y_j$ into $C\, e = \lambda\, e$:
$$\frac{1}{N}\sum_{i=1}^{N} y_i\, y_i^T \left(\sum_{j=1}^{N} b_j\, y_j\right) = \lambda \sum_{i=1}^{N} b_i\, y_i$$
$$\frac{1}{N}\sum_{i=1}^{N} \sum_{j=1}^{N} b_j\, (y_i^T y_j)\, y_i = \lambda \sum_{i=1}^{N} b_i\, y_i$$

Matching the coefficients of each $y_i$ on both sides:
$$\frac{1}{N}\sum_{j=1}^{N} (y_i^T y_j)\, b_j = \lambda\, b_i$$

Therefore, the vector
$$b = \begin{bmatrix} b_1 \\ b_2 \\ \vdots \\ b_N \end{bmatrix}$$
is an eigenvector of the matrix with $(i, j)$ elements given by
$$\frac{1}{N}\, y_i^T y_j$$

Page 19:

In conclusion, the task of finding the eigenvectors of a huge matrix of size MxM (where M is the number of pixels in an image) is reduced to the task of finding the eigenvectors of an NxN matrix.

The eigenvector of the matrix
$$C = \frac{1}{N}\sum_{i=1}^{N} y_i\, y_i^T,$$
i.e. $e$, is given by
$$e = \begin{bmatrix} y_1 & y_2 & \cdots & y_N \end{bmatrix} b = \sum_{i=1}^{N} b_i\, y_i$$

Note that the (non-zero) eigenvalues of the matrix with elements $\frac{1}{N}\, y_i^T y_j$ are the same as the eigenvalues of the matrix $C = \frac{1}{N}\sum_{i=1}^{N} y_i\, y_i^T$.
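
As a quick numerical sanity check (my own sketch, not part of the slides or the lab), the following verifies on a small example that each eigenvector $b$ of the NxN inner-product matrix maps to an eigenvector $e = \sum_i b_i\, y_i$ of $C$ with the same eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)
N, M_dim = 5, 100                    # N samples, M_dim-dimensional features
X = rng.normal(size=(M_dim, N))      # columns are feature vectors x_i
Y = X - X.mean(axis=1, keepdims=True)

G = (Y.T @ Y) / N                    # N x N matrix with elements (1/N) y_i^T y_j
lam, B = np.linalg.eigh(G)           # its eigenvalues and eigenvectors b
E = Y @ B                            # columns e = sum_i b_i y_i

C = (Y @ Y.T) / N                    # full covariance (small enough to form here)
print(np.allclose(C @ E, E * lam))   # True: C e = lambda e for each column
```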

Page 20:

Summary

Suppose we have $N$ feature vectors
$$\{x_1, x_2, x_3, \dots, x_N\}$$

We want to find the principal components. Since direct computation of the covariance matrix is not feasible due to its huge dimension, we do the following steps instead:

Step 1: Take the mean of the feature vectors:
$$\bar{x} = \frac{1}{N}\sum_{i=1}^{N} x_i$$

Step 2: Subtract the mean from each of the feature vectors $x_i$:
$$y_i = x_i - \bar{x}$$

Page 21:

Step 3: Form the inner product matrix
$$M = \frac{1}{N}\begin{bmatrix}
y_1^T y_1 & y_1^T y_2 & \cdots & y_1^T y_N \\
y_2^T y_1 & y_2^T y_2 & \cdots & y_2^T y_N \\
\vdots & \vdots & \ddots & \vdots \\
y_N^T y_1 & y_N^T y_2 & \cdots & y_N^T y_N
\end{bmatrix}$$

Step 4: Find the eigenvectors and eigenvalues of $M$:
$$M\, b = \lambda\, b$$

Page 22:

Step 5: If $e_1, e_2, e_3, \dots, e_N$ and $\lambda_1, \lambda_2, \lambda_3, \dots, \lambda_N$ are the principal components and the corresponding eigenvalues, respectively, of the covariance matrix of $\{x_1, x_2, x_3, \dots, x_N\}$, then
$$e_i = \begin{bmatrix} y_1 & y_2 & y_3 & \cdots & y_N \end{bmatrix} b_i$$
where $b_i$ is the $i$-th eigenvector of $M$ and $\lambda_i$ equals the corresponding eigenvalue of $M$.
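
Putting Steps 1 to 5 together, a compact NumPy sketch (illustrative only; the function name, argument layout, and normalisation are my own choices, and the lab code may differ):

```python
import numpy as np

def pca_gram(images):
    """PCA via the N x N inner-product matrix (after Murakami & Kumar).

    `images` is an M x N array whose columns are the vectorised images x_i.
    Returns the eigenvalues (descending) and the principal components as
    columns of an M x N array.
    """
    _, N = images.shape
    x_bar = images.mean(axis=1, keepdims=True)   # Step 1: mean vector
    Y = images - x_bar                           # Step 2: y_i = x_i - x_bar
    G = (Y.T @ Y) / N                            # Step 3: elements (1/N) y_i^T y_j
    lam, B = np.linalg.eigh(G)                   # Step 4: M b = lambda b
    E = Y @ B                                    # Step 5: e_i = [y_1 ... y_N] b_i
    E /= np.linalg.norm(E, axis=0, keepdims=True) + 1e-12   # unit-length e_i
    order = np.argsort(lam)[::-1]                # sort by decreasing eigenvalue
    return lam[order], E[:, order]
```

Only the N x N matrix is ever decomposed, so the cost scales with the number of images N rather than with the number of pixels M.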