EM Algorithm and its Applications


Page 1: EM Algorithm and its Applications

EM Algorithm and its Applications

Yi Li
Department of Computer Science and Engineering
University of Washington

Page 2: EM Algorithm and its Applications

Outline

Introduction of EM
K-Means → EM
EM Applications:
Image Segmentation using EM
Object Class Recognition in CBIR

Page 3: EM Algorithm and its Applications

Color Clustering by K-means Algorithm

Form K-means clusters from a set of n-dimensional vectors:

1. Set ic (the iteration count) to 1.

2. Choose randomly a set of K means m1(1), …, mK(1).

3. For each vector xi, compute the distance D(xi, mk(ic)) for k = 1, …, K and assign xi to the cluster Cj with the nearest mean.

4. Increment ic by 1 and update the means to get m1(ic), …, mK(ic).

5. Repeat steps 3 and 4 until Ck(ic) = Ck(ic+1) for all k (see the sketch below).
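Below is a minimal runnable sketch of these steps in Python/NumPy; the function and variable names are my own, since the slides give no implementation.

```python
import numpy as np

def kmeans(X, K, max_iter=100, seed=0):
    """K-means clustering following steps 1-5 above."""
    rng = np.random.default_rng(seed)
    means = X[rng.choice(len(X), size=K, replace=False)]  # step 2: random means
    labels = np.full(len(X), -1)
    for ic in range(1, max_iter + 1):                     # steps 1 and 4: iteration count
        # Step 3: assign each x_i to the cluster C_j with the nearest mean
        dists = np.linalg.norm(X[:, None, :] - means[None, :, :], axis=2)
        new_labels = dists.argmin(axis=1)
        if np.array_equal(new_labels, labels):
            break                                         # step 5: clusters unchanged
        labels = new_labels
        # Step 4: update each mean from its current cluster members
        for k in range(K):
            if np.any(labels == k):
                means[k] = X[labels == k].mean(axis=0)
    return means, labels

# Example: cluster RGB pixel vectors {r, g, b} into K color clusters
pixels = np.random.default_rng(1).random((500, 3))
means, labels = kmeans(pixels, K=4)
```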

Page 4: EM Algorithm and its Applications

Classifier (K-Means)

x1 = {r1, g1, b1}
x2 = {r2, g2, b2}
…
xi = {ri, gi, bi}

Classification Results: x1 → C(x1), x2 → C(x2), …, xi → C(xi)

Cluster Parameters: m1 for C1, m2 for C2, …, mk for Ck

Page 5: EM Algorithm and its Applications

K-Means Classifier (Cont.)

Input (Known):
x1 = {r1, g1, b1}, x2 = {r2, g2, b2}, …, xi = {ri, gi, bi}

Output (Unknown):
Classification Results: x1 → C(x1), x2 → C(x2), …, xi → C(xi)
Cluster Parameters: m1 for C1, m2 for C2, …, mk for Ck

Page 6: EM Algorithm and its Applications

Input (Known):
x1 = {r1, g1, b1}, x2 = {r2, g2, b2}, …, xi = {ri, gi, bi}
Initial Guess of Cluster Parameters: m1, m2, …, mk

Output (Unknown), alternating as the iterations proceed:
Classification Results (1): C(x1), C(x2), …, C(xi)
Cluster Parameters (1): m1, m2, …, mk
Classification Results (2): C(x1), C(x2), …, C(xi)
Cluster Parameters (2): m1, m2, …, mk
…
Classification Results (ic): C(x1), C(x2), …, C(xi)
Cluster Parameters (ic): m1, m2, …, mk

Page 7: EM Algorithm and its Applications

K-Means (Cont.)

Boot Step: Initialize K clusters: C1, …, CK. Each cluster Cj is represented by its mean mj.

Iteration Step:
Estimate the cluster of each data point: xi → C(xi)
Re-estimate the cluster parameters: $m_j = \text{mean}\{x_i \mid x_i \in C_j\}$

Page 8: EM Algorithm and its Applications

K-Means Example

Page 9: EM Algorithm and its Applications

K-Means Example

Page 10: EM Algorithm and its Applications

K-Means → EM

Boot Step: Initialize K clusters: C1, …, CK: (μj, Σj) and P(Cj) for each cluster j.

Iteration Step:
Expectation: Estimate the cluster of each data point: p(Cj | xi)
Maximization: Re-estimate the cluster parameters: (μj, Σj), p(Cj) for each cluster j

Page 11: EM Algorithm and its Applications

Classifier (EM)

x1 = {r1, g1, b1}
x2 = {r2, g2, b2}
…
xi = {ri, gi, bi}

Classification Results: p(Cj|x1), p(Cj|x2), …, p(Cj|xi)

Cluster Parameters: (μ1, Σ1), p(C1) for C1; (μ2, Σ2), p(C2) for C2; …; (μk, Σk), p(Ck) for Ck

Page 12: EM Algorithm and its Applications

EM Classifier (Cont.)

Input (Known):
x1 = {r1, g1, b1}, x2 = {r2, g2, b2}, …, xi = {ri, gi, bi}

Output (Unknown):
Cluster Parameters: (μ1, Σ1), p(C1) for C1; (μ2, Σ2), p(C2) for C2; …; (μk, Σk), p(Ck) for Ck
Classification Results: p(Cj|x1), p(Cj|x2), …, p(Cj|xi)

Page 13: EM Algorithm and its Applications

Expectation Step

Input (Known): x1 = {r1, g1, b1}, x2 = {r2, g2, b2}, …, xi = {ri, gi, bi}

Input (Estimation): Cluster Parameters: (μ1, Σ1), p(C1) for C1; (μ2, Σ2), p(C2) for C2; …; (μk, Σk), p(Ck) for Ck

Output: Classification Results: p(Cj|x1), p(Cj|x2), …, p(Cj|xi)

$$p(C_j \mid x_i) = \frac{p(x_i \mid C_j)\,p(C_j)}{p(x_i)} = \frac{p(x_i \mid C_j)\,p(C_j)}{\sum_j p(x_i \mid C_j)\,p(C_j)}$$
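As a sketch, this step is a few lines of NumPy/SciPy, assuming Gaussian class-conditional densities p(xi|Cj) as on the following slides; all names here are illustrative.

```python
import numpy as np
from scipy.stats import multivariate_normal

def e_step(X, mus, sigmas, priors):
    """p(C_j|x_i) = p(x_i|C_j) p(C_j) / sum_j p(x_i|C_j) p(C_j)."""
    weighted = np.column_stack([
        multivariate_normal.pdf(X, mean=mus[j], cov=sigmas[j]) * priors[j]
        for j in range(len(priors))
    ])                                        # numerator for every (i, j) pair
    return weighted / weighted.sum(axis=1, keepdims=True)  # normalize over j
```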

Page 14: EM Algorithm and its Applications

Maximization Step

Input (Known): x1 = {r1, g1, b1}, x2 = {r2, g2, b2}, …, xi = {ri, gi, bi}

Input (Estimation): Classification Results: p(Cj|x1), p(Cj|x2), …, p(Cj|xi)

Output: Cluster Parameters: (μ1, Σ1), p(C1) for C1; (μ2, Σ2), p(C2) for C2; …; (μk, Σk), p(Ck) for Ck

$$\mu_j = \frac{\sum_i p(C_j \mid x_i)\,x_i}{\sum_i p(C_j \mid x_i)}$$

$$\Sigma_j = \frac{\sum_i p(C_j \mid x_i)\,(x_i-\mu_j)(x_i-\mu_j)^T}{\sum_i p(C_j \mid x_i)}$$

$$p(C_j) = \frac{\sum_i p(C_j \mid x_i)}{N}$$
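A matching sketch of this step, where resp[i, j] holds p(Cj|xi) from the Expectation Step (names again illustrative):

```python
import numpy as np

def m_step(X, resp):
    """Re-estimate (mu_j, Sigma_j) and p(C_j) from the responsibilities."""
    N = len(X)
    Nj = resp.sum(axis=0)                     # sum_i p(C_j|x_i) for each cluster
    mus = (resp.T @ X) / Nj[:, None]          # weighted means
    sigmas = np.array([
        (resp[:, j, None] * (X - mus[j])).T @ (X - mus[j]) / Nj[j]
        for j in range(resp.shape[1])         # weighted covariances
    ])
    priors = Nj / N                           # cluster priors p(C_j)
    return mus, sigmas, priors
```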

Page 15: EM Algorithm and its Applications

EM Algorithm

Boot Step: Initialize K clusters: C1, …, CK: (μj, Σj) and P(Cj) for each cluster j.

Iteration Step:

Expectation Step:

$$p(C_j \mid x_i) = \frac{p(x_i \mid C_j)\,p(C_j)}{p(x_i)} = \frac{p(x_i \mid C_j)\,p(C_j)}{\sum_j p(x_i \mid C_j)\,p(C_j)}$$

Maximization Step:

$$\mu_j = \frac{\sum_i p(C_j \mid x_i)\,x_i}{\sum_i p(C_j \mid x_i)} \qquad \Sigma_j = \frac{\sum_i p(C_j \mid x_i)\,(x_i-\mu_j)(x_i-\mu_j)^T}{\sum_i p(C_j \mid x_i)} \qquad p(C_j) = \frac{\sum_i p(C_j \mid x_i)}{N}$$
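Putting the two steps together gives the following compact, self-contained sketch of the whole loop; the random initialization and the log-likelihood stopping test are my own choices, as the slide does not fix them.

```python
import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, n_iter=100, tol=1e-6, seed=0):
    """Boot step plus alternating Expectation/Maximization steps."""
    rng = np.random.default_rng(seed)
    N, d = X.shape
    mus = X[rng.choice(N, size=K, replace=False)]   # boot: random means,
    sigmas = np.array([np.eye(d)] * K)              # identity covariances,
    priors = np.full(K, 1.0 / K)                    # uniform priors (assumed)
    prev_ll = -np.inf
    for _ in range(n_iter):
        # Expectation step: responsibilities p(C_j|x_i)
        weighted = np.column_stack([
            priors[j] * multivariate_normal.pdf(X, mus[j], sigmas[j])
            for j in range(K)
        ])
        resp = weighted / weighted.sum(axis=1, keepdims=True)
        # Maximization step: update (mu_j, Sigma_j) and p(C_j)
        Nj = resp.sum(axis=0)
        mus = (resp.T @ X) / Nj[:, None]
        for j in range(K):
            diff = X - mus[j]
            sigmas[j] = (resp[:, j, None] * diff).T @ diff / Nj[j]
        priors = Nj / N
        # Stop when the data log-likelihood no longer improves
        ll = np.log(weighted.sum(axis=1)).sum()
        if abs(ll - prev_ll) < tol:
            break
        prev_ll = ll
    return mus, sigmas, priors
```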

Page 16: EM Algorithm and its Applications

EM Demo

Demo: http://www.neurosci.aist.go.jp/~akaho/MixtureEM.html

Example: http://www-2.cs.cmu.edu/~awm/tutorials/gmm13.pdf

Page 17: EM Algorithm and its Applications

EM Applications

Blobworld: Image Segmentation Using Expectation-Maximization and its Application to Image Querying

Page 18: EM Algorithm and its Applications

Image Segmentation using EM

Step 1: Feature Extraction
Step 2: Image Segmentation using EM

Page 19: EM Algorithm and its Applications

Symbols

The feature vector for pixel i is called xi.
There are going to be K segments; K is given.
The j-th segment has a Gaussian distribution with parameters θj = (μj, Σj).
The αj's are the weights (which sum to 1) of the Gaussians. Θ is the collection of parameters: Θ = (α1, …, αK, θ1, …, θK).

Page 20: EM Algorithm and its Applications

Initialization

Each of the K Gaussians will have parameters θj = (μj, Σj), where μj is the mean of the j-th Gaussian and Σj is its covariance matrix.

The covariance matrices are initialized to the identity matrix.

The means can be initialized by finding the average feature vector in each of K windows in the image; this is data-driven initialization.
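A sketch of this initialization; splitting the image into K horizontal strips is my own choice of windows, since the slide only says K windows.

```python
import numpy as np

def init_gaussians(features, K):
    """features: (H, W, d) array of per-pixel feature vectors."""
    H, W, d = features.shape
    sigmas = np.array([np.eye(d)] * K)        # covariances start as identity
    mus = np.array([                          # mean feature vector per window
        features[k * H // K:(k + 1) * H // K].reshape(-1, d).mean(axis=0)
        for k in range(K)
    ])
    alphas = np.full(K, 1.0 / K)              # uniform weights (my assumption)
    return mus, sigmas, alphas
```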

Page 21: EM Algorithm and its Applications

E-Step

$$p(j \mid x_i, \Theta) = \frac{\alpha_j f_j(x_i \mid \theta_j)}{\sum_{k=1}^{K} \alpha_k f_k(x_i \mid \theta_k)}$$

$$f_j(x \mid \theta_j) = \frac{1}{(2\pi)^{d/2}\,|\Sigma_j|^{1/2}}\; e^{-\frac{1}{2}(x-\mu_j)^T \Sigma_j^{-1} (x-\mu_j)}$$
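These two formulas transcribe directly into NumPy. A sketch with illustrative names, computing the density by hand to mirror the slide's f_j rather than calling a library density:

```python
import numpy as np

def f(x, mu, sigma):
    """Multivariate Gaussian density f_j(x | theta_j)."""
    d = len(mu)
    diff = x - mu
    norm = (2 * np.pi) ** (d / 2) * np.linalg.det(sigma) ** 0.5
    return np.exp(-0.5 * diff @ np.linalg.solve(sigma, diff)) / norm

def e_step_pixel(x, mus, sigmas, alphas):
    """p(j | x_i, Theta) = alpha_j f_j(x_i) / sum_k alpha_k f_k(x_i)."""
    w = np.array([a * f(x, m, s) for a, m, s in zip(alphas, mus, sigmas)])
    return w / w.sum()
```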

Page 22: EM Algorithm and its Applications

M-Step

$$\mu_j^{new} = \frac{\sum_{i=1}^{N} x_i\, p(j \mid x_i, \Theta^{old})}{\sum_{i=1}^{N} p(j \mid x_i, \Theta^{old})}$$

$$\Sigma_j^{new} = \frac{\sum_{i=1}^{N} p(j \mid x_i, \Theta^{old})\,(x_i-\mu_j^{new})(x_i-\mu_j^{new})^T}{\sum_{i=1}^{N} p(j \mid x_i, \Theta^{old})}$$

$$\alpha_j^{new} = \frac{1}{N}\sum_{i=1}^{N} p(j \mid x_i, \Theta^{old})$$
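The same updates in code form, with resp[i, j] holding p(j | xi, Θold); a sketch rather than Blobworld's actual implementation.

```python
import numpy as np

def m_step(X, resp):
    N = len(X)
    Nj = resp.sum(axis=0)                     # sum_i p(j|x_i, Theta_old)
    mu_new = (resp.T @ X) / Nj[:, None]       # first update formula above
    sigma_new = np.array([
        (resp[:, j, None] * (X - mu_new[j])).T @ (X - mu_new[j]) / Nj[j]
        for j in range(resp.shape[1])         # second update formula above
    ])
    alpha_new = Nj / N                        # third update formula above
    return mu_new, sigma_new, alpha_new
```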

Page 23: EM Algorithm and its Applications

Sample Results

Page 24: EM Algorithm and its Applications

Object Class Recognition in CBIR

The Goal: Automatic image labeling (annotation) to enable object-based image retrieval

Page 25: EM Algorithm and its Applications

Problem Statement

Known: Some images and their corresponding descriptions:
{trees, grass, cherry trees}, {cheetah, trunk}, {mountains, sky}, {beach, sky, trees, water}

To solve: What object classes are present in new images (shown on the slide as unlabeled images marked "?")

Page 26: EM Algorithm and its Applications

Abstract Regions

Original Images | Color Regions | Texture Regions | Line Clusters

Page 27: EM Algorithm and its Applications

Boat, Water, Sky, …!

boat

building

Page 28: EM Algorithm and its Applications

Object Model Learning (Ideal)

Image regions labeled: sky, tree, water, boat
+
Sky, Tree, Water, Boat (region attributes → object)

Learned Models:
Water =
Sky =
Tree =
Boat =

Page 29: EM Algorithm and its Applications

Object Model Learning (Real)

{sky, tree, water, boat}
+
region attributes → object (the region-to-label correspondence is unknown, marked "?")

Learned Models:
Water =
Sky =
Tree =
Boat =

Page 30: EM Algorithm and its Applications

Model Initial Estimation

Estimate the initial model of an object using all the region features from all images that contain the object.

Tree
Sky

Page 31: EM Algorithm and its Applications

Initial Model for “trees” → EM Variant → Final Model for “trees”

Initial Model for “sky” → EM Variant → Final Model for “sky”

Page 32: EM Algorithm and its Applications

Object Model Learning

Assumptions:
The feature distribution of each object within a region is a Gaussian.
Each image is a set of regions, each of which can be modeled as a mixture of multivariate Gaussian distributions.

Page 33: EM Algorithm and its Applications

1. Initialization Step (Example)

Image & description: I1 contains {O1, O2}, I2 contains {O1, O3}, I3 contains {O2, O3}.

Initial object models $q_{O_1}^{(0)}, q_{O_2}^{(0)}, q_{O_3}^{(0)}$: each region contributes to every object named in its image's description with equal weight W = 0.5.

Page 34: EM Algorithm and its Applications

2. Iteration Step (Example)

E-Step: given the current models $q_{O_1}^{(p)}, q_{O_2}^{(p)}, q_{O_3}^{(p)}$, re-estimate the region weights; the weights sharpen (W = 0.8 vs. W = 0.2 on the slide).

M-Step: re-estimate the models $q_{O_1}^{(p+1)}, q_{O_2}^{(p+1)}, q_{O_3}^{(p+1)}$ from the re-weighted regions (I1 contains {O1, O2}, I2 contains {O1, O3}, I3 contains {O2, O3}).
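The slide shows only the weights, not the update rule itself. As one plausible sketch (entirely my assumption, including the weighted-mean model form), each object's model can be re-estimated from the regions weighted by W:

```python
import numpy as np

def update_models(regions, weights, objects):
    """regions: {image_id: [region feature vector, ...]}
    weights: {(image_id, region_index, object): W}  # the W values above
    Returns a weighted-mean feature model per object."""
    models = {}
    for o in objects:
        num, den = 0.0, 0.0
        for img, feats in regions.items():
            for r, x in enumerate(feats):
                w = weights.get((img, r, o), 0.0)
                num = num + w * np.asarray(x)
                den += w
        models[o] = num / den if den > 0 else None
    return models
```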

Page 35: EM Algorithm and its Applications

Image Labeling

Test Image → Color Regions; compare each region against the Object Model Database (Tree, Sky, …), giving p(tree | region) for every region.

To calculate p(tree | image):

$$p(o \mid I_a) = \max_{r \in F_{I_a}} p(o \mid r)$$

where $F_{I_a}$ is the set of regions in image $I_a$.
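In code this decision rule is a one-liner; p_obj_given_region below stands in for the comparison against the learned object models (the name is assumed):

```python
def p_object_in_image(obj, regions, p_obj_given_region):
    """p(o | I) = max over regions r in F_I of p(o | r)."""
    return max(p_obj_given_region(obj, r) for r in regions)

# e.g. label a test image "tree" when p_object_in_image("tree", regions, model_fn)
# exceeds a threshold (threshold choice is not specified on the slide).
```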

Page 36: EM Algorithm and its Applications

Experiments

860 images, 18 keywords: mountains (30), orangutan (37), track (40), tree trunk (43), football field (43), beach (45), prairie grass (53), cherry tree (53), snow (54), zebra (56), polar bear (56), lion (71), water (76), chimpanzee (79), cheetah (112), sky (259), grass (272), tree (361).

A set of cross-validation experiments (80% as the training set and the other 20% as the test set).

Page 37: EM Algorithm and its Applications

ROC Charts

[Two ROC charts: True Positive Rate vs. False Positive Rate, both axes from 0 to 1]
Left: Independent Treatment of Color and Texture
Right: Using Intersection of Color and Texture

Page 38: EM Algorithm and its Applications

Sample Results

cheetah

Page 39: EM Algorithm and its Applications

Sample Results (Cont.)

grass

Page 40: EM Algorithm and its Applications

Sample Results (Cont.)

lion