Post on 02-Jan-2017
CS 556: Computer Vision
Lecture 8
Prof. Sinisa Todorovic
sinisa@eecs.oregonstate.edu
Outline
• Object Recognition
• Image Classification
Object Recognition and Image Classification
Pipeline: training examples → feature extraction → classifier → car/non-car decision
How to Classify?
x — image feature
y — class label in {+1, −1}
F(x, y) — discriminant function
Linear Classifier
x — image feature
y — class label in {+1, −1}
F(x, y) — discriminant function

F(x, y) = sign(w · x + b)
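The linear discriminant can be sketched in a few lines of language-neutral Python (the course code is MATLAB); the weight vector w and bias b below are made-up values for illustration — in practice they come from training, e.g. an SVM solver.

```python
# Minimal sketch of the linear discriminant F(x, y) = sign(w . x + b).
# w and b are hypothetical learned parameters, chosen only for illustration.

def linear_classify(w, x, b):
    """Return +1 or -1 depending on which side of the hyperplane x falls."""
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if score >= 0 else -1

w = [2.0, -1.0]   # hypothetical learned weights
b = -0.5          # hypothetical learned bias

print(linear_classify(w, [1.0, 0.0], b))  # 2*1 - 1*0 - 0.5 = 1.5 > 0  -> +1
print(linear_classify(w, [0.0, 2.0], b))  # 2*0 - 1*2 - 0.5 = -2.5 < 0 -> -1
```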
Example Application: Pedestrian Detection
• Multiscale scanning windows
• For each window compute the wavelet transform
• Classify the window using SVM
“A general framework for object detection,” C. Papageorgiou, M. Oren and T. Poggio -- CVPR 98
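The first step above, generating multiscale scanning windows, can be sketched as follows (Python rather than MATLAB, for a language-neutral illustration); the base window size, scales, and stride are hypothetical choices, not values from the paper.

```python
# Sketch of multiscale scanning windows. Each yielded box is (x, y, w, h) in
# original-image coordinates; the wavelet features and SVM from the slides
# would then be computed per window. base/scales/stride are assumptions.

def scanning_windows(img_w, img_h, base=64, scales=(1.0, 1.5, 2.0), stride=16):
    for s in scales:
        win = int(base * s)     # window side length at this scale
        step = int(stride * s)  # step the stride along with the scale
        for y in range(0, img_h - win + 1, step):
            for x in range(0, img_w - win + 1, step):
                yield (x, y, win, win)

boxes = list(scanning_windows(128, 128))
print(len(boxes))  # 25 + 4 + 1 = 30 windows over the three scales
```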
Support Vector Machine — MATLAB
Training:

SVMModel = fitcsvm(X,Y,'KernelFunction','rbf',...
    'Standardize',true,...
    'ClassNames',{'negClass','posClass'});

Classification:

[label,score] = predict(SVMModel,newX);
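The 'KernelFunction','rbf' option selects the Gaussian (RBF) kernel. One common parameterization is k(x, z) = exp(−‖x − z‖² / (2σ²)) — MATLAB's kernel-scale convention differs slightly — sketched here in plain Python with an assumed σ = 1:

```python
import math

# Gaussian (RBF) kernel, k(x, z) = exp(-||x - z||^2 / (2 * sigma^2)).
# sigma = 1.0 is an assumed default, not a value from the slides.

def rbf_kernel(x, z, sigma=1.0):
    sq_dist = sum((xi - zi) ** 2 for xi, zi in zip(x, z))
    return math.exp(-sq_dist / (2.0 * sigma ** 2))

print(rbf_kernel([0.0, 0.0], [0.0, 0.0]))  # identical points -> 1.0
print(rbf_kernel([0.0, 0.0], [3.0, 4.0]))  # distance 5 -> exp(-12.5), near 0
```

The kernel equals 1 for identical points and decays toward 0 as the points move apart, which is what makes the SVM decision boundary nonlinear in the original feature space.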
SVM — MATLAB: Training
Specify training data:

rng(1); % For reproducibility
size_data1 = 100;
r = sqrt(rand(size_data1,1));       % Radius
t = 2*pi*rand(size_data1,1);        % Angle
data1 = [r.*cos(t), r.*sin(t)];     % Points

size_data2 = 100;
r2 = sqrt(3*rand(size_data2,1)+1);  % Radius
t2 = 2*pi*rand(size_data2,1);       % Angle
data2 = [r2.*cos(t2), r2.*sin(t2)]; % Points

figure;
plot(data1(:,1),data1(:,2),'r.','MarkerSize',15)
hold on
plot(data2(:,1),data2(:,2),'b.','MarkerSize',15)
ezpolar(@(x)1); ezpolar(@(x)2);
axis equal
hold off
SVM — MATLAB: Training
Specify training data:

X = [data1;data2];     % Matrix, where each row is one feature vector
size_data = size_data1 + size_data2;
Y = ones(size_data,1); % Vector of class labels, one per feature vector
Y(1:size_data1) = -1;

Conduct training:

SVMModel = fitcsvm(X,Y,'KernelFunction','rbf',...
    'Standardize',true,...
    'ClassNames',[-1,1]);

With 'Standardize',true, MATLAB centers and scales each column of the data X by the weighted column mean and standard deviation, respectively.
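What 'Standardize',true does can be sketched in NumPy (a language-neutral stand-in for the MATLAB internals, ignoring details such as observation weights and the exact variance normalization): each column is centered by its mean and scaled by its standard deviation.

```python
import numpy as np

# Column-wise standardization: after this, each column of Xs has
# (approximately) zero mean and unit standard deviation.
# The data matrix below is made up purely for illustration.

X = np.array([[1.0, 10.0],
              [3.0, 20.0],
              [5.0, 30.0]])

Xs = (X - X.mean(axis=0)) / X.std(axis=0)

print(Xs.mean(axis=0))  # ~[0, 0]
print(Xs.std(axis=0))   # [1, 1]
```

Standardizing matters for the RBF kernel because the kernel depends on Euclidean distances, which would otherwise be dominated by the columns with the largest raw scale.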
SVM — MATLAB: Prediction
Specify new data:

XTest = ; % Matrix, where each row is one feature vector
YTest = ; % Vector of ground-truth class labels for each feature vector

Classification:

[YTestHat,score] = predict(SVMModel,XTest);
The score is the confidence in the classification: the signed distance of the test feature to the SVM's decision boundary. Passing the score through the sigmoid function gives a normalized confidence:
• in the positive class "+1": 1 / (1 + e^(−score))
• in the negative class "−1": 1 / (1 + e^(score))
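The sigmoid normalization is easy to check numerically (Python sketch; the score value is hypothetical): the two class confidences are complementary and sum to 1.

```python
import math

# Sigmoid (logistic) normalization of a signed-distance score.

def sigmoid(s):
    return 1.0 / (1.0 + math.exp(-s))

score = 2.0             # hypothetical signed distance to the decision boundary
p_pos = sigmoid(score)  # normalized confidence in class +1
p_neg = 1.0 - p_pos     # confidence in class -1; equals sigmoid(-score)

print(p_pos + p_neg)    # 1.0
```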
Evaluation — MATLAB: Confusion Matrix
[C, order] = confusionmat(YTest,YTestHat)
confusionmat takes the ground-truth labels (first argument) and the predicted labels (second argument), and returns the confusion matrix C, with rows = ground truth and columns = predictions, together with the class ordering order.
Example:

g1 = [1 1 2 2 3 3]';   % Ground truth labels
g2 = [1 1 2 3 4 NaN]'; % Predicted labels; NaN is not counted

[C,order] = confusionmat(g1,g2)

C =
     2     0     0     0
     0     1     1     0
     0     0     0     1
     0     0     0     0

order =
     1
     2
     3
     4
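The same bookkeeping can be reproduced in a few lines of Python (None plays the role of MATLAB's NaN): count each (ground truth, prediction) pair into C, skipping uncounted predictions.

```python
# Pure-Python equivalent of the confusionmat example above:
# C[i][j] counts features with ground-truth class order[i]
# that were predicted as class order[j].

g1 = [1, 1, 2, 2, 3, 3]     # ground-truth labels
g2 = [1, 1, 2, 3, 4, None]  # predicted labels; None stands in for NaN

order = sorted(set(g1) | {p for p in g2 if p is not None})  # [1, 2, 3, 4]
idx = {c: i for i, c in enumerate(order)}

C = [[0] * len(order) for _ in order]
for truth, pred in zip(g1, g2):
    if pred is not None:  # NaN predictions are not counted
        C[idx[truth]][idx[pred]] += 1

print(C)  # [[2, 0, 0, 0], [0, 1, 1, 0], [0, 0, 0, 1], [0, 0, 0, 0]]
```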
Evaluation — MATLAB: Confusion Matrix (Alternative)
[c,cm,ind,per] = confusion(targets,outputs)
targets: binary matrix of ground-truth labels
outputs: binary matrix of predicted labels
(in both: rows = labels, columns = test features)
Evaluation — MATLAB: Example
g1 = [1 1 2 2 3 3 3 4 3 1]'; % Ground truth labels for 10 features
g2 = [1 1 2 3 4 2 3 4 2 4]'; % Predicted labels for 10 features

targets = zeros(4,10); % 4 labels and 10 features
for i=1:10
    targets(g1(i),i) = 1;
end

outputs = zeros(4,10); % 4 labels and 10 features
for i=1:10
    outputs(g2(i),i) = 1;
end

[c,cm,ind,per] = confusion(targets,outputs);
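With targets and outputs laid out as labels-by-features binary matrices, the confusion matrix is just a matrix product: cm(i,j) = Σₖ targets(i,k)·outputs(j,k), i.e. cm = targets · outputsᵀ. A NumPy sketch of the same example (Python as a language-neutral stand-in for the MATLAB above):

```python
import numpy as np

# One-hot targets/outputs matrices (4 labels x 10 features), then the
# confusion matrix as a single matrix product: cm[i, j] = number of
# features of true class i+1 predicted as class j+1.

g1 = [1, 1, 2, 2, 3, 3, 3, 4, 3, 1]  # ground-truth labels
g2 = [1, 1, 2, 3, 4, 2, 3, 4, 2, 4]  # predicted labels

targets = np.zeros((4, 10))
outputs = np.zeros((4, 10))
for i, (t, p) in enumerate(zip(g1, g2)):
    targets[t - 1, i] = 1
    outputs[p - 1, i] = 1

cm = targets @ outputs.T
print(cm.astype(int).tolist())
# [[2, 0, 0, 1], [0, 1, 1, 0], [0, 2, 1, 1], [0, 0, 0, 1]]
```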
Evaluation — MATLAB: Example
>> cm % confusion matrix (rows = ground truth, columns = predictions)

cm =
     2     0     0     1
     0     1     1     0
     0     2     1     1
     0     0     0     1

>> per % per-class rates, one row per class:
       % [false negative, false positive, true positive, true negative]

per =
    0.1250         0    1.0000    0.8750
    0.1429    0.6667    0.3333    0.8571
    0.3750    0.5000    0.5000    0.6250
         0    0.6667    0.3333    1.0000
Evaluation — Precision & Recall
Recall = TP / P = TP / (TP + FN)
Precision = TP / (TP + FP)
F1 = 2 · Precision · Recall / (Precision + Recall)
Evaluation — MATLAB: Example — Precision & Recall
>> for i=1:4
     Recall(i) = sum(outputs(i,:).*targets(i,:)) / sum(targets(i,:));
   end
>> Recall % recall for each class

Recall =
    0.6667    0.5000    0.2500    1.0000

>> for i=1:4
     Precision(i) = sum(outputs(i,:).*targets(i,:)) / sum(outputs(i,:));
   end
>> Precision % precision for each class

Precision =
    1.0000    0.3333    0.5000    0.3333

>> F1 = (2*Precision(:).*Recall(:)./(Precision(:)+Recall(:)))'

F1 =
    0.8000    0.4000    0.3333    0.5000
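The same per-class recall, precision, and F1 values can be reproduced in NumPy (a language-neutral check of the MATLAB session above), starting from the one-hot targets/outputs matrices of the earlier slides:

```python
import numpy as np

# Per-class recall = TP/(TP+FN), precision = TP/(TP+FP), and F1, computed
# from labels-by-features binary matrices as in the MATLAB example.

g1 = [1, 1, 2, 2, 3, 3, 3, 4, 3, 1]  # ground-truth labels
g2 = [1, 1, 2, 3, 4, 2, 3, 4, 2, 4]  # predicted labels

targets = np.zeros((4, 10))
outputs = np.zeros((4, 10))
for i, (t, p) in enumerate(zip(g1, g2)):
    targets[t - 1, i] = 1
    outputs[p - 1, i] = 1

tp = (targets * outputs).sum(axis=1)  # true positives per class
recall = tp / targets.sum(axis=1)     # row sums of targets = TP + FN
precision = tp / outputs.sum(axis=1)  # row sums of outputs = TP + FP
f1 = 2 * precision * recall / (precision + recall)

print(np.round(recall, 4).tolist())     # [0.6667, 0.5, 0.25, 1.0]
print(np.round(precision, 4).tolist())  # [1.0, 0.3333, 0.5, 0.3333]
print(np.round(f1, 4).tolist())         # [0.8, 0.4, 0.3333, 0.5]
```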