Outline ... Feature extraction Feature selection / Dim. reduction Classification ...
Outline ... Feature extraction (shape, texture)
Feature selection / Dim. reduction Classification ...
How to extract features? E.g.: [image grid of protein-localization patterns: ER, Tubulin, DNA, TfR, Actin, Nucleolin, Mito, LAMP, gpp130, giantin]
How to extract features? E.g.: [the same image grid], with candidate features: brightness(?), area(?)
Images - shapes
distance function: Euclidean, on the area, perimeter, and 20 'moments' [QBIC, '95]
Q: other 'features' / distance functions?
Images - shapes (A1: turning angle) A2: wavelets A3: morphology: dilations/erosions ...
Wavelets - example: http://grail.cs.washington.edu/projects/query/
Wavelets achieve *great* compression: [reconstructions from 20, 100, 400, and 16,000 coefficients]
Wavelets - intuition Edges (horizontal; vertical; diagonal)
Wavelets - intuition Edges (horizontal; vertical; diagonal) recurse
Wavelets - intuition Edges (horizontal; vertical; diagonal) http://www331.jpl.nasa.gov/public/wave.html
Wavelets Many wavelet bases:
Haar, Daubechies (-4, -6, -20), Gabor, ...
Daubechies D4 decomposition [original image and its wavelet transformation]
Gabor Function
We can extend the function to generate Gabor filters by rotating and dilating it.
Feature Calculation Preprocessing: background subtraction and thresholding; translation and rotation
Wavelet transformation: the Daubechies 4 wavelet, 10th-level decomposition; the average energy of the three high-frequency components
adv
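A minimal numpy-only sketch of the wavelet-energy features above. The slides use the Daubechies-4 wavelet at the 10th decomposition level; for brevity this sketch uses the simpler Haar wavelet and three levels, and the function names are illustrative:

```python
import numpy as np

def haar2d_step(img):
    """One level of a 2D Haar transform: returns the low-pass quarter
    and the three high-frequency detail quarters (H, V, D)."""
    a = img[0::2, 0::2]; b = img[0::2, 1::2]
    c = img[1::2, 0::2]; d = img[1::2, 1::2]
    low   = (a + b + c + d) / 4.0
    horiz = (a - b + c - d) / 4.0   # horizontal edges
    vert  = (a + b - c - d) / 4.0   # vertical edges
    diag  = (a - b - c + d) / 4.0   # diagonal edges
    return low, (horiz, vert, diag)

def wavelet_energy_features(img, levels=3):
    """Average energy of the three high-frequency components per level."""
    feats = []
    low = img.astype(float)
    for _ in range(levels):
        low, details = haar2d_step(low)
        feats.extend(float(np.mean(d ** 2)) for d in details)
    return feats

rng = np.random.default_rng(0)
img = rng.random((64, 64))
print(wavelet_energy_features(img, levels=3))  # 9 numbers: 3 levels x (H, V, D)
```

A constant image has zero energy in every detail band, which is a quick sanity check on the implementation.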
Feature Calculation 30 Gabor filters were generated using five different scales and six different orientations. Convolve an input image with a Gabor filter; take the mean and standard deviation of the convolved image: 60 Gabor texture features.
adv
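A sketch of those 60 Gabor texture features with numpy only. The specific wavelengths, kernel size, and the 0.56·λ bandwidth heuristic are illustrative assumptions, and the filtering here is circular convolution via the FFT:

```python
import numpy as np

def gabor_kernel(size, wavelength, theta, gamma=0.5):
    """Real (cosine-phase) Gabor kernel: a Gaussian envelope times a
    sinusoid, rotated by theta and dilated via the wavelength."""
    sigma = 0.56 * wavelength              # a common heuristic bandwidth
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr =  x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + (gamma * yr)**2) / (2 * sigma**2))
    return envelope * np.cos(2 * np.pi * xr / wavelength)

def gabor_features(img, wavelengths=(4, 6, 8, 12, 16), n_orient=6, ksize=15):
    """5 scales x 6 orientations = 30 filters; mean + std of each
    filtered image = 60 texture features."""
    img = img.astype(float)
    F = np.fft.fft2(img)
    feats = []
    for lam in wavelengths:                       # dilating
        for k in range(n_orient):                 # rotating
            g = gabor_kernel(ksize, lam, k * np.pi / n_orient)
            kern = np.zeros_like(img)
            kern[:g.shape[0], :g.shape[1]] = g
            resp = np.real(np.fft.ifft2(F * np.fft.fft2(kern)))  # circular conv.
            feats += [resp.mean(), resp.std()]
    return feats

rng = np.random.default_rng(0)
print(len(gabor_features(rng.random((64, 64)))))   # -> 60
```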
Wavelets: Extremely useful Excellent compression / feature extraction for natural images; fast to compute ( O(N) )
Images - shapes (A1: turning angle) A2: wavelets A3: morphology: dilations/erosions ...
Other shape features Morphology (dilations, erosions, openings, closings) [Korn+, VLDB96] — [shape (B/W) and 'structuring element', R=1]
Other shape features Morphology (dilations, erosions, openings, closings) [Korn+, VLDB96] — [shape and 'structuring elements' with R=0.5, 1, 2]
Morphology: closing — fill in small gaps; very similar to 'alpha contours'
Morphology: closing — fill in small gaps ['closing', with R=1]
Morphology: opening — 'closing', for the complement = trim small extremities
Morphology: opening ['opening' with R=1] — 'closing', for the complement = trim small extremities
Morphology Closing: fills in gaps; Opening: trims extremities. All wrt a structuring element.
Morphology Features: areas of openings (R=1, 2, …) and closings
Morphology resulting areas: 'pattern spectrum' — translation (and rotation) independent. As described: on b/w images; can be extended to grayscale ones (e.g., by thresholding)
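A sketch of the pattern-spectrum idea with numpy only. It uses a square structuring element and simplified border handling (the slides draw radius-R elements), so treat it as illustrative, not as the [Korn+, VLDB96] implementation:

```python
import numpy as np

def dilate(img, r=1):
    """Binary dilation with a (2r+1)x(2r+1) square structuring element,
    done with shifted copies (background outside the image is False)."""
    out = img.copy()
    p = np.pad(img, r)
    H, W = img.shape
    for dy in range(-r, r + 1):
        for dx in range(-r, r + 1):
            out |= p[r + dy : r + dy + H, r + dx : r + dx + W]
    return out

def erode(img, r=1):
    return ~dilate(~img, r)        # erosion/dilation duality

def pattern_spectrum(img, radii=(1, 2, 3)):
    """Areas of the openings (erosion then dilation) for growing
    structuring elements: the 'pattern spectrum' features."""
    return [int(dilate(erode(img, r), r).sum()) for r in radii]

# An 8x8 square plus one isolated pixel: opening trims the small extremity.
shape = np.zeros((20, 20), dtype=bool)
shape[5:13, 5:13] = True
shape[1, 1] = True
print(pattern_spectrum(shape))   # the isolated pixel vanishes at R=1
```

The square block survives every opening here, while the lone pixel is removed immediately, which is exactly the "trim small extremities" behavior described above.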
Conclusions Shape: wavelets; math. morphology. Texture: wavelets; Haralick texture features
References
Faloutsos, C., R. Barber, et al. (July 1994). "Efficient and Effective Querying by Image Content." J. of Intelligent Information Systems 3(3/4): 231-262.
Faloutsos, C. and K.-I. D. Lin (May 1995). FastMap: A Fast Algorithm for Indexing, Data-Mining and Visualization of Traditional and Multimedia Datasets. Proc. of ACM-SIGMOD, San Jose, CA.
Faloutsos, C., M. Ranganathan, et al. (May 25-27, 1994). Fast Subsequence Matching in Time-Series Databases. Proc. ACM SIGMOD, Minneapolis, MN.
Christos Faloutsos, Searching Multimedia Databases by Content, Kluwer, 1996.
Flickner, M., H. Sawhney, et al. (Sept. 1995). "Query by Image and Video Content: The QBIC System." IEEE Computer 28(9): 23-32.
Goldin, D. Q. and P. C. Kanellakis (Sept. 19-22, 1995). On Similarity Queries for Time-Series Data: Constraint Specification and Implementation. CP95, Cassis, France.
Charles E. Jacobs, Adam Finkelstein, and David H. Salesin. Fast Multiresolution Image Querying. SIGGRAPH '95, pages 277-286. ACM, New York, 1995.
Flip Korn, Nikolaos Sidiropoulos, Christos Faloutsos, Eliot Siegel, Zenon Protopapas: Fast Nearest Neighbor Search in Medical Image Databases. VLDB 1996: 215-226.
Outline ... Feature extraction Feature selection / Dim. reduction Classification ...
Outline ... Feature selection / Dim. reduction
PCA ICA Fractal Dim. reduction variants
...
Feature Reduction Remove non-discriminative features; remove redundant features. Benefits: speed, accuracy, multimedia indexing
SVD - Motivation [scatter plot: 'brightness' vs. 'area']
SVD – dim. reduction SVD gives the best axis v1 to project on: minimum RMS error [scatter plot: 'brightness' vs. 'area', with v1]
SVD - Definition A = U Λ V^T - example: [worked example, with axis v1]
SVD - Properties THEOREM [Press+92]: always possible to decompose matrix A into A = U Λ V^T, where
U, V: unique (*)
U, V: column orthonormal (i.e., columns are unit vectors, orthogonal to each other): U^T U = I; V^T V = I (I: identity matrix)
Λ: eigenvalues are positive, and sorted in decreasing order
math
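A small numpy sketch of these properties and of the projection on v1 (the 'area'/'brightness' toy data is an assumption, made strongly correlated so one axis suffices):

```python
import numpy as np

# Toy "area vs. brightness" data: strongly correlated, so one axis suffices.
rng = np.random.default_rng(1)
area = rng.normal(50, 10, size=100)
brightness = 0.8 * area + rng.normal(0, 1, size=100)
A = np.column_stack([area, brightness])
A = A - A.mean(axis=0)                    # center (PCA convention)

U, s, Vt = np.linalg.svd(A, full_matrices=False)
# Rows of Vt (columns of V) are orthonormal; s is sorted in decreasing order.
assert np.allclose(Vt @ Vt.T, np.eye(2))
assert s[0] >= s[1] >= 0

# Keep the single best axis v1: the projection with minimum RMS error.
A1 = (A @ Vt[0]).reshape(-1, 1) @ Vt[0].reshape(1, -1)
rms = np.sqrt(np.mean((A - A1) ** 2))
print(f"energy kept by v1: {s[0]**2 / (s**2).sum():.3f}, RMS error: {rms:.3f}")
```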
Outline ... Feature selection / Dim. reduction
PCA ICA Fractal Dim. reduction variants
...
ICA Independent Component Analysis
better than PCA; also known as 'blind source separation' (the 'cocktail party' problem)
Intuition behind ICA: [scatter plot of 'Zernike moment #1' vs. 'Zernike moment #2']
Motivating Application 2: Data analysis [the same scatter plot, with the PCA axis]
Motivating Application 2: Data analysis [the same scatter plot, with both the PCA and ICA axes]
Conclusions for ICA Better than PCA Actually, uses PCA as a first step!
Outline ... Feature selection / Dim. reduction
PCA ICA Fractal Dim. reduction variants
...
Fractal Dimensionality Reduction
1. Calculate the fractal dimensionality of the training data.
2. Forward-backward select features according to their impact on the fractal dimensionality of the whole data.
adv
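A sketch of estimating the intrinsic ('fractal') dimensionality via box counting. This is one simple estimator, not necessarily the one used in the cited work, and the cell sizes are illustrative:

```python
import numpy as np

def box_count_dim(X, sizes=(1/2, 1/4, 1/8, 1/16, 1/32)):
    """Box-counting estimate of the intrinsic ('fractal') dimensionality:
    the slope of log(#occupied cells) vs. log(1/cell size)."""
    X = (X - X.min(axis=0)) / (np.ptp(X, axis=0).max() + 1e-12)  # unit cube
    pts = []
    for s in sizes:
        cells = np.floor(X / s).astype(int)
        n = len(np.unique(cells, axis=0))
        pts.append((np.log(1 / s), np.log(n)))
    xs, ys = np.array(pts).T
    return np.polyfit(xs, ys, 1)[0]          # slope ~ fractal dimension

# Two attributes, one degree of freedom (y = x): FD ~ 1, so one
# attribute is redundant and can be dropped without losing information.
t = np.linspace(0, 1, 5000)
line = np.column_stack([t, t])
print(round(box_count_dim(line), 2))   # -> 1.0
```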
Dim. reduction
Spot and drop attributes with strong (non-)linear correlations. Q: how do we do that?
adv
Dim. reduction - w/ fractals [plots of y vs. x on [0,1]: (a) Quarter-circle, (b) Line, (c) Spike; annotation: 'not informative']
adv
Dim. reduction
Spot and drop attributes with strong (non-)linear correlations. Q: how do we do that? A: compute the intrinsic ('fractal') dimensionality ~ degrees of freedom
Dim. reduction - w/ fractals [plots (a) Quarter-circle, (b) Line, (c) Spike; partial fractal dimensions PFD~0 and PFD=1; global FD=1]
adv
Dim. reduction - w/ fractals [same plots; PFD=1 and PFD=1; global FD=1]
adv
Dim. reduction - w/ fractals [same plots; PFD~1 and PFD~1; global FD=1]
adv
Outline ... Feature selection / Dim. reduction
PCA ICA Fractal Dim. reduction variants
...
Nonlinear PCA [scatter plot of y vs. x]
adv
adv
Nonlinear PCA
X_{n×m} is the original data matrix: n points, m dimensions. Transform: X'_{n×m} = F(X_{n×m}) [scatter plot of y vs. x]
Kernel PCA
Kernel function: K(x_i, x_j) = ⟨Φ(x_i), Φ(x_j)⟩, e.g. K(x_i, x_j) = exp( -||x_i - x_j||² / (2σ²) )
[scatter plot: original axes x, y and transformed axes x', y']
adv
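A minimal kernel-PCA sketch with numpy, using the RBF kernel from the slide; the sigma, ring data, and function names are illustrative assumptions:

```python
import numpy as np

def kernel_pca(X, n_components=2, sigma=1.0):
    """Kernel PCA sketch: form the RBF kernel matrix
    K(x_i, x_j) = exp(-||x_i - x_j||^2 / (2 sigma^2)),
    center it in feature space, and eigendecompose."""
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    K = np.exp(-sq / (2 * sigma**2))
    n = len(X)
    J = np.eye(n) - np.ones((n, n)) / n
    Kc = J @ K @ J                            # double-centering
    w, V = np.linalg.eigh(Kc)                 # ascending eigenvalues
    idx = np.argsort(w)[::-1][:n_components]
    return V[:, idx] * np.sqrt(np.maximum(w[idx], 0))

# Two concentric rings: a classic case where *linear* PCA cannot help,
# since no linear projection separates them.
theta = np.linspace(0, 2 * np.pi, 60, endpoint=False)
inner = np.column_stack([np.cos(theta), np.sin(theta)])
X = np.vstack([inner, 3 * inner])
Z = kernel_pca(X, n_components=2, sigma=1.0)
print(Z.shape)   # -> (120, 2)
```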
Genetic Algorithm
Generation 1.1: 1 1 0 0 1 0 0 1 1 0 0 0 …   Generation 1.2: 0 1 1 0 0 0 0 1 1 0 1 0 …
↓ Mutation
Generation 2.1: 1 0 0 0 1 0 0 1 1 1 0 0 …   Generation 2.2: 0 1 1 0 0 1 0 1 1 0 1 0 …
↓ Crossover
Generation 3.1: 1 0 0 0 1 0 0 1 1 0 1 0 …   Generation 3.2: 0 1 1 0 0 1 0 1 1 1 0 0 …
Evaluation Function (Classifier)
adv
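A tiny genetic-algorithm sketch for feature selection over bit strings, with the evaluation -> selection -> crossover -> mutation loop from the slide. The fitness function here is hypothetical (in practice it would be a classifier's cross-validated accuracy), and all parameters are illustrative:

```python
import random

def evolve(n_features, fitness, pop_size=24, generations=40, p_mut=0.08, seed=0):
    """Tiny elitist GA over feature-selection bit strings."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_features)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)          # evaluation
        survivors = pop[: pop_size // 2]             # selection
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)
            cut = rng.randrange(1, n_features)       # one-point crossover
            child = a[:cut] + b[cut:]
            child = [g ^ (rng.random() < p_mut) for g in child]  # mutation
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)

# Hypothetical fitness: reward picking exactly features {0, 1, 2}.
target = {0, 1, 2}
def fitness(bits):
    chosen = {i for i, g in enumerate(bits) if g}
    return len(chosen & target) - len(chosen - target)

best = evolve(12, fitness)
```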
Stepwise Discriminant Analysis
1. Calculate Wilks' lambda and its corresponding F-statistic of the training data.
2. Forward-backward select features according to the F-statistics.
Λ(m) = |W(X)| / |T(X)|,  X = [X_1, X_2, …, X_m]
F_to-enter = ((n − q − m)/(q − 1)) · ( (1 − Λ(m+1)/Λ(m)) / (Λ(m+1)/Λ(m)) )
adv
References
Berry, Michael: http://www.cs.utk.edu/~lsi/
Duda, R. O. and P. E. Hart (1973). Pattern Classification and Scene Analysis. New York, Wiley.
Faloutsos, C. (1996). Searching Multimedia Databases by Content, Kluwer Academic Inc.
Foltz, P. W. and S. T. Dumais (Dec. 1992). "Personalized Information Delivery: An Analysis of Information Filtering Methods." Comm. of ACM (CACM) 35(12): 51-60.
Fukunaga, K. (1990). Introduction to Statistical Pattern Recognition, Academic Press.
Jolliffe, I. T. (1986). Principal Component Analysis, Springer Verlag.
Aapo Hyvarinen, Juha Karhunen, and Erkki Oja, Independent Component Analysis, John Wiley & Sons, 2001.
Korn, F., A. Labrinidis, et al. (2000). "Quantifiable Data Mining Using Ratio Rules." VLDB Journal 8(3-4): 254-266.
Jia-Yu Pan, Hiroyuki Kitagawa, Christos Faloutsos, and Masafumi Hamamoto. AutoSplit: Fast and Scalable Discovery of Hidden Variables in Stream and Multimedia Databases. PAKDD 2004.
Press, W. H., S. A. Teukolsky, et al. (1992). Numerical Recipes in C, Cambridge University Press.
Strang, G. (1980). Linear Algebra and Its Applications, Academic Press.
Caetano Traina Jr., Agma Traina, Leejay Wu and Christos Faloutsos, Fast Feature Selection Using the Fractal Dimension. XV Brazilian Symposium on Databases (SBBD), Paraiba, Brazil, October 2000.
Outline ... Feature extraction Feature selection / Dim. reduction Classification ...
Outline ... Feature extraction Feature selection / Dim. reduction Classification
classification trees support vector machines mixture of experts
[image grid of labeled patterns: Nucleolar, Mitoch., Actin, Tubulin, Endosomal, and an unknown '???']
[examples labeled '+' and '-', and an unknown '???']
Decision trees - Problem [label the unknown '??']
Decision trees Pictorially, we have: [scatter of '+' and '-' points over num. attr#1 (e.g., 'area') vs. num. attr#2 (e.g., 'brightness')]
Decision trees and we want to label '?' [the same scatter, plus an unlabeled point '?']
Decision trees so we build a decision tree: [the same scatter, split at area=50 and brightness=40]
Decision trees so we build a decision tree:
area < 50?
├─ Y: bright. < 40?
│  ├─ Y: +
│  └─ N: -
└─ N: ...
[alongside the scatter, split at area=50 and brightness=40]
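The tree on the slide, written out as nested ifs. The exact branch orientation is one plausible reading of the slide's layout, so treat the labels as illustrative:

```python
def classify(area, brightness):
    """One plausible reading of the slide's tree; the '...' subtree is
    left unspecified on the slide, so it stays a stub here."""
    if area < 50:                 # area < 50 ?
        if brightness < 40:       #   Y: brightness < 40 ?
            return "+"            #     Y: '+'
        return "-"                #     N: '-'
    return "..."                  #   N: the slide's unspecified '...' subtree

print(classify(30, 30), classify(30, 45), classify(60, 30))   # -> + - ...
```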
Decision trees Goal: split address space in (almost) homogeneous regions [the same tree and scatter]
Details - Variations
Pruning: to avoid over-fitting
AdaBoost: re-train, on the samples that the first classifier failed
Bagging: draw k samples (with replacement); train k classifiers; majority vote
adv
AdaBoost It creates new and improved base classifiers during training by manipulating the training dataset: at each iteration it feeds the base classifier a different distribution of the data, to focus the base classifier on hard examples. The final classifier is a weighted sum of all base classifiers:
D_{t+1}(i) = D_t(i) · exp( −α_t y_i h_t(x_i) ) / Z_t
H(x) = sign( Σ_{t=1..T} α_t h_t(x) )
adv
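A small numpy sketch of AdaBoost following the update above, with decision stumps assumed as the weak learner (the slide does not fix one). The 1-D toy data is chosen so that no single stump is perfect but three boosting rounds fit it:

```python
import numpy as np

def stumps(X):
    """All axis-aligned decision stumps (feature, threshold, sign)."""
    return [(j, thr, s)
            for j in range(X.shape[1])
            for thr in np.unique(X[:, j])
            for s in (1, -1)]

def stump_predict(X, j, thr, s):
    return s * np.where(X[:, j] <= thr, 1, -1)

def adaboost(X, y, T=3):
    """AdaBoost: reweight D_{t+1}(i) = D_t(i) exp(-alpha_t y_i h_t(x_i)) / Z_t,
    final classifier H(x) = sign(sum_t alpha_t h_t(x))."""
    n = len(y)
    D = np.full(n, 1.0 / n)
    ensemble = []
    for _ in range(T):
        h = min(stumps(X), key=lambda h: D[stump_predict(X, *h) != y].sum())
        pred = stump_predict(X, *h)
        eps = max(D[pred != y].sum(), 1e-12)    # weighted error
        if eps >= 0.5:
            break
        alpha = 0.5 * np.log((1 - eps) / eps)
        D = D * np.exp(-alpha * y * pred)
        D /= D.sum()                            # Z_t: renormalize
        ensemble.append((alpha, h))
    return ensemble

def predict(ensemble, X):
    score = sum(a * stump_predict(X, *h) for a, h in ensemble)
    return np.where(score >= 0, 1, -1)

# 1-D toy data a single stump cannot fit: an interval of '-' inside '+'.
X = np.array([[0.], [1.], [2.], [3.], [4.], [5.]])
y = np.array([1, 1, -1, -1, 1, 1])
ens = adaboost(X, y, T=3)
errors = int((predict(ens, X) != y).sum())
print(errors)   # -> 0
```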
Bagging Uses another strategy to manipulate the training data: bootstrap resampling with replacement. About 63.2% of the original training examples are retained in each bootstrapped set. Good for training unstable base classifiers such as neural networks and decision trees. The final classifier is a (weighted) sum of all base classifiers.
adv
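The 63.2% figure follows from sampling n items with replacement: each example is missed with probability (1 − 1/n)^n ≈ 1/e, so about 1 − 1/e ≈ 63.2% of the examples appear. A quick empirical check:

```python
import random

# A bootstrap sample of size n, drawn with replacement, retains
# about 1 - 1/e ~ 63.2% unique examples.
rng = random.Random(42)
n = 10000
sample = [rng.randrange(n) for _ in range(n)]
frac_unique = len(set(sample)) / n
print(f"{frac_unique:.3f}")   # close to 0.632
```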
Conclusions - Practitioner's guide:
Many available implementations, e.g., C4.5 (freeware), C5.0; also inside larger stat. packages
Advanced ideas: boosting, bagging
Recent, scalable methods: see [Mitchell] or [Han+Kamber] for details
References Tom Mitchell, Machine Learning, McGraw Hill, 1997.
Jiawei Han and Micheline Kamber, Data Mining: Concepts and Techniques, Morgan Kaufmann, 2000.
Outline ... Feature extraction Feature selection / Dim. reduction Classification
classification trees support vector machines mixture of experts
Problem: Classification we want to label '?' [scatter of '+' and '-' points over num. attr#1 (e.g., area) vs. num. attr#2 (e.g., bright.), plus an unlabeled '?']
Support Vector Machines (SVMs) we want to label '?' - linear separator?? [the same scatter, with candidate separating lines]
Support Vector Machines (SVMs) we want to label '?' - linear separator?? A: the one with the widest corridor! [scatter with the max-margin separator]
Support Vector Machines (SVMs) A: the one with the widest corridor! The points on the corridor's boundary are the 'support vectors'. [same scatter, support vectors highlighted]
Support Vector Machines (SVMs) Q: what if + and - are not separable? A: penalize mis-classifications [scatter with one '+' on the wrong side]
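A sketch of a soft-margin *linear* SVM trained by stochastic sub-gradient descent (Pegasos-style), not the quadratic-programming formulation the SVM literature usually presents; the toy '+'/'−' clouds and all parameters are illustrative:

```python
import numpy as np

def linear_svm(X, y, lam=0.01, epochs=200, seed=0):
    """Minimize lam/2 ||w||^2 + mean hinge loss: margin violations
    (including mis-classifications) are penalized, not forbidden."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    t = 0
    for _ in range(epochs):
        for i in rng.permutation(n):
            t += 1
            eta = 1.0 / (lam * t)                  # decaying step size
            if y[i] * (X[i] @ w + b) < 1:          # inside the corridor
                w = (1 - eta * lam) * w + eta * y[i] * X[i]
                b += eta * y[i]
            else:
                w = (1 - eta * lam) * w
    return w, b

# Toy separable '+'/'-' clouds in (area, brightness) space.
rng = np.random.default_rng(3)
pos = rng.normal([2, 2], 0.3, size=(20, 2))
neg = rng.normal([-2, -2], 0.3, size=(20, 2))
X = np.vstack([pos, neg])
y = np.array([1] * 20 + [-1] * 20)
w, b = linear_svm(X, y)
acc = np.mean(np.sign(X @ w + b) == y)
```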
Support Vector Machines (SVMs) Q: how about non-linear separators? A: possible (but need human) [scatter with a non-linear separator]
adv
Performance training: O(Ns^3 + Ns^2 L + Ns L d) to O(d L^2), where Ns: # of support vectors; L: size of training set; d: dimensionality
adv
Performance classification: O(M Ns), where Ns: # of support vectors; M: # of operations to compute the similarity (~ inner product ~ 'kernel')
adv
References C.J.C. Burges: A Tutorial on Support Vector Machines for Pattern Recognition. Data Mining and Knowledge Discovery 2, 121-167, 1998.
Nello Cristianini and John Shawe-Taylor. An Introduction to Support Vector Machines. Cambridge University Press, Cambridge, UK, 2000.
software: http://svmlight.joachims.org/ http://www.kernel-machines.org/
Outline ... Feature extraction Feature selection / Dim. reduction Classification
classification trees support vector machines mixture of experts
Mixture of experts Train several classifiers; use a (weighted) majority vote scheme
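The weighted majority vote above in a few lines; the three "expert" vote vectors and their weights are hypothetical:

```python
import numpy as np

def weighted_vote(predictions, weights):
    """Combine several classifiers' +/-1 votes by a weighted majority."""
    score = np.tensordot(weights, predictions, axes=1)
    return np.where(score >= 0, 1, -1)

# Three hypothetical experts voting on four samples (weights ~ accuracies).
preds = np.array([[ 1,  1, -1, -1],
                  [ 1, -1, -1,  1],
                  [-1,  1, -1, -1]])
w = np.array([0.5, 0.3, 0.2])
print(weighted_vote(preds, w))   # -> [ 1  1 -1 -1]
```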
Conclusions: 6 powerful tools:
shape & texture features: wavelets; mathematical morphology
Dim. reduction: SVD/PCA; ICA
Classification: decision trees; SVMs