Research Results & Plans

Transcript of Research Results & Plans

Page 1: Research Results & Plans

Research Results & Plans

Nojun Kwak, [email protected]
Nov. 15, 2006

Page 2: Research Results & Plans


Research Areas: Overview

Pattern recognition
- Feature selection / extraction algorithms
- Face recognition / image analysis
- Neural networks, machine learning, data mining

Robotics
- Control of robot manipulators
- Implementation and control of a biped robot

Home network middleware
- UPnP / OSGi / IEEE 1394 device driver
- Home robot

Wireless communication protocols
- WCDMA protocol
- LTE (Long Term Evolution) system

Page 3: Research Results & Plans

Feature selection & extraction

Page 4: Research Results & Plans


Issues in Pattern Recognition

- Collect data: What kind of sensors should we use? How do we collect the data?
- Choose features: How do we know which features to select, and how do we select them? (Transforms, PCA, LDA, ICA, ...)
- Choose model: What type of classifier / neural network should we use? Is there a best classifier?
- Train system: How do we train it?
- Evaluate system: How do we evaluate performance? Validate the results? Quantify confidence in a decision?

from Robi Polikar's tutorial at ICANN 2006

Page 5: Research Results & Plans


Feature Selection & Extraction

We solve a pattern recognition problem with N features. Is it possible to do the same job with fewer (M < N) input features? That is the goal of feature selection & extraction.

Advantages of feature selection & extraction:
- Simpler structure of the PR (classifier / regression) system
- Fewer data needed for training
- Shorter training time (possibly)
- Better performance (Occam's razor)

[Diagram: Original dataset (N features) → Feature selection / extraction → New dataset (M features)]
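
To make the N→M idea concrete, here is a minimal sketch (mine, not from the slides) using scikit-learn as a stand-in toolkit; the synthetic dataset, the choice of mutual_info_classif as the scoring function, and M = 5 are all illustrative assumptions.

```python
# Minimal sketch: selecting M of N features with a mutual-information criterion.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, mutual_info_classif

N, M = 20, 5                                    # original and reduced feature counts
X, y = make_classification(n_samples=500, n_features=N,
                           n_informative=5, random_state=0)

selector = SelectKBest(score_func=mutual_info_classif, k=M)
X_new = selector.fit_transform(X, y)            # "new dataset" with M features
print(X_new.shape)                              # (500, 5)
print(selector.get_support(indices=True))       # indices of the selected features
```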

Page 6: Research Results & Plans


Various criteria for the goodness of features

- Comparing PR (classification) performance directly: slow, slow, slow ...
- Correlation (with the target / class): how to cope with non-linearity? (e.g. f = t^2)
- Mutual information (advantages over the others): maximize the mutual information I(C;F) between the target / class and the feature.
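
A small worked example of the non-linearity point above (my own illustration): for f = t^2 with t symmetric around zero, the linear correlation is essentially zero while a mutual-information estimate is clearly positive. The kNN-based estimator from scikit-learn is used here only for convenience.

```python
# Correlation misses f = t^2; mutual information does not.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

rng = np.random.default_rng(0)
t = rng.uniform(-1, 1, 2000)          # target
f = t ** 2                            # feature, purely nonlinear in t

print(np.corrcoef(f, t)[0, 1])        # ~0: linear correlation sees no dependence
print(mutual_info_regression(f.reshape(-1, 1), t)[0])  # > 0: MI detects it
```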

Page 7: Research Results & Plans


Methods

Feature selection: TMFS, MIFS-U, PWFS
Feature extraction: ICA-FX, PWFX

- TMFS: Taguchi Method for Feature Selection
- MIFS-U: Mutual Information Feature Selector under the Uniform Information Assumption
- PWFS: Parzen Window Feature Selector
- ICA-FX: Feature Extraction with Independent Component Analysis
- PWFX: Parzen Window Feature Extractor

All of these showed good performance.

Page 8: Research Results & Plans

Feature selection methods

Page 9: Research Results & Plans


TMFS: Main idea

- Taguchi method (orthogonal arrays)
- Measure = actual performance of a classifier
- Design of Experiments (DoE): reduces the number of experiments

Page 10: Research Results & Plans


MIFS-U: Main idea

Ideal greedy selection maximizes area 2+3+4 = I(f,s;C). Since area 2+4 is common to all candidate features, this is equivalent to maximizing area 3 = I(f;C|s).

[Venn diagram of H(f), H(s), and H(C), with the overlaps I(f;s), I(f;C), I(s;C) and the regions labeled 1-4.]
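
As a rough sketch of how such a greedy selector can be coded (my paraphrase, not the authors' implementation): each step picks the candidate feature that maximizes I(C; f) minus a redundancy penalty of the form beta * sum_s [I(C; s) / H(s)] * I(f; s) over the already-selected features s. The precomputed MI/entropy arrays, beta, and k are assumptions.

```python
# Hedged sketch of MIFS-U-style greedy selection from precomputed estimates.
import numpy as np

def mifs_u_select(I_Cf, I_ff, H_f, k, beta=1.0):
    """I_Cf: (N,) estimated I(C; f_i); I_ff: (N, N) estimated I(f_i; f_j);
    H_f: (N,) estimated entropies H(f_i); k: number of features to select."""
    selected, remaining = [], list(range(len(I_Cf)))
    for _ in range(k):
        def score(i):
            redundancy = sum((I_Cf[s] / H_f[s]) * I_ff[i, s] for s in selected)
            return I_Cf[i] - beta * redundancy
        best = max(remaining, key=score)        # greedy choice at this step
        selected.append(best)
        remaining.remove(best)
    return selected
```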

Page 11: Research Results & Plans


PWFS: Main idea

- Parzen window: approximation of a PDF from a finite number of samples.
- Numerical method for computing I(C;F).

[Figure: Parzen-window estimate of p(f) built from sample points along the f axis.]
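
A minimal Parzen-window sketch (assuming a Gaussian kernel and a hand-picked bandwidth h; both are my choices, not the slide's):

```python
# Parzen-window estimate of a 1-D density p(f) from finite samples.
import numpy as np

def parzen_pdf(f, samples, h=0.3):
    """Average of Gaussian kernels of width h centered at the samples."""
    z = (f[:, None] - samples[None, :]) / h
    kernels = np.exp(-0.5 * z ** 2) / (np.sqrt(2 * np.pi) * h)
    return kernels.mean(axis=1)

samples = np.concatenate([np.random.normal(-1.0, 0.3, 50),
                          np.random.normal(1.5, 0.5, 50)])
grid = np.linspace(-3, 4, 200)
p = parzen_pdf(grid, samples)       # density values along the f axis
```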

Page 12: Research Results & Plans

Feature extraction methods

Page 13: Research Results & Plans


Previous Works

Linear subspace methods:
- PCA (Principal Component Analysis)
- ICA (Independent Component Analysis)
- LDA (Linear Discriminant Analysis)

Nonlinear methods:
- Kernel methods (K-PCA, K-LDA)
- Hidden neurons of an MLP
- SVM (Support Vector Machine)

Learning \ Statistics | Second order | Higher order
Unsupervised          | PCA          | ICA
Supervised            | LDA (FLD)    | PWFX, ICA-FX

Page 14: Research Results & Plans


PWFX: Main idea

Almost the same as PWFS: maximize the mutual information between the new feature f = wx and the class C, i.e. argmax_w I(wx;C).
- Direct calculation of mutual information, approximated by Parzen-window density estimation.
- Stochastic gradient ascent for finding w.
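
The hedged sketch below mirrors the shape of that procedure without reproducing the paper's derivation: it ascends a crude histogram-based estimate of I(w·x; C) (standing in for the Parzen-window estimate) using finite-difference gradients on mini-batches. Labels are assumed to be integers 0..K-1, and all hyperparameters are made up.

```python
# Hedged sketch: stochastic gradient ascent on an MI estimate of I(w.x; C).
import numpy as np

def mi_estimate(f, y, bins=10):
    """Histogram estimate of I(f; C) for a 1-D feature f and integer labels y."""
    edges = np.histogram_bin_edges(f, bins=bins)
    fb = np.clip(np.digitize(f, edges) - 1, 0, bins - 1)
    joint = np.zeros((bins, int(y.max()) + 1))
    for b, c in zip(fb, y):
        joint[b, c] += 1
    joint /= joint.sum()
    pf, pc = joint.sum(1, keepdims=True), joint.sum(0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log(joint[nz] / (pf @ pc)[nz])))

def pwfx_like(X, y, lr=0.1, eps=1e-3, epochs=50, batch=64, seed=0):
    rng = np.random.default_rng(seed)
    w = rng.normal(size=X.shape[1])
    for _ in range(epochs):
        idx = rng.choice(len(X), size=min(batch, len(X)), replace=False)
        grad = np.zeros_like(w)
        for j in range(len(w)):                     # finite-difference gradient
            dw = np.zeros_like(w)
            dw[j] = eps
            grad[j] = (mi_estimate(X[idx] @ (w + dw), y[idx])
                       - mi_estimate(X[idx] @ (w - dw), y[idx])) / (2 * eps)
        w += lr * grad
        w /= np.linalg.norm(w)                      # keep the projection normalized
    return w
```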

Page 15: Research Results & Plans


ICA-FX: ICA preliminary (I)

Purpose: from the observations, estimate the sources.

Assumptions:
- The sources S = (S1, ..., SN) are mutually independent.
- The observations X = (X1, ..., XN) are linear combinations of S: X = AS.
- The mixing matrix A is invertible, so we look for W = A^{-1} such that the estimates U = (U1, ..., UN) = WX recover S.

[Diagram: S → mixing A → X → unmixing W → U. How do we make U = S?]

Page 16: Research Results & Plans


ICA-FX: ICA preliminary (II)

Structure of ICA learning: gi(.) is assumed to be the cumulative density of ui.

[Diagram: inputs x1, ..., xN pass through the unmixing matrix W to give u1, ..., uN; each ui is then passed through its nonlinearity gi(.) to give yi.]

Page 17: Research Results & Plans


ICA-FX: Main idea (I)

- ICA shows good performance in finding independent components by utilizing higher-order statistics.
- However, it is not well suited to supervised learning because it does not utilize class / target information.
- Idea: include the class / target information in the learning (make use of the characteristics of supervised learning) by adding the target information as an input to ICA.

Page 18: Research Results & Plans


ICA-FX: Main idea (II)

Feature extraction problem for classification (FX problem): given x, the N zero-mean, normalized input features, and the class information c (Nc = number of classes), find new features from x containing maximal information about the class c.

Equality holds if fb is independent of c. Interpreting the FX problem in the structure of ICA, find the transform that maximizes this information. [The exact objective appeared as equations on the original slide.]

Page 19: Research Results & Plans


ICA-FX: Structure (I)

f is the feature vector, among which fa (the first M components) will be used as the important features. If fb is independent of c, the structure reduces to normal ICA.

[The block structure of the transform was shown as an equation on the original slide.]

Page 20: Research Results & Plans


ICA-FX: Structure (II)

- fa = [f1, ..., fM]: carries maximal information about c
- fb = [fM+1, ..., fN] = [uM+1, ..., uN]
- c = [uN+1, ..., uN+Nc]: independent of fb

Page 21: Research Results & Plans


ICA-FX: Learning algorithm (ML)

Maximum-likelihood learning: the log-likelihood is maximized by gradient ascent using the natural gradient. [The update equations appeared as figures on the original slide.]
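
For background, here is a sketch of the standard natural-gradient maximum-likelihood update for plain ICA (Amari-style), dW proportional to (I - E[phi(u)u^T])W with phi = tanh under a super-Gaussian source assumption. The actual ICA-FX update, which additionally handles the class units, is in the paper and is not reproduced here.

```python
# Natural-gradient ML update for plain ICA (background sketch only).
import numpy as np

def natural_gradient_ica(X, lr=0.01, epochs=200, seed=0):
    """X: (n_samples, n_dims) zero-mean data (whitening helps).  Returns W."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    W = np.eye(n) + 0.01 * rng.standard_normal((n, n))
    for _ in range(epochs):
        U = X @ W.T                              # current source estimates u = Wx
        phi = np.tanh(U)                         # score function (super-Gaussian assumption)
        # natural gradient of the log-likelihood: (I - E[phi(u) u^T]) W
        dW = (np.eye(n) - (phi.T @ U) / len(X)) @ W
        W += lr * dW
    return W
```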

Page 22: Research Results & Plans


ICA-FX: Results (I)

Procedure: training and new face images are projected onto the extracted features (f1, f2, f3), and a new image is classified by 1-NN.

[Figure: in feature space, the distances d1, d2, d3 from a new image to the training images (e.g. of John and Tom) are compared; d1 < min(d2, d3) indicates good features.]

Page 23: Research Results & Plans


ICA-FX: Results (II)

Yale data (165 images, 15 persons)

AT&T data (400 images, 40 persons)

Page 24: Research Results & Plans

Surface Defect Detection (POSCO)

Page 25: Research Results & Plans


Data (Surface Defects)

[Example images: scratch, dirt]

Defect type | Training | Test | Total
Scratch     | 46       | 76   | 122
Dull mark   | 16       | 14   | 30
Roll mark   | 4        | 8    | 12
Scale       | 8        | 9    | 17
Dirt        | 28       | 24   | 52
Slip mark   | 18       | 19   | 37
Total       | 119      | 150  | 269

Page 26: Research Results & Plans


Procedure (Wavelet) & Results

Pipeline: Surface image → Preprocessing (LPF & segmentation) → Adaptive wavelet packet (feature extraction) → Neural networks (MLP / RBF, classification)

Features:
- hr: row entropy
- hc: column entropy
- h: total entropy
- ratio: hr / hc
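
A hedged sketch of the entropy features (my own formulation; the project's exact definitions may differ), computed from a 2-D array of wavelet-packet coefficients or an image patch:

```python
# Entropy-style features hr, hc, h and their ratio from a 2-D coefficient array.
import numpy as np

def shannon_entropy(p):
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def entropy_features(coeffs):
    """coeffs: 2-D array (e.g. wavelet-packet coefficients of a surface image)."""
    energy = coeffs ** 2
    hr = shannon_entropy(energy.sum(axis=1))   # row entropy
    hc = shannon_entropy(energy.sum(axis=0))   # column entropy
    h = shannon_entropy(energy.ravel())        # total entropy
    return np.array([hr, hc, h, hr / hc])      # [hr, hc, h, ratio]
```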

Page 27: Research Results & Plans

Rail-car Inspection (UIUC)

Page 28: Research Results & Plans


Rail-car Inspection: Methods

Template-matching technique: segmentation, edge detection, template matching, and thresholding.
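
A minimal sketch of that pipeline with OpenCV (cv2); the file names, Canny thresholds, and the 0.6 match threshold are placeholders, and matching on edge maps is just one reasonable choice:

```python
# Edge detection + template matching + thresholding on the match score.
import cv2

image = cv2.imread("railcar.png", cv2.IMREAD_GRAYSCALE)      # hypothetical input
template = cv2.imread("template.png", cv2.IMREAD_GRAYSCALE)  # hypothetical template

edges = cv2.Canny(image, 100, 200)                           # edge detection
tmpl_edges = cv2.Canny(template, 100, 200)

result = cv2.matchTemplate(edges, tmpl_edges, cv2.TM_CCOEFF_NORMED)
_, max_val, _, max_loc = cv2.minMaxLoc(result)               # best match score / position

if max_val > 0.6:                                            # threshold on the match score
    print("component found at", max_loc)
```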

Page 29: Research Results & Plans


Rail-car Inspection: Results

Page 30: Research Results & Plans

Control of Robot Manipulators

Page 31: Research Results & Plans


Motivation

Characteristics of the robot system: the dynamics M(q)q'' + C(q,q')q' + g(q) = τ are nonlinear with modeling uncertainties, so the exact dynamics are hard to obtain.

Conventional control methods:
- Neural networks: slow in parameter updating.
- Robust control (feedback-linearization approach): requires strict bounds on the modeling error and disturbance.

Goal: fast, yet robust to poor parameter estimation → MBDA (Model-Based Disturbance Attenuation).
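
As context only (this is not the MBDA control law), the sketch below simulates a 1-DOF arm with dynamics of the form M(q)q'' + C(q,q')q' + g(q) = τ and drives it with model-based feedforward plus PD feedback, using deliberately wrong model parameters to mimic modeling uncertainty; every number is an illustrative assumption.

```python
# 1-DOF pendulum-like arm: model-based feedforward + PD feedback under model error.
import numpy as np

m, l, b, g0 = 1.0, 0.5, 0.1, 9.81            # true plant parameters
m_hat, l_hat = 0.8 * m, 1.1 * l              # poor model estimates
Kp, Kd = 100.0, 20.0                         # PD gains
dt, T = 1e-3, 3.0

q, dq = 0.0, 0.0
for k in range(int(T / dt)):
    t = k * dt
    qd, dqd, ddqd = np.sin(t), np.cos(t), -np.sin(t)      # desired trajectory
    # model-based feedforward (inertia + gravity with model parameters) + PD feedback
    tau = (m_hat * l_hat**2 * ddqd + m_hat * g0 * l_hat * np.sin(qd)
           + Kp * (qd - q) + Kd * (dqd - dq))
    # true plant: M(q) q'' + C q' + g(q) = tau
    ddq = (tau - b * dq - m * g0 * l * np.sin(q)) / (m * l**2)
    dq += ddq * dt
    q += dq * dt
print("final tracking error:", qd - q)
```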

Page 32: Research Results & Plans


Structure of MBDA

[Block diagram: the plant P and the nominal model M run in parallel; PD gains K, K1 and the D gain K2 combine the errors among the desired, model, and plant positions to form the input torques.]

Notation:
- q, q0, qd: position vectors of the plant, the model, and the desired position
- τ, τ0: input torque for the plant and the model
- K, K1: PD gains
- K2: D gain
- P, M: plant and model

Page 33: Research Results & Plans


Performance on a Real Biped Robot

[Photos: initial & final positions.]

We obtained almost the same results as in simulation.

Page 34: Research Results & Plans

Biped Robot: Simulation & Implementation

Page 35: Research Results & Plans


Biped Robot: presented at the 1st Brain Science Conference.
(http://bsrc.kaist.ac.kr/braintech/image/reports/1-year/HG0204A02/HG0204A02-02.html)

To enable stable walking of a biped robot, a target trajectory is needed: simulation modeling (trajectory generation).

Page 36: Research Results & Plans


Robot Model Simulator

- 12 DoF (degrees of freedom): 2 in each ankle, 1 in each knee, 3 in each hip
- Trajectory generation & 3-D simulation (OpenGL)

[Figure: robot model showing the trunk, hip, knee, and ankle joints.]

Page 37: Research Results & Plans


Page 38: Research Results & Plans


Implementation of Biped Robots

Supervised senior graduation project (1st semester, 2000)
- Structure: leg robot, height 25 cm
- DOF: 10 (ankle 2 (*2), knee 1 (*2), hip 2 (*2))
- Actuator: servo motor (HS-615)
- Motor controller: 80C196KC (*2)
- Motion: PC, serial interface
- Walking: static walking

Page 39: Research Results & Plans


Brain Science Project (2000-2001)
- Structure: leg robot, height 50 cm
- DOF: 10 (ankle 2 (*2), knee 1 (*2), hip 2 (*2))
- Actuator: DC micro motor (Minimotor 2224)
- Motor controller: PC-based control
- Interface: PCI-8136 board
- PC OS: QNX
- Controller programming: PICARD
- Walking: static walking

Page 40: Research Results & Plans


Page 41: Research Results & Plans

Work at Samsung

Page 42: Research Results & Plans


MagicGate: Linux-based Home Gateway

- MG = set-top box + Internet gateway + home server
- Development of a Linux IEEE 1394 device driver
- Development of a UPnP bundle for OSGi (Java)
- Home robot (unified remote control + surveillance camera)

[Network diagram: MagicGate gateways MG1 (PVR/DVD) and MG2, TVs, a PC, a 1394 camera, and the home robot (unified remote control) connected over Ethernet / WLAN / PLC / serial links; controlled devices include lights, a gas valve, windows, etc.; broadcast input from terrestrial / cable / satellite; Internet access via xDSL / FTTH (ONU, OLT, switch) to VOD, EOD, portal, and e-mail servers.]

Page 43: Research Results & Plans


3GPP (WCDMA) Standardization

- 3GPP (3rd Generation Partnership Project): the standardization body for WCDMA (the European wireless communication network)
- Apr. 2004 - Aug. 2006: served as a 3GPP RAN WG3 main delegate for Samsung
- Main research areas: MBMS (Multimedia Broadcast & Multicast Service), EUDCH (Enhanced Uplink Dedicated Channel), LTE (Long-Term Evolution)
- Output: 60+ submitted patents, 30+ contributions

[Diagram: UTRAN architecture, with RNSs (each an RNC plus Node Bs) connected to the core network over Iu, to each other over Iur, and to their Node Bs over Iub.]

Page 44: Research Results & Plans

Current Works (2006~2007)

Page 45: Research Results & Plans


L1-Biased Discriminant Analysis

- BDA is better suited to detection and verification problems (positive / negative problems): the Gaussian distribution assumption is placed only on the positive examples.
- We have recently developed L1-BDA, which uses the L1 measure instead of the L2 measure and is therefore robust to outliers (see the small illustration below).
- Applications: image registration, face detection, ...

ETH80 database
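
A tiny numeric illustration (mine, not from the slide) of why an L1 criterion resists outliers better than L2: the L1-optimal center of a set of points is the median, which barely moves when a gross outlier is added, whereas the L2-optimal center is the mean, which shifts substantially.

```python
# Mean (L2-optimal center) vs median (L1-optimal center) under one outlier.
import numpy as np

x = np.array([1.0, 1.1, 0.9, 1.05, 0.95])
x_out = np.append(x, 10.0)                 # one gross outlier

print(np.mean(x), np.mean(x_out))          # L2 center: 1.0 -> 2.5
print(np.median(x), np.median(x_out))      # L1 center: 1.0 -> 1.025
```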

Page 46: Research Results & Plans


Eye detection & localization

- A basic tool for face detection in arbitrary images.
- Can also be utilized for pose estimation (e.g. together with nose or mouth detection).
- Real-time automatic eye detection system using Haar-like features.

[Figures: 7-level image pyramid (on the FERET database); positive and negative training examples; features extracted by AdaBoost.]
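
A hedged sketch using OpenCV's stock Haar cascade for eyes as a stand-in for the custom AdaBoost/Haar detector described above; it assumes the opencv-python distribution (which ships the cascade files) and a placeholder image path.

```python
# Eye detection with a pre-trained Haar cascade (stand-in detector).
import cv2

eye_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_eye.xml")

gray = cv2.imread("face.jpg", cv2.IMREAD_GRAYSCALE)          # hypothetical input
# the detector scans a scale pyramid over the image (scaleFactor controls the steps)
eyes = eye_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
for (x, y, w, h) in eyes:
    print("eye at", (x, y), "size", (w, h))
```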

Page 47: Research Results & Plans


Pose estimation

Previous work: illumination compensation.

Excerpt from "Shadow compensation in 2D images for face recognition", Choi et al., Pattern Recognition, accepted for publication, 2007.

Future work: pose estimation combined with illumination compensation; better classification performance is expected.

Page 48: Research Results & Plans

Future Works (2007 ~ )

Page 49: Research Results & Plans


Enhancements to Linear Feature Extractors

- Weighted samples: each sample is treated differently, e.g. weighted by its posterior class probability (a small sketch follows).
- Suppression of outliers, e.g. by adapting shaping functions for the target.
- Ensemble feature extractors via bagging / boosting techniques.
- Different assumptions on the input distribution for different classes.

Work will start soon.
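
As an illustration of the "weighted samples" item (my own minimal formulation, not a published method from this plan): a PCA-style linear extractor in which each sample's contribution to the mean and covariance is scaled by a weight such as its posterior class probability.

```python
# Weighted-sample PCA: samples contribute to the covariance according to weights.
import numpy as np

def weighted_pca(X, weights, n_components):
    """X: (n, d) data; weights: (n,) non-negative sample weights."""
    w = weights / weights.sum()
    mean = w @ X                                   # weighted mean
    Xc = X - mean
    cov = (Xc * w[:, None]).T @ Xc                 # weighted covariance
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1][:n_components]
    return eigvecs[:, order]                       # projection matrix (d, M)
```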

Page 50: Research Results & Plans


(Full) Face Recognition System (I)

Pipeline: Face detection → Feature extraction → Classification → Face recognition

Page 51: Research Results & Plans


(Full) Face Recognition System (II)

Pipeline: Image acquisition → Face detection → Down-sampling → PCA (or DCT / wavelet) → ICA-FX (or PWFX), mapping inputs x1, ..., xN to features f1, ..., fM → 1-NN classification (alternatives: Mahalanobis distance, RBF / MLP, HMM / CRF) → Face recognition
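
A hedged scikit-learn sketch of the pipeline's shape: PCA, then a further linear extractor, then 1-NN. FastICA is only a stand-in here, since ICA-FX and PWFX are the supervised extractors from this work and are not available in scikit-learn; the Olivetti faces bundled with scikit-learn (the AT&T/ORL images, already down-sampled) stand in for the acquisition and down-sampling steps.

```python
# PCA -> (stand-in) ICA -> 1-NN on the AT&T/ORL-style Olivetti faces.
from sklearn.datasets import fetch_olivetti_faces
from sklearn.decomposition import PCA, FastICA
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import train_test_split

faces = fetch_olivetti_faces()                    # 400 images, 40 persons
Xtr, Xte, ytr, yte = train_test_split(
    faces.data, faces.target, test_size=0.25, random_state=0)

pipe = make_pipeline(PCA(n_components=60),
                     FastICA(n_components=20, random_state=0),
                     KNeighborsClassifier(n_neighbors=1))
pipe.fit(Xtr, ytr)
print("1-NN accuracy:", pipe.score(Xte, yte))
```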

Page 52: Research Results & Plans


Intelligent Robot System

- Collect data: various inputs from heterogeneous sensors & cameras; communication with other electronic devices.
- Choose features: select or extract appropriate features for the decision (MIFS-U, PWFS, PWFX, ICA-FX, ...).
- Choose model / train system / evaluate system: the decision system can be any classifier / regressor (MLP, RBF, decision tree, ...).

Expected effects:
- Surveillance via the robot
- Enhanced safety (reduced accident rates)
- Reduced cost of living (home automation)