Dynamic and static hand gesture recognition in computer vision



Andrzej Czyżewski, Bożena Kostek, Piotr Odya, Bartosz Kunka, Michał Lech

Gdansk University of Technology, Faculty of Electronics, Telecommunications and Informatics, Multimedia Systems Dept.

Warsaw, 13.08.2014

Presentation outline

1. Developed gesture recognition system

2. Background / foreground segmentation

3. Recognizing dynamic hand gestures

4. Recognizing static hand gestures

5. Efficiency

6. Video presentations

Presentation outline

1. Developed gesture recognition system

2. Background / foreground segmentation

3. Recognizing dynamic hand gestures

4. Recognizing static hand gestures

5. Efficiency

6. Video presentations

Developed Gesture recognition system (1)

• Features of the gesture recognition system
  • Recognizing static (palm shape) and dynamic gestures (motion trajectory) of one or both hands
  • The same dynamic gesture can have various meanings depending on the static gesture
  • No data gloves, accelerometers or infrared emitters / sensors are needed

• System components
  • PC
  • Webcam (RGB)
  • Multimedia projector
  • Screen for the projected image

• A user stands between the projection screen and the multimedia projector

Developed Gesture recognition system (2)

• Gesture dictionary

[Figure: the 14 gestures of the dictionary, numbered 1–14]

Developed Gesture recognition system (3)

• The system works with the developed applications
  • Virtual Whiteboard – an alternative solution to electronic whiteboards
  • Gesture-based sound mixing system – a new method of sound mixing that immerses the engineer more in the sound

Developed Gesture recognition system (4)

Presentation outline

1. Developed gesture recognition system

2. Background / foreground segmentation

3. Recognizing dynamic hand gestures

4. Recognizing static hand gestures

5. Efficiency

6. Video presentations

Background / foreground segmentation (1)

• The most crucial part of RGB vision-based systems with respect to gesture recognition efficacy
  • influences the representation of the hand shape in the image
  • influences the degree of noise in the image – false positive detections

• Two possible scenarios regarding camera placement

• front-faced camera placement

• back-faced camera placement (environment employing multimedia projector)

Background / foreground segmentation (2)

• Front-faced camera placement
  • Varying background behind the user
  • Free user movements
  • Influence of lighting changes

Background / foreground segmentation (3)

• Back-faced camera placement
  • User not visible directly in the image
  • Background is relatively stable
  • Influence of lighting changes
  • Distortions in the image introduced by:
    • camera and projector lenses
    • the impact of lighting on the displayed image color

Background / foreground segmentation (4)

• The simplest background subtraction
  • Principle:
    • calculate a reference (background) image
    • subtract each new frame from the reference image
    • threshold the difference
  • The difference image is noisy and very susceptible to lighting changes

• A more practical approach is to calculate a time-averaged background image, updated as shown below

Background / foreground segmentation (5)

$M_t(x, y) = (1 - \alpha)\,M_{t-1}(x, y) + \alpha\,I_t(x, y)$

where $M_t$ is the background model, $I_t$ the current frame and $\alpha$ the adaptation rate.
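A minimal sketch of this time-averaged background model using OpenCV's accumulateWeighted; the webcam index, adaptation rate and difference threshold are illustrative choices, not values taken from the slides.

```python
import cv2
import numpy as np

ALPHA = 0.02          # adaptation rate: higher = faster background update (assumed value)
DIFF_THRESHOLD = 30   # pixel difference treated as foreground (assumed value)

cap = cv2.VideoCapture(0)
ok, frame = cap.read()
background = frame.astype(np.float32)        # M_0 initialised with the first frame

while ok:
    # M_t = (1 - alpha) * M_{t-1} + alpha * I_t  (running average)
    cv2.accumulateWeighted(frame, background, ALPHA)
    diff = cv2.absdiff(frame, cv2.convertScaleAbs(background))
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, DIFF_THRESHOLD, 255, cv2.THRESH_BINARY)
    cv2.imshow("foreground mask", mask)
    if cv2.waitKey(1) == 27:                 # Esc quits
        break
    ok, frame = cap.read()

cap.release()
cv2.destroyAllWindows()
```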

• Background modelling
  • Considers background changes and adaptation
  • Typical methods:
    • Codebook
      • includes periodical changes in the model
      • no adaptation
    • GMM
      • adaptation to background changes
    • Skin color modelling
      • relatively independent of background changes
      • unreliable when the background color is similar to skin color
      • influence of lighting on skin color
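As a sketch of the GMM approach listed above, OpenCV's MOG2 background subtractor (one common GMM-based implementation, not necessarily the one used by the authors) can be applied frame by frame; the parameter values are illustrative only.

```python
import cv2

# GMM-based background subtraction; history and varThreshold are assumptions
subtractor = cv2.createBackgroundSubtractorMOG2(history=500,
                                                varThreshold=16,
                                                detectShadows=False)

cap = cv2.VideoCapture(0)
while True:
    ok, frame = cap.read()
    if not ok:
        break
    mask = subtractor.apply(frame)      # adapts to gradual background changes
    mask = cv2.medianBlur(mask, 5)      # suppress isolated false positives
    cv2.imshow("GMM foreground", mask)
    if cv2.waitKey(1) == 27:
        break
cap.release()
cv2.destroyAllWindows()
```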

Background / foreground segmentation (6)

• Background / foreground segmentation in the developed gesture recognition system (camera – projector configuration)

• The principle involves computing the absolute difference between the processed image captured by the camera and the original image displayed by the multimedia projector

Background / foreground segmentation (7)

[Figure: processed camera frame, displayed image and resulting image – a) perspective-corrected camera image; b), e) image displayed by the projector; c) difference of a and b after conversion to gray scale, thresholding and median filtering; d) perspective-corrected and color-calibrated camera image; f) difference of d and e after conversion to gray scale, thresholding and median filtering]

Background / foreground segmentation (8)

[Figure: processing chain – camera image → perspective corrected image → color calibrated cropped image → absolute difference with the image displayed by the projector → conversion to gray scale → binary thresholding → median filtering]
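A condensed sketch of this chain in Python with OpenCV; the homography, threshold and median kernel size are assumptions, and the authors' color calibration step is omitted here.

```python
import cv2
import numpy as np

def segment_hands(camera_frame, displayed_image, homography,
                  threshold=40, median_ksize=5):
    """Sketch of the camera-projector segmentation chain described above.
    The homography, threshold and kernel size are assumptions; the authors'
    color-calibration step is not reproduced."""
    h, w = displayed_image.shape[:2]
    # 1. Perspective correction of the camera view onto the projected image plane
    corrected = cv2.warpPerspective(camera_frame, homography, (w, h))
    # 2. Absolute difference between what is displayed and what the camera sees
    diff = cv2.absdiff(corrected, displayed_image)
    # 3. Grayscale conversion, binary thresholding and median filtering
    gray = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, threshold, 255, cv2.THRESH_BINARY)
    return cv2.medianBlur(binary, median_ksize)
```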

Background / foreground segmentation (9)

Presentation outline

1. Developed gesture recognition system

2. Background / foreground segmentation

3. Recognizing dynamic hand gestures

4. Recognizing static hand gestures

5. Efficiency

6. Video presentations

Recognizing dynamic hand gestures (1)

• Motion modelling based on 2 successive motion vectors
• A single motion vector is determined from the points localizing the hand in frames n and n + c (c is a function of the frame rate; for 22 FPS it equals 3)

The motion vector between the hand positions in frames $j$ and $i$ is $\vec{u}_{ij} = [x_i - x_j,\; y_i - y_j]$; its length related to the elapsed time gives the velocity in px/s. The direction angle is

$\alpha_{ij} = \frac{180}{\pi}\arccos\!\left(\frac{u^{x}_{ij}}{|\vec{u}_{ij}|}\right)$, taken as $360^{\circ} - \alpha_{ij}$ when $u^{y}_{ij} < 0$

[Figure: two successive motion vectors $\vec{u}_{10}$ and $\vec{u}_{21}$ spanning the hand positions $(x_0, y_0)$ to $(x_2, y_2)$, and $\vec{u}_{32}$, $\vec{u}_{43}$ spanning $(x_2, y_2)$ to $(x_4, y_4)$, labelled with the directions "North East" and "East" in the $x'y'$ plane]
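A small sketch of how such a motion descriptor can be computed from two tracked hand positions; the function name and the example coordinates are illustrative, and dt corresponds to c / frame rate (3/22 s for the configuration above).

```python
import math

def motion_descriptor(p_prev, p_curr, dt):
    """Velocity [px/s] and direction angle [deg] of one motion vector
    between two hand positions separated by dt seconds."""
    ux = p_curr[0] - p_prev[0]
    uy = p_curr[1] - p_prev[1]
    length = math.hypot(ux, uy)
    velocity = length / dt
    if length == 0:
        return velocity, None                 # no movement, direction undefined
    angle = math.degrees(math.acos(ux / length))
    if uy < 0:                                # mirror into the 0..360 deg range
        angle = 360.0 - angle
    return velocity, angle

# Example: hand moved from (100, 120) to (130, 90) between analysed frames
v, phi = motion_descriptor((100, 120), (130, 90), dt=3 / 22)
```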

• The velocity and direction of the motion are analysed using a fuzzy rule-based system
  • 8 linguistic variables: the direction and velocity of the left and right hand in two consecutive analysis steps (directionLt0, directionLt1, directionRt0, directionRt1, velocityLt0, velocityLt1, velocityRt0, velocityRt1)
  • A zero-order Sugeno inference model with singletons denoting gesture classes is suitable for dynamic gesture recognition
  • 30 fuzzy rules

• Exemplary rule:

// beginning phase of hand movement in the left direction (for semi-circular motion) for the left hand
RULE 1 : IF directionLt0 IS north AND directionLt1 IS west AND velocityLt0 IS NOT small AND velocityLt1 IS NOT small AND velocityRt0 IS vsmall AND velocityRt1 IS vsmall THEN gesture IS g1;

Recognizing dynamic hand gestures (2)

• The outputs of the fuzzy rules are filtered with a threshold equal to 0.5; below this value the motion activity is not associated with any of the defined gestures
• The output of the system is the maximum of all rule outputs
• Triangular membership functions are used in the fuzzification of all variables
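A minimal sketch of this zero-order Sugeno inference written directly in Python instead of FCL; the membership-function breakpoints and the single example rule are illustrative, not the authors' 30-rule base.

```python
def tri(x, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

# Illustrative fuzzy sets (breakpoints are assumptions)
def mu_north(angle):  return tri(angle, 45.0, 90.0, 135.0)
def mu_west(angle):   return tri(angle, 135.0, 180.0, 225.0)
def mu_small(v):      return tri(v, 0.0, 50.0, 150.0)
def mu_vsmall(v):     return tri(v, -1.0, 0.0, 40.0)

def infer(dir_l0, dir_l1, vel_l0, vel_l1, vel_r0, vel_r1):
    # AND = min, NOT = 1 - mu; each rule fires a gesture-class singleton
    rules = {
        "g1": min(mu_north(dir_l0), mu_west(dir_l1),
                  1 - mu_small(vel_l0), 1 - mu_small(vel_l1),
                  mu_vsmall(vel_r0), mu_vsmall(vel_r1)),
        # ... the remaining rules of the 30-rule base would go here ...
    }
    gesture, strength = max(rules.items(), key=lambda kv: kv[1])
    # Activations below 0.5 are not associated with any defined gesture
    return gesture if strength >= 0.5 else None
```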

Recognizing dynamic hand gestures (3)

• Description of the fuzzy inference module in FCL (Fuzzy Control Language)

// beginning phase of left hand motion in the right direction
RULE 8 : IF directionLt0 IS North AND directionLt1 IS East AND velocityLt0 IS NOT small AND velocityLt1 IS NOT small AND velocityRt0 IS vsmall AND velocityRt1 IS vsmall THEN gesture IS g2;

// middle phase of left hand motion in the right direction
RULE 9 : IF directionLt0 IS East AND directionLt1 IS East AND velocityLt0 IS NOT small AND velocityLt1 IS NOT small AND velocityRt0 IS vsmall AND velocityRt1 IS vsmall THEN gesture IS g2;

// ending phase of left hand motion in the right direction
RULE 10 : IF directionLt0 IS East AND directionLt1 IS South AND velocityLt0 IS NOT small AND velocityLt1 IS NOT small AND velocityRt0 IS vsmall AND velocityRt1 IS vsmall THEN gesture IS g2;

Recognizing dynamic hand gestures (4)

• Hand tracking supported by Kalman filters

The predicted hand position follows a constant-velocity model with the state vector $\hat{s}_t = [x_t,\; y_t,\; \dot{x}_t,\; \dot{y}_t]^{T}$:

$x_{t|t-1} = x_{t-1|t-1} + \frac{n_c}{f_{FR}}\,\dot{x}_{t-1|t-1}, \qquad y_{t|t-1} = y_{t-1|t-1} + \frac{n_c}{f_{FR}}\,\dot{y}_{t-1|t-1}$

where $n_c$ is the number of frames between analysed hand positions and $f_{FR}$ is the frame rate.

Recognizing dynamic hand gestures (5)

The state transition is

$\hat{s}_{t|t-1} = F\,\hat{s}_{t-1|t-1} + w_{t-1}, \qquad F = \begin{bmatrix} 1 & 0 & dt & 0 \\ 0 & 1 & 0 & dt \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}, \qquad dt = \frac{n_c}{f_{FR}}$

with the velocity components obtained from the measured speed and direction angle: $\dot{x}_t = v_t \sin\alpha_t$, $\dot{y}_t = v_t \cos\alpha_t$.
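A sketch of a constant-velocity Kalman tracker for one hand using OpenCV's cv2.KalmanFilter, with dt = n_c / f_FR as above; the noise covariances are guesses, not the authors' settings.

```python
import numpy as np
import cv2

dt = 3 / 22.0                             # n_c / f_FR for the configuration above
kf = cv2.KalmanFilter(4, 2)               # state [x, y, vx, vy], measurement [x, y]
kf.transitionMatrix = np.array([[1, 0, dt, 0],
                                [0, 1, 0, dt],
                                [0, 0, 1, 0],
                                [0, 0, 0, 1]], dtype=np.float32)
kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                 [0, 1, 0, 0]], dtype=np.float32)
kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2      # assumed noise levels
kf.measurementNoiseCov = np.eye(2, dtype=np.float32) * 1e-1

def track(measured_xy):
    """Predict the hand position, then correct with the detected centroid."""
    prediction = kf.predict()
    kf.correct(np.array(measured_xy, dtype=np.float32).reshape(2, 1))
    return float(prediction[0, 0]), float(prediction[1, 0])
```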

• Examining Kalman filters applied to trajectory smoothing

Recognizing dynamic hand gestures (6)

[Figure: visualization of motion trajectories obtained for the system with Kalman filters (darker line) and the system without Kalman filters (brighter line)]

Presentation outline

1. Developed gesture recognition system

2. Background / foreground segmentation

3. Recognizing dynamic hand gestures

4. Recognizing static hand gestures

5. Efficiency

6. Video presentations

• Hand shape parameterized using the PGH method (Pairwise Geometrical Histograms)

Recognizing static hand gestures (1)

[Figure: creating a Pairwise Geometrical Histogram – a) calculating distances and angles between segments designated on the object contour; b) the two-dimensional PGH (Bradski, 2008)]

[Figure: representing the hand shape using its PGH]
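A simplified, hand-rolled sketch of a PGH-like descriptor computed from a hand contour (for every pair of contour edge segments, the angle between them and the distance between their midpoints are binned into a 2D histogram); the bin counts and maximum distance are assumptions, and this is not the exact library routine used by the authors.

```python
import math
import numpy as np

def pairwise_geometrical_histogram(contour, angle_bins=30, dist_bins=30,
                                   max_dist=200.0):
    """PGH-like 2D histogram over (angle, distance) of contour segment pairs."""
    pts = contour.reshape(-1, 2).astype(np.float64)
    segments = [(pts[i], pts[(i + 1) % len(pts)]) for i in range(len(pts))]
    hist = np.zeros((angle_bins, dist_bins), dtype=np.float64)
    for i, (a0, a1) in enumerate(segments):
        va = a1 - a0
        for b0, b1 in segments[i + 1:]:
            vb = b1 - b0
            denom = np.linalg.norm(va) * np.linalg.norm(vb)
            if denom == 0:
                continue
            cosang = np.clip(np.dot(va, vb) / denom, -1.0, 1.0)
            angle = math.degrees(math.acos(abs(cosang)))        # 0..90 deg
            dist = np.linalg.norm((a0 + a1) / 2 - (b0 + b1) / 2)
            ai = min(int(angle / 90.0 * angle_bins), angle_bins - 1)
            di = min(int(dist / max_dist * dist_bins), dist_bins - 1)
            hist[ai, di] += 1
    total = hist.sum()
    return (hist / total).ravel() if total else hist.ravel()   # feature vector
```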

• To provide reliable gesture recognition it is essential to choose the optimal classifier
  • experiments carried out using the WEKA application:
    • Random Tree
    • C4.5 (J48)
    • Naive Bayes Net
    • NNge
    • Random Forest
    • Artificial Neural Network
    • Support Vector Machines

Recognizing static hand gestures (2)

Recognizing static hand gestures (3)

Classifier                  E [%]   tT [ms]   tK [ms]   Parameters
Random Tree                 77.04       443         3   k = 2^6, m = 2^-17
C4.5 (J48)                  77.73      1342         4   C = 2^-7, m = 2
Naive Bayes Net             79.49       303        73   supervised discretization
NNge                        83.47     14234      8073   g = 2^2, i = 2^4
Random Forest               89.91     59644       722   i = 2^9, k = 2^4, unlimited depth
Artificial Neural Network   91.67      1458       187   l = 2^-3, m = 2^-5, e = 2^3, one hidden layer, 4 nodes
SVM (LibSVM)                92.82      2508      1159   γ = 2^-11, C = 2^11, RBF kernel

The results of the classifier examination
E – recognition efficacy, tT – average training time, tK – average validation time

• The SVM classifier of type C-SVC (C-Support Vector Classification) with an RBF kernel can be considered optimal
  • the highest efficacy (SVM: 92.82%, ANN: 91.67%)
  • lack of the generalization effect typical of the ANN classifier

Recognizing static hand gestures (4)

The C-SVC optimization problem is

$\min_{w, b, \xi}\; \frac{1}{2} w^{T} w + C \sum_{i=1}^{l} \xi_i$

subject to $y_i \left( w^{T} \phi(x_i) + b \right) \geq 1 - \xi_i,\; \xi_i \geq 0,\; i = 1, \ldots, l$

with the RBF kernel $K(x_i, x_j) = e^{-\gamma \lVert x_i - x_j \rVert^{2}},\; \gamma > 0$.
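A sketch of training such a C-SVC with an RBF kernel on PGH feature vectors using scikit-learn (whose SVC wraps LibSVM); the feature and label files are hypothetical placeholders, the 10-fold cross-validation is an assumption, and gamma / C correspond to the grid values reported in the table above.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

X = np.load("pgh_features.npy")     # hypothetical file: one PGH histogram per image
y = np.load("gesture_labels.npy")   # hypothetical file: static gesture class ids

clf = SVC(kernel="rbf", gamma=2 ** -11, C=2 ** 11)     # C-SVC with RBF kernel
scores = cross_val_score(clf, X, y, cv=10)             # 10-fold CV (an assumption)
print("mean accuracy: %.2f %%" % (100 * scores.mean()))
clf.fit(X, y)                                          # final model on all data
```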

Presentation outline

1. Developed gesture recognition system

2. Background / foreground segmentation

3. Recognizing dynamic hand gestures

4. Recognizing static hand gestures

5. Efficiency

6. Video presentations

• Computer parameters:
  • Intel Core 2 Duo P7350 2.0 GHz
  • 400 MHz DDR2 RAM, 6:6:6:18 cycle latency
  • Windows Vista Business 32-bit

• Screen resolution: 1024 x 768 px
• Processing frames of size 320 x 240 px

Efficiency (1)

• Averaged execution times of the most time-consuming operations over 1000 iterations
• Obtained average frame rate: ~22 FPS

Operation                                                                   Execution time [ms]
Capturing the image displayed by the projector (CreateCompatibleBitmap)     8.19
Median filtering (cvSmooth)                                                 3.28
Perspective correction (cvWarpPerspective)                                  6.55
Color calibration (author's method)                                         3.28
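A small sketch of how such per-operation averages over 1000 iterations can be measured; it uses Python's perf_counter and a synthetic 320 x 240 px frame, and is not the authors' benchmarking code.

```python
import time
import cv2
import numpy as np

# Synthetic test frame matching the 320 x 240 px processing resolution
frame = np.random.randint(0, 256, (240, 320, 3), dtype=np.uint8)

def average_ms(fn, iterations=1000):
    """Average execution time of fn in milliseconds over the given iterations."""
    start = time.perf_counter()
    for _ in range(iterations):
        fn()
    return (time.perf_counter() - start) * 1000.0 / iterations

print("median filtering: %.2f ms" % average_ms(lambda: cv2.medianBlur(frame, 5)))
```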

Efficiency (2)

Presentation outline

1. Developed gesture recognition system

2. Background / foreground segmentation

3. Recognizing dynamic hand gestures

4. Recognizing static hand gestures

5. Efficiency

6. Video presentations

[Video: Virtual Whiteboard]

[Video: Gesture Mixer]

Thank you for your attention.