
FERA2011: The First Facial Expression Recognition and Analysis Challenge

FG’11, March 2011
Michel Valstar, Marc Méhu, Marcello Mortillaro, Maja Pantic, Klaus Scherer

Participation overview

• Data downloaded by 20 teams
• 15 submissions
• 11 accepted papers
• 13 teams in Emotion Sub-Challenge
• 5 teams in AU Sub-Challenge
• Institutes from 6 countries
• 53 researchers, median of 6 per paper
• 5 entries were multi-institute endeavours


Trends


Machine Learning trends:
• 13/15 teams used SVM
• Three teams used multiple kernel SVMs, including the AU winner (see the sketch after these lists)
• Only 1 team modelled time
• Only 1 team used probabilistic graphical models

Feature trends:
• 4 teams encode appearance dynamics
• 4 teams use both appearance and geometric features (including AU winners)
• Only 1 team infers 3D, but appears successful! (AU winner)
• Only 1 team uses geometric features only, ranked 11th
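Since multiple-kernel SVMs feature in three entries, including the AU winner, here is a minimal sketch of the underlying idea: training an SVM on a combination of kernels. This uses a fixed, equal-weight sum of two precomputed kernels, whereas full multiple-kernel learning would learn the weights; scikit-learn, the RBF/linear kernel pairing, and the weights are all illustrative assumptions, not details taken from any entry.

```python
# Hedged sketch of the multiple-kernel SVM idea: an SVM trained on a
# weighted sum of kernels. Real MKL learns the weights; here they are
# fixed at 0.5 each purely for illustration. Data is synthetic.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

rng = np.random.default_rng(0)
X_train = rng.normal(size=(80, 16))    # e.g. appearance features per frame
y_train = rng.integers(0, 2, size=80)  # binary AU-active label
X_test = rng.normal(size=(20, 16))

# Combined kernel: equal-weight sum of an RBF and a linear kernel.
K_train = 0.5 * rbf_kernel(X_train) + 0.5 * linear_kernel(X_train)
K_test = 0.5 * rbf_kernel(X_test, X_train) + 0.5 * linear_kernel(X_test, X_train)

clf = SVC(kernel="precomputed").fit(K_train, y_train)
predictions = clf.predict(K_test)  # one AU decision per test sample
```

The precomputed-kernel route is what makes kernel combination easy to express; swapping the fixed 0.5 weights for learned ones is the step that distinguishes true multiple-kernel learning.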

Baseline System – LBP-based Expression Recognition


• Face is registered using detected eyes.
• Uniform Local Binary Pattern (LBP) features are computed at every pixel.
• The face is divided into 10×10 blocks. In each block a 256-bin histogram of the LBP features is generated.
• For every AU a GentleBoost-SVM is learned. Upper-face AUs use the concatenated histograms of the top five rows of blocks, lower-face AUs the bottom five rows.
• For every emotion a GentleBoost-SVM is learned using all rows. SVM predictions are per frame; the clip-level decision is made by voting. (Sketches of both stages follow below.)
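To make the feature step concrete, below is a minimal sketch of block-wise LBP histograms, assuming scikit-image and an already-registered 100×100 grayscale face crop. The 10×10 grid and 256-bin histograms follow the slide; the crop size, LBP radius, and normalisation are assumptions (scikit-image's uniform-pattern variants would yield fewer bins, so the basic 8-neighbour code is used here to match the slide's 256-bin count).

```python
# Hedged sketch of block-wise LBP histograms for a registered face crop.
# Grid (10x10) and bin count (256) follow the slide; everything else is
# an illustrative assumption.
import numpy as np
from skimage.feature import local_binary_pattern

def lbp_block_histograms(face, grid=(10, 10), bins=256):
    """Return one normalised LBP histogram per block, row by row."""
    lbp = local_binary_pattern(face, P=8, R=1, method="default")  # codes 0..255
    bh, bw = face.shape[0] // grid[0], face.shape[1] // grid[1]
    hists = []
    for r in range(grid[0]):
        for c in range(grid[1]):
            block = lbp[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw]
            hist, _ = np.histogram(block, bins=bins, range=(0, bins))
            hists.append(hist / max(hist.sum(), 1))
    return np.array(hists)  # shape: (blocks, bins) = (100, 256)

face = np.random.randint(0, 256, (100, 100)).astype(np.uint8)  # stand-in crop
feats = lbp_block_histograms(face)
upper = feats[:50].ravel()  # top five rows of blocks, for upper-face AUs
lower = feats[50:].ravel()  # bottom five rows, for lower-face AUs
```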

Local Binary Pattern appearance descriptors are applied to the face region to detect AUs and discrete emotions.
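A companion sketch of the decision stage: one binary classifier per emotion, per-frame predictions, then a clip-level majority vote. scikit-learn has no GentleBoost, so a linear SVM stands in for the slide's GentleBoost-SVM; the feature dimensionality is reduced and the data synthetic, purely for illustration.

```python
# Hedged sketch of the decision stage: per-frame predictions from one
# one-vs-rest classifier per emotion, then a clip-level vote. LinearSVC
# is a stand-in for GentleBoost-SVM (GentleBoost is not in scikit-learn).
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_emotions, dim = 5, 512               # dim shrunk from 100 blocks x 256 bins
X_train = rng.normal(size=(200, dim))  # per-frame feature vectors
y_train = rng.integers(0, n_emotions, size=200)

# One binary detector per emotion, as described on the slide.
detectors = [LinearSVC(C=1.0).fit(X_train, (y_train == e).astype(int))
             for e in range(n_emotions)]

clip = rng.normal(size=(40, dim))      # 40 frames of one test video
frame_votes = np.array([[d.predict(frame[None])[0] for d in detectors]
                        for frame in clip])

# Clip label: the emotion whose detector fires on the most frames.
clip_label = int(frame_votes.sum(axis=0).argmax())
```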

Baseline Overview (LAUD)


B. Jiang, M.F. Valstar, and M. Pantic, “Action Unit detection using sparse appearance descriptors in space-time video volumes”, FG’11

Winner of the Emotion Detection sub-challenge

3. Karlsruhe Institute of Technology
Tobias Gehrig, Hazim Ekenel


2. UIUC-UMC
Usman Tariq, Xi Zhou, Kai-Hsiang Lin, Zhen Li, Zhaowen Wang, Vuong Le, Thomas Huang, Tony Han, Xutao Lv

1. University of California, Riverside
Songfan Yang, Bir Bhanu

Ranking – Emotion Sub-challenge


Person independent/specific emotion detection


Emotion secondary test results


Winner of the Action Unit Detection sub-challenge

3. Karlsruhe Institute of Technology
Tobias Gehrig, Hazim Ekenel


2. University of California San Diego
Nicholas Butko, Javier Movellan, Tingfan Wu, Paul Ruvolo, Jacob Whitehill, Marian Bartlett

1. University of French West Indies & Guyana
Lionel Prevost, Thibaud Senechal, Vincent Rapp, Hanan Salam, Renaud Seguier, Kevin Bailly

Ranking – Action Unit Sub-challenge

[Bar chart: F1-measure per team in the AU sub-challenge. Teams: Baseline, MIT-Cambridge, Chew, KIT, UCSD, IRIS; x-axis: F1-Measure, 0 to 0.7.]
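For reference, the F1-measure that ranks these entries is the harmonic mean of precision and recall over per-frame binary AU labels, F1 = 2PR/(P+R). A minimal computation with made-up labels:

```python
# F1 = 2PR / (P + R) over per-frame binary AU labels; the labels below
# are made up purely to show the computation.
from sklearn.metrics import f1_score

truth = [1, 1, 0, 0, 1, 0, 1, 0]  # ground truth: AU active per frame
pred  = [1, 0, 0, 1, 1, 0, 1, 0]  # detector output per frame
print(f1_score(truth, pred))      # P = R = 3/4, so F1 = 0.75
```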


Person independent/specific AU detection

[Bar chart: person-specific vs person-independent F1 per team. Teams: Baseline, MIT-Cambridge, U. Brisbane, KIT, UCSD, IRIS; x-axis: 0 to 0.7; legend: Specific, Independent.]


Conclusion and new goals

Conclusions:
• Person-dependent discrete emotion detection is highly successful
• Dynamic appearance is very successful
• Combined appearance/geometric approaches seem to be the way forward
• AU detection is far from solved


New avenues:
• Given the high success of discrete emotion detection, dimensional affect may be a new goal to pursue
• Explicitly detecting temporal segments of facial expressions
• Analyse sensitivity of approaches to AU intensities
• Leverage person-specific approaches for AU detection
• Detection of AU intensity levels