Journée Inter-GDR ISIS et Robotique: Human-Robot Interaction


Model of expressive gestures for the humanoid robot NAO

Quoc Anh Le, Telecom ParisTech

Catherine Pelachaud

CNRS, Telecom ParisTech

NAO robot

• An autonomous, programmable, medium-sized humanoid robot (57 cm)
• Made by a French company (Aldebaran Robotics, Paris)
• 25 degrees of freedom
• 2 loudspeakers to speak or play sound files
• Controllable eye colors (LEDs)
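As a concrete illustration, here is a minimal Python sketch of driving the loudspeakers and eye LEDs through the NAOqi SDK; ALTextToSpeech and ALLeds are real NAOqi services, but the robot address and the spoken text are placeholders.

```python
# A minimal sketch of driving the loudspeakers and eye LEDs with the NAOqi
# Python SDK. ALTextToSpeech and ALLeds are real NAOqi services; the robot
# address and the spoken text are placeholders.
from naoqi import ALProxy

ROBOT_IP = "192.168.1.10"  # placeholder address of the robot

tts = ALProxy("ALTextToSpeech", ROBOT_IP, 9559)
leds = ALProxy("ALLeds", ROBOT_IP, 9559)

leds.fadeRGB("FaceLeds", 0x0000FF, 1.0)  # fade the eye LEDs to blue in 1 s
tts.say("Once upon a time...")           # speak through the loudspeakers
```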


Introduction

• This work is part of the French national project ANR GVLEX – Partners: Aldebaran (coordinator), Acapela, LIMSI and Telecom-ParisTech.

• Objective: Have the humanoid robot Nao read a story agreeably through expressive verbal and non-verbal behaviors.

• Methodology: Select and compute behaviors based on information extracted from the story: its structure, its various semantic and pragmatic elements, as well as its emotional content.

• Ongoing work at Telecom-ParisTech: Control the gestures of Nao using the platform of an existing virtual agent system, Greta.

Greta platform

• Follows the SAIBA framework
• Two representation languages:
  – FML: Function Markup Language
  – BML: Behavior Markup Language


SAIBA framework

• Goal: Define interfaces at separate levels of abstraction for multimodal behavior generation
• Structure: consists of 3 separate modules (sketched below)
  – Intent planning: planning of a communicative intent
  – Behavior planning: planning of the multimodal behaviors that carry this intent
  – Behavior realization: realization of the planned behaviors
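As an illustration of the three modules, here is a minimal Python sketch of the SAIBA pipeline; the function names and signatures are our own shorthand, not part of the SAIBA specification.

```python
# A minimal sketch of the three SAIBA stages as Python function stubs; the
# function names and types are our shorthand, not part of the SAIBA spec.

def plan_intent(text):
    """Intent planning: derive the communicative intents from the input
    text and encode them as an FML document (an XML string)."""
    ...

def plan_behavior(fml):
    """Behavior planning: map the FML intents to concrete multimodal
    behaviors (gesture, gaze, speech) encoded as a BML document."""
    ...

def realize_behavior(bml):
    """Behavior realization: schedule and execute the BML behaviors on the
    target embodiment (the Greta agent or the Nao robot)."""
    ...

realize_behavior(plan_behavior(plan_intent("Once upon a time...")))
```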


FML

• Objective: Encode the communicative and expressive intent that the agent aims to transmit, such as emotional states, beliefs, or goals.
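For illustration, a hand-written example of the kind of FML-APML input Greta consumes is sketched below, stored as a Python string; the tag set follows FML-APML, but the ids, time markers and attribute values are invented.

```python
# A hand-written illustration of an FML-APML input for Greta. The tag set
# (<fml-apml>, <bml>/<speech>/<tm>, <fml>/<performative>/<emotion>) follows
# FML-APML, but ids, time markers and attribute values are invented here.
FML_EXAMPLE = """\
<fml-apml>
  <bml>
    <speech id="s1" language="english">
      <tm id="tm1"/>Three little pieces of night fell from the sky.<tm id="tm2"/>
    </speech>
  </bml>
  <fml>
    <performative id="p1" type="announce" start="s1:tm1" end="s1:tm2"/>
    <emotion id="e1" type="joy" start="s1:tm1" end="s1:tm2"/>
  </fml>
</fml-apml>
"""
```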


BML

• Objective: Specify the multimodal behaviors, with their constraints, to be realized by the agent

• Description:

  1. Occurrence of behaviors
  2. Relative timings of behaviors
  3. Form of behaviors / reference to predefined animations
  4. Conditions / events / feedback
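For illustration, here is a minimal BML document in the style of BML 1.0: speech plus a gesture whose stroke phase is synchronized to a time marker inside the speech. The lexeme name and ids are invented for this example.

```python
# A minimal BML sketch in the style of BML 1.0: speech plus a gesture whose
# stroke is synchronized to a time marker inside the speech. The lexeme
# name and ids are invented for this example.
BML_EXAMPLE = """\
<bml id="bml1" xmlns="http://www.bml-initiative.org/bml/bml-1.0">
  <speech id="s1">
    <text>Stop right <sync id="tm1"/>there!</text>
  </speech>
  <gesture id="g1" lexeme="STOP" stroke="s1:tm1"/>
</bml>
"""
```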


Steps

1. Build a repertoire of gestures based on a video corpus
2. Use the Greta platform to compute gestures for the robot

[Pipeline diagram: text → Intent Planning → FML → Behavior Planning → BML → Behavior Realizer (Greta) / Behavior Realizer (Nao)]

Build a repertoire of gestures

• Goal: Collect expressive gestures of individuals in a specified context (storytellers)

• Stages:

  1. Video collection
  2. Coding schema and annotations
  3. Elaboration of symbolic gestures


[Workflow diagram: video corpus → annotations → Gesture Editor (elaboration) → gesture repertoire]

Video collection

• 6 actors from an amateur troupe were videotaped
• The actors had received the script of the story beforehand
• The text was displayed during the session so that they could read it from time to time
• 2 digital cameras were used (front and side view)
• Each actor was videotaped twice
  – the 1st session served as a training / warm-up session
  – the more expressive session could be kept for analysis


Video corpus

• Total duration: 80 min

• Average: 7 min per story

Coding schema and annotation

• Coding schema
  – Goal: enable the specification of gesture lexicons for Greta and Nao
  – Segmentation based on gesture phrases
  – Attributes (see the sketch after this list):
    • Handedness: right hand / left hand / 2 hands
    • Category: deictic, iconic, metaphoric, beat, emblem (McNeill 05, Kendon 04)
    • Lexicon: 47 different entries

• Annotations using the Anvil tool (Kipp 01)
  – Current state: 125 gestures segmented for 1 actor
  – Rich in terms of gestures: 23 gestures per minute for this subject
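A sketch of this coding schema as a Python data type follows; the attribute values (handedness, categories, lexicon size) come from the slide, while the class and field names are ours.

```python
# A sketch of the coding schema as a Python data type. The attribute values
# come from the slide; the class and field names are ours.
from dataclasses import dataclass

HANDEDNESS = ("right hand", "left hand", "2 hands")
CATEGORIES = ("deictic", "iconic", "metaphoric", "beat", "emblem")

@dataclass
class GestureAnnotation:
    start: float        # onset of the gesture phrase in the video (s)
    end: float          # offset of the gesture phrase (s)
    handedness: str     # one of HANDEDNESS
    category: str       # one of CATEGORIES (McNeill 05, Kendon 04)
    lexicon_entry: str  # one of the 47 lexicon entries, e.g. "stop"
```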


Annotation


Gesture Editor

• Gestures are described symbolically (see the sketch below):
  – Gesture phases: preparation, stroke, hold, relaxation
  – Wrist position
  – Palm orientation
  – Finger orientation
  – Finger shape
  – Movement trajectory
  – Symmetry (one hand, two hands, ...)
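Here is a sketch of one symbolic gesture entry with these attributes as Python data types; the field names mirror the list above, but the actual encoding used in the Greta gestuary files differs.

```python
# A sketch of one symbolic gesture entry; the field names mirror the slide,
# but the actual encoding in the Greta gestuary files differs.
from dataclasses import dataclass, field
from typing import List

@dataclass
class GesturePhase:
    phase: str               # "preparation" | "stroke" | "hold" | "relaxation"
    wrist_position: str      # symbolic 3D position of the wrist
    palm_orientation: str    # e.g. "away from body"
    finger_orientation: str  # e.g. "upward"
    finger_shape: str        # hand shape, e.g. "open flat"

@dataclass
class SymbolicGesture:
    name: str        # lexicon entry, e.g. "stop"
    symmetry: str    # "one hand", "two hands", ...
    trajectory: str  # movement trajectory, e.g. "straight"
    phases: List[GesturePhase] = field(default_factory=list)
```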


Predefined positions

• Pre-calculate the joint values of all combinations of hand positions in 3D space (vertical, horizontal, distance) using Choregraphe.

• Current state: 105 positions corresponding to 7 vertical values, 5 horizontal values and 3 distance values (keys 000, 001, ..., 462).

• Symbolic positions are replaced by the real joint values at compilation time, as sketched below.
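A sketch of how such a lookup table could be built: we assume the three-digit key concatenates the horizontal, vertical and distance indices (consistent with keys running from 000 to 462), and the loader is a hypothetical stand-in for the values exported from Choregraphe.

```python
# A sketch of indexing the 105 pre-computed positions. We assume the key
# concatenates the horizontal, vertical and distance indices; the loader
# is a hypothetical stand-in for values exported from Choregraphe.
import itertools

N_HORIZONTAL, N_VERTICAL, N_DISTANCE = 5, 7, 3  # 5 * 7 * 3 = 105 positions

def load_joint_values(key):
    """Hypothetical loader for the arm joint angles (radians) that were
    pre-computed in Choregraphe for this symbolic position key."""
    return []

position_table = {
    "%d%d%d" % (h, v, d): load_joint_values("%d%d%d" % (h, v, d))
    for h, v, d in itertools.product(
        range(N_HORIZONTAL), range(N_VERTICAL), range(N_DISTANCE))
}

# At compilation time a symbolic wrist position is replaced by the stored
# joint values:
joint_angles = position_table["462"]
```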

Predefined animations

• Pre-programmed gestures or behavioral scripts built with Choregraphe
• BML can make reference to these behavioral scripts (see the sketch below)
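A minimal sketch of triggering such a pre-programmed script from Python follows; ALBehaviorManager is the real NAOqi service for installed behaviors, but the behavior name and robot address are placeholders.

```python
# A sketch of running a pre-programmed Choregraphe behavior referenced from
# BML. ALBehaviorManager is a real NAOqi service; the behavior name and
# robot address are placeholders.
from naoqi import ALProxy

behavior_mgr = ALProxy("ALBehaviorManager", "192.168.1.10", 9559)

behavior_name = "gvlex/stop_gesture"  # placeholder name of an installed script
if behavior_mgr.isBehaviorInstalled(behavior_name):
    behavior_mgr.runBehavior(behavior_name)  # blocks until the script ends
```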


Behavioral scripts using Choregraphe


Opening of the story (in French in the demo): "A long time ago, one spring evening, three little pieces of night broke away from the sky and fell to Earth..."

Gesture lexicon

• Different degrees of freedom (Greta vs. Nao)
• Variants of a gesture form a family of gestures that shares:
  – the same meaning (e.g. to stop someone)
  – a core signal (e.g. vertical flat hand toward the other)
• Gestures within a family may differ in the non-core signals they use
• Construction of a common lexicon with:
  – Greta-Gestuary
  – Nao-Gestuary
• Within each specific lexicon, variants share a similar meaning and core signal (see the sketch below)
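A sketch of one lexicon entry with its variants: every variant of "stop" shares the meaning and the core signal and differs only in non-core signals. The dictionary layout is ours, not the actual gestuary file format.

```python
# A sketch of one lexicon entry with its variants; the dictionary layout
# is ours, not the actual gestuary file format.
STOP_ENTRY = {
    "meaning": "to stop someone",
    "core_signal": "vertical flat hand toward the other",
    "variants": [
        # non-core signals may differ per embodiment
        {"gestuary": "Greta-Gestuary", "handedness": "right", "amplitude": "large"},
        {"gestuary": "Nao-Gestuary", "handedness": "right", "amplitude": "reduced"},
    ],
}
```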


Robot vs. Greta

• Fewer degrees of freedom
• No dynamic wrists
• Three fingers that open or close together
• Limited movement speed (each movement takes > 0.5 seconds)
• Singular positions
=> Gestures may not be identical, but should convey a similar meaning (see the sketch below)
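A minimal sketch of one way the speed constraint could be handled when mapping gestures to the robot: phases shorter than the robot's minimum movement duration are stretched. The function and the 0.5 s constant (taken from the slide) are illustrative, not the actual GVLEX implementation.

```python
# A sketch of adapting gesture timing to the robot's motor limits: phases
# shorter than Nao's minimum movement duration (~0.5 s per the slide) are
# stretched. Function name and constant are illustrative.
MIN_PHASE_DURATION = 0.5  # seconds, approximate Nao motor limit

def clamp_phase_durations(durations):
    """Stretch any gesture phase the robot cannot execute in time."""
    return [max(d, MIN_PHASE_DURATION) for d in durations]

print(clamp_phase_durations([0.2, 0.6, 0.4]))  # -> [0.5, 0.6, 0.5]
```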


Gesture: Stop

[Slides showing the "stop" gesture]

Demo: Three pieces of night


Demo: Choregraphe script


Future work

• Synchronization of gestures with speech for the robot
• Define the invariant signification of gestures
• Define different levels of BML descriptions for gestures
• Define evaluation methods


Thank You for Your Attention

Any questions?