A Demo of a Facial UI Design Approach for Digital Artists

Pedro Bastos
Instituto de Telecomunicações, Fac. Engenharia, Univ. Porto
[email protected]

Xenxo Alvarez
Instituto de Telecomunicações, Fac. Ciências, Univ. Porto
[email protected]

Verónica Orvalho
Instituto de Telecomunicações, Fac. Ciências, Univ. Porto
[email protected]

ABSTRACT
In the character animation industry, animators use facial UIs to animate a character's face. A facial UI provides widgets and handles that the animator interacts with to control the character's facial regions. This paper presents a facial UI design approach to control the animation of the six basic facial expressions of the anthropomorphic face. The design is based on square-shaped widgets [5] holding circular handles that allow the animator to produce the muscular activity relative to the basic facial expressions [1]. We implemented a prototype of the facial UI design in the Blender open-source animation software and conducted a preliminary pilot study with three animators. Two parameters were evaluated: the number of clicks and the time taken to animate the six basic facial expressions. The study reveals little variation in the values each animator recorded for both parameters, despite the natural differences in their creative performance.

Author Keywords
Facial animation, user interfaces, widgets, handles.

ACM Classification Keywords
B.5.1 Design: Control Design

INTRODUCTION
Producing character facial animation in the entertainment industry is a serious bottleneck for character animators [7]. A character's face assumes complex expressions [3] which the animator needs to produce under tight production schedules. To animate the character's expressions a facial user interface, or facial UI, is used. This UI has widgets and handles that the animator manipulates to control the character's expressions.
We present a facial UI design approach to animate the six basic facial expressions defined in [1] for the anthropomorphic face: surprise, fear, disgust, anger, happiness and sadness. An anthropomorphic face is any face with the anatomical characteristics of a human face (e.g. number, symmetry and position of facial regions). The goal was to determine whether the UI design averaged the animators' performance in posing the expressions. We built a prototype (Fig. 1) and ran a pilot study with three animators.

Copyright is held by the author/owner(s). IUI'12, February 14–17, 2012, Lisbon, Portugal. ACM 978-1-4503-1048-2/12/02.

Figure 1. The design approach of the facial UI.

The animators were introduced to the interface controls and to the guidelines defined by Ekman and Friesen [1] to animate the expressions. They were then asked to reproduce the six basic facial expressions without any further guidance. To evaluate their performance using the controls we analyzed three parameters: the quality of the expressions, the number of clicks performed and the time taken to animate them.

THE FACIAL UI DESIGN APPROACH
The facial UI design is based on the widgets and handles found in Parke and Waters' [5] work, which we adapted using the muscular activity information of the face, categorized by Ekman and Friesen [2] into Action Units, Action Descriptors and Gross Behavior Codes. We filtered the muscular activity relative to the six basic facial expressions to reach a behavioral classification [6] of the handles and avoid the complexity usually found in this kind of animation interface [4].

We considered the information on skin deformation described by Ekman and Friesen [1] for each facial expression. For instance, the eyebrows appear curved during surprise, the upper lip is raised during disgust, the nose wrinkles and the cheeks raise during anger, and the corners of the lips are drawn back and slightly up during happiness.
The result is an interface paradigm with 10 square-shaped and 3 rectangle-shaped widgets and a total of 20 handles. Each handle can be moved horizontally to change one parameter and vertically to change another. The handles control the eyebrows, eyelids, nose, cheeks, lips and jaw, which were defined by Ekman and Friesen [1] as the facial regions that allow the six basic facial expressions to be perceived.

Demonstration IUI'12, February 14-17, 2012, Lisbon, Portugal
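The two-axis handle behavior described above can be sketched in code. This is a minimal illustration, not the authors' Blender implementation; the parameter names (e.g. the brow parameters below) are invented for the example.

```python
# Sketch of a two-axis handle: horizontal and vertical displacement of a
# circular handle inside a square widget each drive one facial parameter.
# Parameter names are illustrative, not taken from the paper's prototype.

class Handle:
    def __init__(self, x_param, y_param):
        self.x_param = x_param   # parameter driven by horizontal motion
        self.y_param = y_param   # parameter driven by vertical motion
        self.x = 0.0             # handle position, clamped to [-1, 1]
        self.y = 0.0

    def move(self, dx, dy):
        """Move the handle, keeping it inside the square widget."""
        self.x = max(-1.0, min(1.0, self.x + dx))
        self.y = max(-1.0, min(1.0, self.y + dy))

    def pose(self):
        """Current values of the two parameters this handle controls."""
        return {self.x_param: self.x, self.y_param: self.y}

# One of the brow handles might pair lateral brow motion with a raise/lower:
brow = Handle("brow_inner_slide", "brow_inner_raise")
brow.move(0.0, 0.8)   # raise the inner brow, as in surprise
print(brow.pose())    # {'brow_inner_slide': 0.0, 'brow_inner_raise': 0.8}
```

Clamping the handle to the widget's square keeps every parameter in a bounded range, which is one way such an interface can limit the pose space to plausible muscular activity.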



Table 1. Statistics of the animators' interaction with the facial UI.

            Clicks   Time (s)
Optimal     34       -
Animator 1  38       101
Animator 2  39       106
Animator 3  42       164

THE PILOT STUDY
Three animators were asked to test a prototype of the facial UI design approach. The animators' interaction was evaluated by the number of clicks performed and the time taken to produce the six basic facial expressions defined by Ekman and Friesen [1]. Table 1 shows the results of the animators' interaction.

Animator 1 did 38 clicks in 101 seconds. Animator 2 did 39 clicks in 106 seconds. Animator 3 did 42 clicks in 164 seconds. The number of clicks is very similar across animators and close to the optimal number of clicks (34). The time taken was similar for Animators 1 and 2. Animator 3 required more time to place the controls, thus achieving the most accurate expressions, as seen in Fig. 2. Although Animator 3 required more time, the number of clicks performed did not change significantly from Animators 1 and 2.
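The consistency claim can be checked with simple descriptive statistics over the numbers reported above (a quick verification sketch; only the values stated in the text are used):

```python
import statistics

clicks = [38, 39, 42]     # Animators 1-3, from Table 1
times = [101, 106, 164]   # seconds
optimal_clicks = 34

print(statistics.mean(clicks))    # ~39.67, within 6 clicks of optimal
print(statistics.pstdev(clicks))  # ~1.7: very little spread in clicks
print(statistics.pstdev(times))   # ~28.6: times vary more (Animator 3)
```

The spread in clicks is under two clicks, which supports the observation that click counts barely varied even though completion times did.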

Fig. 2 illustrates, in the top row, the optimal poses we built for each facial expression following the guidelines defined by Ekman and Friesen [1]. We registered our performance as the optimal number of clicks seen in Table 1. The animators were given these guidelines as well and their results are shown in Fig. 2. Overall, the animators were able to simulate the six basic facial expressions with considerable accuracy.

CONCLUSION AND FUTURE WORK
We designed a facial user interface paradigm that approximates the number of clicks and the time that animators require to animate the six basic facial expressions defined by Ekman and Friesen [1] for the anthropomorphic face. The paradigm successfully averaged the number of clicks for the simulation of the six basic facial expressions.

Fig. 2 shows that fear was the only facial expression that the animators could not simulate as well as its corresponding optimal expression. We believe this is because the animators did not consider that the middle portion of the eyebrows should be raised only during surprise [1], whereas during fear the eyebrows should appear straightened [1]. For this reason, the animators ended up producing an expression of fear that is somewhat similar to that of surprise.
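The surprise/fear confusion can be made concrete as two hypothetical handle presets that differ mainly in the brow parameters. The keys and values below are invented for illustration; Ekman and Friesen [1] describe raised, curved brows for surprise and raised but straightened brows for fear.

```python
# Hypothetical handle presets; parameter names and values are illustrative.
surprise = {"brow_raise": 1.0, "brow_curve": 0.8, "eyelid_open": 0.9, "jaw_drop": 0.6}
fear     = {"brow_raise": 1.0, "brow_curve": 0.0, "eyelid_open": 1.0, "jaw_drop": 0.3}

# The two poses differ mainly in brow curvature: an animator who leaves
# "brow_curve" raised for fear ends up with a pose close to surprise.
diff = {k: fear[k] - surprise[k] for k in surprise}
print(diff)
```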

We continue working to improve and extend the interface design to go beyond the six basic expressions and cover the entire behavioral set for the anthropomorphic face defined in the Facial Action Coding System [2]. We are also working to let animators customize each handle according to the target emotion to be expressed.

Figure 2. Comparison of the animators’ results.

We provide an audiovisual recording of the user interface prototype running in the latest version of the Blender open-source animation software.

ACKNOWLEDGMENTS
This paper is financially supported by Fundação para a Ciência e a Tecnologia (FCT), grant SFRH/BD/69878/2010, by Instituto de Telecomunicações (IT) and partially by the European Union FP7 Integrated Project VERE (No. 257695). The face shown in the figures was generated in the MakeHuman© open-source modeling software, licensed under the GNU General Public License. We thank the animators that tested the prototype: José Ricardo Silva, Nuno Vasco Costa and Pedro Marques Pereira. We also thank José Serra for his support.

REFERENCES
1. Ekman, P., and Friesen, W. V. Unmasking the Face: A Guide to Recognizing Emotions from Facial Clues. Prentice-Hall, 1975. pp. 34-128.
2. Ekman, P., and Friesen, W. V. Manual of the Facial Action Coding System. Consulting Psychologists Press, Stanford University, 1977. pp. 15-347.
3. Hauser, T. The Pixar Treasures. Disney Editions Deluxe Scrapbook, 2011.
4. Matejic, L. LOG: Building 3D User Interface Widgets by Demonstration. Technical Report, Department of Computer Science, Brown University, Providence, p. 2 (1993).
5. Parke, F., and Waters, K. Computer Facial Animation. A.K. Peters, 2008. p. 153.
6. Wang, L. J., Sajeev, A. S. M., and Inchaiwong, L. A Formal Specification of Interaction Widgets Hierarchy Framework. Proc. 3rd International Conference on Information Technology: New Generations, IEEE Computer Society, p. 1 (2006).
7. Williams, R. The Animator's Survival Kit. New York University, 2001. pp. 246-250.
