
Modeling emotional facial expressions and their dynamics for realistic interactive facial animation on virtual characters

Abstract: In computer-graphics applications, one challenging task is the integration of believable virtual characters. Of all a character's features, its face is arguably the most important, since it carries the most essential channels of human communication. The road toward more realistic virtual characters inevitably goes through a better understanding and reproduction of natural facial expressiveness. In this work we focus on emotional facial expressions, which we believe represent the most interesting form of non-verbal facial communication. We propose an animation framework that learns practical characteristics of emotional facial expressions from human faces and uses these characteristics to generate realistic facial animations for synthetic characters. Our main contributions are:

- A method that automatically extracts a meaningful representation space for expressive facial deformations from the processing of real data. This representation can then serve as an interface for intuitively manipulating facial expressions on any virtual character.

- An animation system, based on a collection of motion models, that explicitly handles the dynamic aspect of natural facial expressions. The motion models learn the dynamic signature of expressions from data and reproduce this natural signature when generating new facial movements.

The resulting animation framework can synthesize realistic, adaptive facial animations in real-time interactive applications such as video games or conversational agents. In addition to being efficient, the system can easily be associated with higher-level notions of human emotion; this makes facial animation more intuitive for non-expert users and for affective-computing applications, which usually operate at the semantic level.
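The abstract does not name the technique used to extract the representation space, but a low-dimensional space for facial deformations learned from data is commonly built with principal component analysis. The sketch below is illustrative only (the function names, the PCA choice, and the toy data are assumptions, not the thesis's actual method): it learns a deformation basis from recorded frames and maps points in that space back to full facial deformations.

```python
import numpy as np

def learn_expression_space(deformations, n_components=2):
    """Learn a low-dimensional expression space from facial deformation
    samples via PCA. Rows are frames, columns are stacked vertex offsets."""
    mean = deformations.mean(axis=0)
    centered = deformations - mean
    # SVD of the centered data yields the principal deformation modes
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    basis = vt[:n_components]        # (n_components, n_dims) deformation basis
    coords = centered @ basis.T      # low-dimensional coordinates per frame
    return mean, basis, coords

def synthesize(mean, basis, point):
    """Map a point in the expression space back to a full deformation."""
    return mean + point @ basis

# Toy data standing in for captured faces: 50 frames of a
# 30-dimensional deformation field (purely synthetic here).
rng = np.random.default_rng(0)
frames = rng.standard_normal((50, 30))
mean, basis, coords = learn_expression_space(frames, n_components=2)
recon = synthesize(mean, basis, coords[0])
```

In such a setup, the low-dimensional `coords` plane is what an animator or a higher-level emotion model would manipulate, while `synthesize` produces the per-vertex deformation applied to the character's mesh.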

Cited literature: 244 references
Contributor: Myriam Andrieux
Submitted on: Monday, January 24, 2011 - 1:09:13 PM
Last modification on: Wednesday, April 27, 2022 - 4:13:27 AM
Long-term archiving on: Monday, April 25, 2011 - 3:05:08 AM


  • HAL Id: tel-00558851, version 1


Nicolas Stoiber. Modeling emotional facial expressions and their dynamics for realistic interactive facial animation on virtual characters. Human-Computer Interaction [cs.HC]. Université Rennes 1, 2010. English. ⟨tel-00558851⟩


