Theses

Performance transfer: animating virtual characters by playing and acting.

Abstract: Over the past decades, 3D animation has spread widely into everyday life, whether for entertainment such as video games and movies, or for communication more generally. Despite this widespread use, creating animations remains the province of skilled animators and is out of reach for non-experts. Moreover, the creation process traditionally follows a strict pipeline: after the initial storyboarding and character design phases, articulated characters are modeled, rigged, and roughly positioned in 3D, at which point the layout of their relative poses can be established. Their actions and movements are then broken down into keyframes, which are interpolated to produce the final animation. Keyframing is a demanding process during which animators must spend time and effort carefully tuning animation curves for each of a character's degrees of freedom and for each specific action. They must also sequence these actions consistently over time to convey the character's intended personality. Some methods, such as motion capture, aim to ease the creation process by directly transferring recorded human motion to virtual characters; however, the resulting animations still lack expressiveness and must be corrected by animators. In this thesis, we propose a way of easing the creation of animation sequences starting from a database of individual animations. We focus on animating virtual characters so that they reproduce a story played out with props, such as figurines instrumented with sensors. The main goal of this thesis is to compute expressive and plausible animations from the sensor data in order to transpose the story told by the narrator and their figurines into the virtual world. To reach this goal, we propose a new animation pipeline that analyzes and transcribes hand motions into a sequence of animations adapted to the user's space-time hand trajectories and their motion qualities.
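The keyframing step described above can be illustrated with a minimal sketch. The linear scheme below is an assumption for illustration only, not the interpolation model used in the thesis; real animation tools typically use splines with animator-tuned tangents.

```python
# Minimal sketch of keyframe interpolation for a single degree of freedom
# (e.g. one joint angle). Keyframes are (time, value) pairs; in-between
# frames are produced by linear interpolation between neighboring keys.

def interpolate(keyframes, t):
    """Return the linearly interpolated value at time t (clamped at the ends)."""
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]
    if t >= keyframes[-1][0]:
        return keyframes[-1][1]
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t0 <= t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)

keys = [(0.0, 0.0), (1.0, 90.0), (2.0, 0.0)]  # hypothetical elbow angle in degrees
print(interpolate(keys, 0.5))  # 45.0
```

Animators tune one such curve per degree of freedom and per action, which is what makes manual keyframing so labor-intensive.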
We present a new translation-, rotation-, and scale-invariant motion descriptor that allows our system to perform action recognition, as well as two classifiers that extract Laban Effort motion qualities from an input curve. We also introduce a new procedural animation model that infers expressiveness matching the narrator's hand motion qualities in terms of Laban Time and Weight Effort. Finally, we extend our system to handle multiple characters at a time, detecting and transferring interactions and making characters act according to pre-defined behaviors, letting users express their narrative creativity. The system's capabilities were evaluated through user studies in which non-expert users could follow a script or freely improvise stories with two figurines at a time. We conclude with a discussion of future research directions.
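A translation-, rotation-, and scale-invariant trajectory descriptor can be sketched as follows. This is a generic construction (centering, norm scaling, pairwise distances) given for illustration; it is not the specific descriptor proposed in the thesis.

```python
import numpy as np

def invariant_descriptor(traj):
    """Compute an invariant descriptor of a 3D trajectory.

    traj: (N, 3) array of hand positions sampled along the motion.
    """
    pts = np.asarray(traj, dtype=float)
    pts = pts - pts.mean(axis=0)        # translation invariance: center on centroid
    scale = np.linalg.norm(pts)
    if scale > 0:
        pts = pts / scale               # scale invariance: normalize overall size
    # Rotation invariance: pairwise point distances are unchanged by rotation.
    diffs = pts[:, None, :] - pts[None, :, :]
    dists = np.linalg.norm(diffs, axis=-1)
    return dists[np.triu_indices(len(pts), k=1)]
```

Because the descriptor is identical for any rigidly moved, uniformly scaled copy of the same gesture, a classifier trained on it recognizes an action regardless of where, at what orientation, or at what size the narrator performs it.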
Document type :
Theses

Cited literature: 98 references

https://tel.archives-ouvertes.fr/tel-02934748
Contributor: Abes Star
Submitted on : Wednesday, September 9, 2020 - 4:00:36 PM
Last modification on : Wednesday, October 14, 2020 - 4:16:36 AM

File

GARCIA_2019_diffusion.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id : tel-02934748, version 1

Collections

Citation

Maxime Garcia. Performance transfer: animating virtual characters by playing and acting. Image Processing [eess.IV]. Université Grenoble Alpes, 2019. English. ⟨NNT : 2019GREAM074⟩. ⟨tel-02934748⟩
