Abstract: For years, understanding and simulating human motion has been an important issue in various fields, such as computer animation, robotics, and biomechanics. In computer animation, the goal is to use this knowledge to create virtual humans that are as realistic as possible. With the amount of work already done, virtual humans can now handle various tasks as realistically as a real human would. One key point in the realism of virtual motions is that they obey the laws of dynamics, to which every human being is subject. This PhD thesis deals with understanding the dynamics of virtual human motion and with modeling simplified dynamics suitable for interactive applications. The work is divided into three parts. First, we analyze how humans behave under different dynamic constraints. Then, we evaluate users' perception of subtle differences in human motion due to dynamic constraints. Finally, we propose a new method for the dynamic adaptation of human motion that builds on the knowledge obtained from these studies.

In the first part of this work, we present a study on measuring the dynamic balance of human motion. We evaluate the efficiency of several criteria previously proposed in the biomechanics and robotics literature on different types of motions with varying degrees of dynamics, and we then propose a preferred range of use for each criterion.

Although dynamics plays an important role in the realism of virtual human motion, few works deal with the perception of dynamic properties. In order to define a perception threshold, we studied users' ability to differentiate the mass of a dumbbell lifted either by a real human (video) or by a virtual human (animation). We observed that users cannot distinguish small mass differences. However, we found no significant difference between judgments of real videos and those of virtual animations.
Thus, even if some information is lost when transforming a real motion into a virtual one, the principal dynamic information is preserved and perceived by users. Given the accuracy users achieved in perceiving dynamic constraints, we propose a new perception-based method for the dynamic adaptation of human motions subject to new external constraints. When a virtual character undergoes external perturbations that were not present in the original motion, our method separates the postural and temporal adaptation of the motion. It introduces small inaccuracies but generates natural-looking reactions to external perturbations while drastically decreasing computation time. The final motion is thus more dynamically correct than the original motion under the new perturbations, at a low computational cost.