Prédiction du mouvement humain pour la robotique collaborative : du geste accompagné au mouvement corps entier (Human motion prediction for collaborative robotics: from accompanied gestures to whole-body motion).

Oriane Dermy 1
1 LARSEN - Lifelong Autonomy and interaction skills for Robots in a Sensing ENvironment
Inria Nancy - Grand Est, LORIA - AIS - Department of Complex Systems, Artificial Intelligence & Robotics
Abstract : This thesis lies at the intersection of machine learning and humanoid robotics, under the theme of human-robot interaction and within the field of cobotics (collaborative robotics). It focuses on prediction for non-verbal human-robot interactions, with an emphasis on gestural interaction. Predicting intention and understanding and reproducing gestures are therefore central topics of this thesis.

First, the robot learns gestures by demonstration: a user grabs its arm and guides it through the gestures to be learned several times. The robot must then be able to reproduce these movements while generalizing them to adapt them to the situation at hand. To do so, it uses its proprioceptive sensors to interpret the perceived signals and understand the movement performed by the user, so that it can later generate similar movements on its own.

Second, the robot learns to recognize the intention of its human partner from the gestures the human initiates: it must then perform the gestures suited to the situation and matching the user's expectations. This requires the robot to understand the user's gestures. To this end, different perceptual modalities have been explored. Using proprioceptive sensors, the robot feels the user's gestures through its own body: this is physical human-robot interaction. Using visual sensors, the robot interprets the movement of the user's head. Finally, using external sensors, the robot recognizes and predicts the user's whole-body movement; in that case, the user wears sensors (in our case, a wearable motion-tracking suit by XSens) that transmit their posture to the robot. The coupling of these modalities was also studied.

From a methodological point of view, the learning and recognition of time series (gestures) have been central to this thesis, and two approaches have been developed. The first is based on the statistical modeling of movement primitives (corresponding to gestures) with ProMPs (Probabilistic Movement Primitives). The second adds deep learning to the first, using autoencoders to model whole-body gestures that carry a large amount of information while still allowing prediction in soft real time. Several issues shaped the design and development of these methods: predicting trajectory durations, reducing the cognitive and motor load imposed on the user, and the need for fast (soft real-time) and accurate predictions.
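As a rough illustration of the first, ProMP-based approach mentioned in the abstract, the Python sketch below shows how a movement primitive can be learned from a few demonstrated trajectories and then conditioned on the first samples of a new gesture to predict its continuation. It is a minimal one-dimensional example with illustrative parameter values (number of basis functions, regularization, noise levels), not the code used in the thesis.

# Minimal ProMP sketch (illustrative assumptions, not the thesis implementation).
import numpy as np

def rbf_features(t, n_basis=15, width=0.02):
    """Normalized radial basis functions over a phase variable t in [0, 1]."""
    centers = np.linspace(0.0, 1.0, n_basis)
    phi = np.exp(-((t[:, None] - centers[None, :]) ** 2) / (2.0 * width))
    return phi / phi.sum(axis=1, keepdims=True)              # shape (T, n_basis)

def learn_promp(demos, n_basis=15, reg=1e-6):
    """Fit one weight vector per demonstration, then a Gaussian over the weights."""
    weights = []
    for y in demos:                                           # y: 1-D trajectory of length T
        t = np.linspace(0.0, 1.0, len(y))
        phi = rbf_features(t, n_basis)
        w = np.linalg.solve(phi.T @ phi + reg * np.eye(n_basis), phi.T @ y)
        weights.append(w)
    W = np.stack(weights)
    return W.mean(axis=0), np.cov(W, rowvar=False) + 1e-8 * np.eye(n_basis)

def condition_promp(mu_w, sigma_w, t_obs, y_obs, noise=1e-4, n_basis=15):
    """Condition the weight distribution on early observations (Gaussian update)."""
    phi_o = rbf_features(t_obs, n_basis)
    s = phi_o @ sigma_w @ phi_o.T + noise * np.eye(len(t_obs))
    k = sigma_w @ phi_o.T @ np.linalg.inv(s)                  # Kalman-like gain
    mu_post = mu_w + k @ (y_obs - phi_o @ mu_w)
    sigma_post = sigma_w - k @ phi_o @ sigma_w
    return mu_post, sigma_post

# Toy usage: learn from noisy demonstrations of a reaching-like gesture,
# observe the first 30% of a new gesture, and predict its continuation.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 100)
demos = [np.sin(np.pi * t) * (1.0 + 0.1 * rng.standard_normal()) for _ in range(10)]
mu_w, sigma_w = learn_promp(demos)
t_obs, y_obs = t[:30], demos[0][:30]
mu_post, _ = condition_promp(mu_w, sigma_w, t_obs, y_obs)
predicted_continuation = rbf_features(t[30:]) @ mu_post      # expected remainder of the gesture

In the thesis, this kind of conditioning on a partially observed gesture is what supports intention recognition and trajectory prediction; the deep-learning variant described in the abstract would, loosely speaking, replace the raw whole-body posture with a compact autoencoder representation before such primitive modeling.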

Cited literature : 238 references

https://tel.archives-ouvertes.fr/tel-01966873
Contributor : Oriane Dermy
Submitted on : Sunday, December 30, 2018 - 3:18:38 AM
Last modification on : Wednesday, April 10, 2019 - 9:16:47 AM
Long-term archiving on : Sunday, March 31, 2019 - 12:28:55 PM

File

theseCOmpr.pdf
Files produced by the author(s)

Identifiers

  • HAL Id : tel-01966873, version 1

Citation

Oriane Dermy. Prédiction du mouvement humain pour la robotique collaborative : du geste accompagné au mouvement corps entier. Intelligence artificielle [cs.AI]. Université de Lorraine, 2018. Français. ⟨NNT : 2018LORR0227⟩. ⟨tel-01966873⟩

Metrics

Record views : 277
File downloads : 298