
Multimodality of human language communication: gesture/speech interaction and distance encoding in pointing

Abstract: Designating an object for the benefit of another person is one of the most basic processes in linguistic communication. It is most of the time performed through the combined use of vocal and manual productions. The goal of this work is to understand and characterize the interactions between speech and manual gesture during pointing tasks, in order to determine how much linguistic information is carried by each of these two systems, and eventually to test the main models of speech and gesture production.

The first part of the study is about the production of vocal and manual pointing. The original aspect of this work is to look for distance encoding parameters in the lexical, acoustic, articulatory and kinematic properties of multimodal pointing, and to show that these different characteristics can be related to each other, and underlain by a similar basic motor behaviour: designating a distant object induces larger gestures, be they vocal or manual. This motor pattern can be related to the phonological pattern that is used for distance encoding in the world's languages. The experimental design used in this study contrasts bimodal vs. vocal monomodal vs. manual monomodal pointing, and a comparison between these conditions reveals that the vocal and manual modalities act in bidirectional cooperation for deixis, sharing the informational load when used together.

The second part of the study explores the development of multimodal pointing. The properties of multimodal pointing are assessed in 6-to-12-year-old children, in an experimental task similar to that of the adults. This second experiment attests to a progressive evolution of speech/gesture interactions in the development of spatial deixis. It reveals that children preferentially encode distance in manual gestures rather than in vocal gestures (and especially so in younger children). It also shows that the cooperative use of speech and manual gesture in deixis is already at play in children, though with more influence of gesture on speech than the reverse.

The third part of the study looks at sensorimotor interactions in the perception of spatial deixis. This experimental study, based on an intermodal priming paradigm, reveals that manual gesture plays a role in the production/perception mechanism associated with the semantic processing of language. These results can be related to those of studies on the sensorimotor nature of representations in the processing of linguistic sound units.

Altogether, these studies provide strong evidence for an integrated representation of speech and manual gestures in the human linguistic brain, even at a relatively early age in its development. They also show that distance encoding is a robust feature, present in all aspects of multimodal pointing.
Document type : Doctoral thesis
Contributor : Abes Star
Submitted on : Wednesday, February 19, 2014 - 10:02:08 AM
Last modification on : Wednesday, October 14, 2020 - 3:51:36 AM
Long-term archiving on : Monday, May 19, 2014 - 11:15:40 AM


Version validated by the jury (STAR)


  • HAL Id : tel-00949090, version 1


Chloe Gonseth. Multimodalité de la communication langagière humaine : interaction geste/parole et encodage de distance dans le pointage. Médecine humaine et pathologie. Université de Grenoble, 2013. Français. ⟨NNT : 2013GRENS011⟩. ⟨tel-00949090⟩


