
Robotised tangible user interface for multimodal interactions in virtual reality : anticipating intentions to physically encounter the user

Abstract: Virtual Reality (VR) experiences are by essence multimodal: they heavily rely on the users' senses. Yet integrating haptics, the sense of touch, into VR remains a timely and challenging topic. The goal of this thesis is to provide consistent haptic feedback while enabling intuitive interaction techniques in virtual environments. More specifically, I aim to let users perform unencumbered interactions in VR, free of any contraption, and to promote the design of a haptic solution to this end. To achieve this, I address the integration of haptics in VR through a threefold approach: (1) I investigate how to provide believable fine tactile and large kinesthetic feedback; (2) I characterise interaction techniques in VR and the tasks users perform, and derive novel methods to enable them. These analyses allow me to build an analytical framework with methodological contributions: it proposes novel dimensions for evaluating VR interactions through their associated haptic solutions, emphasises the promising future of Encountered-Type Haptic Displays, and details their specifications, challenges and opportunities from both conception and perception perspectives; (3) I then provide artefact contributions, proposing the conception and design of a robotised tangible user interface, CoVR. CoVR anticipates the users' intentions both within and between objects, in order to physically encounter the users at their desired object of interest prior to interaction. Finally, this thesis provides empirical contributions, with technical and perceptual studies around CoVR.
Document type: Theses

https://tel.archives-ouvertes.fr/tel-03508552
Contributor: Abes Star
Submitted on: Monday, January 3, 2022 - 4:59:08 PM
Last modification on: Friday, January 14, 2022 - 9:46:02 AM

File

BOUZBIB_Elodie_2021.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-03508552, version 1

Citation

Elodie Bouzbib. Robotised tangible user interface for multimodal interactions in virtual reality : anticipating intentions to physically encounter the user. Human-Computer Interaction [cs.HC]. Sorbonne Université, 2021. English. ⟨NNT : 2021SORUS228⟩. ⟨tel-03508552⟩

