Using eye-tracking for mobile interaction in an augmented real-world environment

Qinjie Ju 1, 2, 3
2 imagine - Extraction de Caractéristiques et Identification (LIRIS - Laboratoire d'InfoRmatique en Image et Systèmes d'information)
3 SICAL - Situated Interaction, Collaboration, Adaptation and Learning (LIRIS - Laboratoire d'InfoRmatique en Image et Systèmes d'information)
Abstract: Eye-tracking has strong potential as an input modality for human-computer interaction (HCI), particularly in mobile situations. In this thesis, we demonstrate this potential by highlighting scenarios in which eye-tracking has clear advantages over other interaction modalities. During our research, we found that this technology lacks convenient action-triggering methods, which can degrade the performance of gaze-based interaction. To address this, we investigate the combination of eye-tracking and fixed-gaze head movements, which allows various commands to be triggered without using the hands or changing gaze direction. We propose a new algorithm for fixed-gaze head movement detection that uses only the images captured by the scene camera mounted on the front of the head-mounted eye-tracker, in order to save computation time. To test the performance of this detection algorithm, and the acceptability of triggering commands by head movements when the user's hands are occupied by another task, we ran experiments with EyeMusic, an application that we designed and developed. EyeMusic is a music-reading system that plays the notes of a measure in a music score that the user does not understand: by making a voluntary head movement while fixing his or her gaze on the same point of the score, the user obtains the desired audio feedback. The design, development, and usability testing of the first prototype of this application are presented in this thesis. The experimental results confirm the usability of EyeMusic: 85% of participants were able to use all of the head movements implemented in the prototype. The application's average success rate is 70%, which is partly limited by the performance of the eye-tracker we used; the fixed-gaze head movement detection algorithm itself achieves 85%, with no significant differences between the individual head movements. Beyond EyeMusic, we explore two other scenarios based on the same control principles, EyeRecipe and EyePay, whose details are also presented in this thesis.
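The abstract does not spell out the detection algorithm, but the general idea it describes can be sketched: estimate the global motion of the scene-camera image (the camera moves with the head, so a sustained image shift signals a head movement) and check that the gaze point follows the same world point while the head moves. The Python/OpenCV sketch below is a minimal illustration of that idea, not the thesis's actual implementation; the function names, direction labels, and both thresholds are assumptions made for the example.

```python
import cv2
import numpy as np

# Illustrative thresholds, not values taken from the thesis.
GAZE_TOLERANCE = 30.0     # px: how closely gaze must follow the scene to count as "fixed gaze"
MOTION_THRESHOLD = 5.0    # px: minimum accumulated scene shift to report a head movement


def scene_shift(prev_gray, curr_gray):
    """Estimate the dominant 2-D translation of the scene between two
    consecutive scene-camera frames using sparse Lucas-Kanade optical flow."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                  qualityLevel=0.01, minDistance=10)
    if pts is None:
        return np.zeros(2)
    nxt, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, curr_gray, pts, None)
    good = status.ravel() == 1
    if not good.any():
        return np.zeros(2)
    # The median flow vector is robust to outliers caused by objects
    # that move independently of the head.
    return np.median((nxt[good] - pts[good]).reshape(-1, 2), axis=0)


def is_fixed_gaze(gaze_prev, gaze_curr, shift):
    """While the user fixates the same world point and moves the head, the
    fixated point (and hence the gaze point in scene-camera coordinates)
    moves together with the scene, so the gaze displacement should roughly
    match the global scene shift."""
    gaze_motion = np.asarray(gaze_curr, float) - np.asarray(gaze_prev, float)
    return np.linalg.norm(gaze_motion - shift) < GAZE_TOLERANCE


def classify_head_movement(accumulated_shift):
    """Map an accumulated scene shift to a coarse head-movement label.
    The camera moves with the head, so the scene appears to move the
    opposite way: turning the head right shifts the image content left."""
    dx, dy = accumulated_shift
    if max(abs(dx), abs(dy)) < MOTION_THRESHOLD:
        return None
    if abs(dx) >= abs(dy):
        return "head_right" if dx < 0 else "head_left"
    return "head_down" if dy < 0 else "head_up"
```

In an EyeMusic-style loop, one would accumulate scene_shift over a short window of frames, gate it with is_fixed_gaze using the gaze samples reported by the eye-tracker, and trigger the audio-feedback command when classify_head_movement returns a label. Working from the scene images alone, as the abstract notes, keeps the added computation light.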
Document type: Thesis

Cited literature: 179 references

https://tel.archives-ouvertes.fr/tel-02166965
Contributor: Abes Star
Submitted on: Thursday, June 27, 2019 - 12:33:44 PM
Last modification on: Thursday, November 21, 2019 - 2:11:09 AM

File

TH_T2670_qju.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-02166965, version 1

Citation

Qinjie Ju. Utilisation de l'eye-tracking pour l'interaction mobile dans un environnement réel augmenté. Autre. Université de Lyon, 2019. Français. ⟨NNT : 2019LYSEC011⟩. ⟨tel-02166965⟩
