List of Figures

Example of EOG: the electrodes are attached to the skin around the eyes

Examples of eye trackers: (a) a stationary eye tracker placed below a screen, (b) a wearable eye tracker in the form of a pair of glasses

The Eye Tracking Glasses 2 Wireless from SMI. The two eye cameras film the movements of the eyes while the scene camera captures the viewed image

Two-level selection system: (a) the first screen of the two levels, (b) the screen after selection of the 'ABCDE' group

The EyeDraw application, with a drawing made by one of the authors

An eye-tracking system proposed by Tobii for people with reduced mobility

A screenshot of GazeGalaxy

EyeTuner, composed of a loudspeaker and a digital ECS

Two SideWays applications: (a) album cover browser, (b) gaze quiz

EyeGuide assists the user in a metro station in finding the right route

Translation system using a head-worn eye tracker with a front-facing camera

Example of EyeBook: when the user reads an annotation, the corresponding event is triggered

Example of splitting the target to be selected into two parts: the name of the command and the selection area

Selection by two successive fixations, as adopted in NeoVisus

The MDITIM system: (a) placement of the targets, (b) examples

The EyeWrite system: (a) window dedicated to executing and interpreting the gestures, (b) interpretations of the characters

The Eye-S system: (a) examples of the eye-movement sequences for 'a' and 'b', (b) explicit display of the targets

Examples of sequences of a non-symbolic eye movement

A user interested in an album follows the movement of that album on the screen with the eyes; an excerpt of the music is then played automatically

The zooms and pans during the selection of the letter 's'

Example of increasing the volume of a music player on a smartwatch

Four AmbiGaze prototypes: (1) music player (virtual+external), (2) video-on-demand interface (virtual+internal), (3) lamp (mechanical+internal), (4) fan (mechanical+external)

Schematic representation of the information derived from the flick gesture and from the gaze; line (f) represents the direction of the gesture, line (g) represents the user's gaze

Rotations of the wrist about the three axes

Four types of ankle rotation

Lignes

GeoGazemarks

The chosen voluntary fixed-gaze head movements

Variation of the gaze point during vertical fixed-gaze head movements

Variations of the gaze-point coordinates during a fixed-gaze M24 movement: (a) variation of the y coordinate, (b) derivative of the y coordinate, (c) variation of the relative y coordinate

Typical EyeMusic setup for learning to play the piano

(a) Example of the score with two ArUco markers, (b) example of the scene image with the gaze point superimposed while part of the score lies outside the field of view of the scene camera

Overview of the main EyeMusic algorithm

Schematic diagram of the mapping between movements and commands

Number of successful trials after 1, 2, 3, 4 or 5 effective or failed movements, for each fixed-gaze head movement

Histogram of the success rate for playing the correct measure

Histogram of the success rate for playing the correct measure with the correct movement detected

Number of successful trials after 1, 2, 3, 4 or 5 effective or failed movements, for each fixed-gaze head movement

Histogram of the success rate of the head-movement recognition algorithms

Number of successful trials after 1, 2, 3, 4 or 5 effective or failed movements, for each fixed-gaze head movement

Bibliography

J. Alexander, T. Han, W. Judd, P. Irani, and S. Subramanian, Putting your best foot forward: investigating real-world mappings for foot-based gestures, Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, 2012.

V. A. Anagnostopoulos and P. Kiefer, Towards gaze-based interaction with urban outdoor spaces, Proc. of the 2016 ACM Int. Joint Conf. on Pervasive and Ubiquitous Computing: Adjunct, 2016.

F. Anderson, T. Grossman, J. Matejka, and G. Fitzmaurice, YouMove: enhancing movement training with an augmented reality mirror, Proc. of the 26th annual ACM Symp. on User interface software and technology, 2013.

A. T. Bahill, M. R. Clark, and L. Stark, The main sequence, a tool for studying human eye movements, Mathematical Biosciences, vol.24, issue.3-4, pp.191-204, 1975.

D. Beymer and D. M. Russell, WebGazeAnalyzer: a system for capturing and analyzing web reading behavior using eye gaze, CHI'05 extended abstracts on Human factors in computing systems, 2005.

R. Biedert, G. Buscher, and A. Dengel, The eyeBook: using eye tracking to enhance the reading experience, Informatik-Spektrum, vol.33, issue.3, 2010.

P. Biswas and P. Langdon, Eye-gaze tracking based interaction in India, Proc. of the 6th Int. Conf. on Intelligent Human Computer Interaction, 2014.

D. Bonino, E. Castellina, F. Corno, A. Gale, K. Garbo et al., A blueprint for integrated eye-controlled environments, Universal Access in the Information Society, vol.8, issue.4, 2009.

S. Brewster, J. Lumsden, M. Bell, M. Hall, and S. Tasker, Multimodal 'eyes-free' interaction techniques for wearable devices, Proc. of the SIGCHI Conf. on Human factors in computing systems, 2003.

A. Bulling, D. Roggen, and G. Tröster, Wearable EOG goggles: Eye-based interaction in everyday environments, CHI Conf. on Human Factors in Computing Systems, 2009.

R. Carpenter, Movements of the Eyes, 2nd rev. ed., Pion Limited, 1988.

R. Carpenter, The visual origins of ocular motility. Vision and visual dysfunction, vol.8, pp.1-10, 1991.

T. N. Cornsweet and H. D. Crane, Accurate two-dimensional eye tracker using first and fourth Purkinje images, JOSA, vol.63, issue.8, pp.921-928, 1973.

A. Crossan, S. Brewster, and A. Ng, Foot tapping for mobile interaction, Proc. of the 24th BCS Interaction Specialist Group Conf., 2010.

A. Crossan, M. McGill, S. Brewster, and R. Murray-Smith, Head tilting for interaction in mobile contexts, Proc. of the 11th Int. Conf. on Human-Computer Interaction with Mobile Devices and Services, 2009.

A. Crossan, J. Williamson, S. Brewster, and R. Murray-Smith, Wrist rotation for interaction in mobile contexts, Proc. of the 10th Int. Conf. on Human computer interaction with mobile devices and services, 2008.

J. de Lemos, G. R. Sadeghnia, Í. Ólafsdóttir, and O. Jensen, Measuring emotions using eye tracking, Proc. of Measuring Behavior, pp.225-226, 2008.


W. Delamare, Interaction à distance en environnement physique augmenté, PhD thesis, 2015.

R. Dodge and T. S. Cline, The angular velocity of eye movements, Psychological Review, vol.8, issue.4, pp.145-157, 1901.

H. Drewes, Eye gaze tracking for human computer interaction, PhD thesis, 2010.

H. Drewes, A. De Luca, and A. Schmidt, Eye-gaze interaction for mobile phones, Proc. of the 4th Int. Conf. on mobile technology, applications, and systems and the 1st Int. Symp. on Computer human interaction in mobile technology, 2007.

H. Drewes and A. Schmidt, Interacting with the computer using gaze gestures, IFIP Conf. on Human-Computer Interaction, 2007.

H. Drewes and A. Schmidt, The MAGIC touch: Combining MAGIC-pointing with a touch-sensitive mouse, IFIP Conf. on Human-Computer Interaction, 2009.

A. T. Duchowski, A breadth-first survey of eye-tracking applications, Behavior Research Methods, Instruments, & Computers, vol.34, 2002.

A. T. Duchowski, Eye tracking methodology: Theory and practice, 2007.

M. Eaddy, G. Blasko, J. Babcock, and S. Feiner, My own private kiosk: Privacy-preserving public displays, Wearable Computers, vol.1, 2004.

R. Engbert and K. Mergenthaler, Microsaccades are triggered by low retinal image slip, Proc. of the National Academy of Sciences, vol.103, pp.7192-7197, 2006.

A. Esteves, E. Velloso, A. Bulling, and H. Gellersen, Orbits: Gaze interaction for smart watches using smooth pursuit eye movements, Proc. of the 28th Annual ACM Symp. on User Interface Software & Technology, 2015.

A. Feldman, E. M. Tapia, S. Sadi, P. Maes, and C. Schmandt, ReachMedia: On-the-move interaction with everyday objects, Ninth IEEE Int. Symp. on Wearable Computers (ISWC'05), pp.52-59, IEEE, 2005.

D. Fono and R. Vertegaal, EyeWindows: evaluation of eye-controlled zooming windows for focus selection, Proc. of the SIGCHI Conf. on Human factors in computing systems, 2005.

C. Forster, M. Pizzoli, and D. Scaramuzza, SVO: Fast semi-direct monocular visual odometry, IEEE Int. Conf. on Robotics and Automation (ICRA), 2014.

S. Garrido-Jurado, R. Muñoz-Salinas, F. J. Madrid-Cuevas, and M. J. Marín-Jiménez, Automatic generation and detection of highly reliable fiducial markers under occlusion, Pattern Recognition, vol.47, issue.6, 2014.

I. Giannopoulos, P. Kiefer, and M. Raubal, GeoGazemarks: providing gaze history for the orientation on small display maps, Proc. of the 14th ACM Int. Conf. on Multimodal interaction, 2012.

I. Giannopoulos, P. Kiefer, and M. Raubal, GazeNav: Gaze-based pedestrian navigation, Proc. of the 17th Int. Conf. on Human-Computer Interaction with Mobile Devices and Services, 2015.

J. Gips and P. Olivieri, EagleEyes: An eye control system for persons with disabilities, The Eleventh Int. Conf. on Technology and Persons with Disabilities, 1996.

E. Grandjean and K. H. E. Kroemer, Fitting the task to the human: a textbook of occupational ergonomics, 1997.


J. Hales, D. Rozado, and D. Mardanbegi, Interacting with objects in the environment by gaze and hand gestures, Proc. of the 3rd Int. workshop on pervasive eye tracking and mobile eye-based interaction, 2013.

T. Han, J. Alexander, A. Karnik, P. Irani, and S. Subramanian, Kick: investigating the use of kick gestures for mobile interactions, Proc. of the 13th Int. Conf. on Human Computer Interaction with Mobile Devices and Services, 2011.

D. W. Hansen, H. H. T. Skovsgaard, J. P. Hansen, and E. Mollenbach, Noise tolerant selection by gaze-controlled pan and zoom in 3D, Proc. of the 2008 Symp. on Eye tracking research & applications, 2008.

H. Hartridge and L. C. Thomson, Methods of investigating eye movements, The British Journal of Ophthalmology, vol.32, issue.9, p.581, 1948.

F. Hatfield, System and method for controlling host system interface with point-of-interest data, US Patent, 2001.

A. Hornof, A. Cavender, and R. Hoselton, EyeDraw: a system for drawing pictures with eye movements, ACM SIGACCESS Accessibility and Computing, 2004.

T. E. Hutchinson, K. P. White, W. N. Martin, K. C. Reichert, and L. A. Frey, Human-computer interaction using eye-gaze input, IEEE Transactions on Systems, Man, and Cybernetics, vol.19, issue.6, 1989.

P. Isokoski, Text input methods for eye trackers using off-screen targets, Proc. of the 2000 Symp. on Eye tracking research & applications, 2000.

P. Isokoski and R. Raisamo, Device independent text input: A rationale and an example, Proc. of the working Conf. on Advanced visual interfaces, 2000.

H. Istance, A. Hyrskykari, L. Immonen, S. Mansikkamaa, and S. Vickers, Designing gaze gestures for gaming: an investigation of performance, Proc. of the 2010 Symp. on Eye-Tracking Research & Applications, 2010.

R. J. K. Jacob, What you look at is what you get: eye movement-based interaction techniques, Proc. of the SIGCHI Conf. on Human factors in computing systems, 1990.

R. J. K. Jacob, Eye tracking in advanced interface design, Virtual environments and advanced interface design, pp.258-288, 1995.

R. J. K. Jacob and K. S. Karn, Eye tracking in human-computer interaction and usability research: Ready to deliver the promises, The Mind's Eye, 2003.

Q. Ju, R. Chalon, and S. Derrode, Fixed-gaze head movement detection for triggering commands, Workshop on Models and Analysis of Eye Movements, 2018.
URL : https://hal.archives-ouvertes.fr/hal-01842324

Q. Ju, R. Chalon, and S. Derrode, Assisted music score reading using fixed-gaze head movement: Empirical experiment and design implications, Proc. of the ACM on Human-Computer Interaction, vol.3, 2019.
URL : https://hal.archives-ouvertes.fr/hal-02115757

Q. Ju, S. Derrode, and R. Chalon, Utilisation de l'eye-tracking pour l'interaction mobile dans un environnement réel augmenté, 29ème conférence francophone sur l'Interaction Homme-Machine, 5 pages, 2017.

C. H. Judd, C. N. McAllister, and W. M. Steele, General introduction to a series of studies of eye movements by means of kinetoscopic photographs, Psychological Review Monographs, vol.7, issue.1, pp.1-16, 1905.

J. Kela, P. Korpipää, J. Mäntyjärvi, S. Kallio, G. Savino et al., Accelerometer-based gesture control for a design environment, Personal and Ubiquitous Computing, vol.10, issue.5, 2006.

P. Kiefer, I. Giannopoulos, D. Kremer, C. Schlieder, and M. Raubal, Starting to get bored: An outdoor eye tracking study of tourists exploring a city panorama, Proc. of the Symp. on Eye Tracking Research and Applications, 2014.

R. Kjeldsen, Head gestures for computer control, Proc. of IEEE ICCV Workshop on Recognition, Analysis, and Tracking of Faces and Gestures in Real-Time Systems, 2001.

T. Kobayashi, T. Toyama, F. Shafait, M. Iwamura, K. Kise et al., Recognizing words in scenes with a head-mounted eye-tracker, 10th IAPR Int. Workshop on Document Analysis Systems (DAS), 2012.

D. B. Koons, C. J. Sparrell, and K. R. Thorisson, Integrating simultaneous input from speech, gaze, and hand gestures, 1993.

R. J. Krauzlis and S. G. Lisberger, Temporal properties of visual motion signals for the initiation of smooth pursuit eye movements in monkeys, Journal of Neurophysiology, vol.72, issue.1, pp.150-162, 1994.

C. Kühnel, T. Westermann, F. Hemmert, S. Kratz, A. Müller et al., I'm home: Defining and evaluating a gesture set for smart-home control, Int. Journal of Human-Computer Studies, vol.69, issue.11, 2011.

C. Kumar, R. Menges, and S. Staab, Eye-controlled interfaces for multimedia interaction, IEEE MultiMedia, vol.23, issue.4, 2016.

M. Kumar, T. Winograd, and A. Paepcke, Gaze-enhanced scrolling techniques, CHI'07 Extended Abstracts on Human Factors in Computing Systems, 2007.

A. Kwan, 6 benefits of music lessons, 2018.


H. W. Lilliefors, On the Kolmogorov-Smirnov test for normality with mean and variance unknown, Journal of the American Statistical Association, vol.62, issue.318, 1967.

D. G. Lowe, Distinctive image features from scale-invariant keypoints, Int. Journal of Computer Vision, vol.60, issue.2, 2004.

J. F. Mackworth and N. H. Mackworth, Eye fixations recorded on changing visual scenes by the television eye-marker, JOSA, vol.48, issue.7, 1958.

P. P. Maglio, T. Matlock, C. S. Campbell, S. Zhai, and B. A. Smith, Gaze and speech in attentive user interfaces, Advances in Multimodal Interfaces, ICMI 2000, 2000.

P. Majaranta and K. Räihä, Twenty years of eye typing: systems and design issues, Proc. of the 2002 Symp. on Eye tracking research & applications, 2002.

D. Mardanbegi, D. W. Hansen, and T. Pederson, Eye-based head gestures, Proc. of the Symp. on eye tracking research and applications, 2012.

M. Mauderer, F. Daiber, and A. Krüger, Combining touch and gaze for distant selection in a tabletop setting, Workshop on Gaze Interaction in the Post-WIMP World, ACM SIGCHI Conf. on Human Factors in Computing Systems, 2013.

A. Mehta and M. Bhatt, Practical issues in the field of optical music recognition, IJARCSMS, vol.2, 2014.

S. Milekic, The more you look the more you get: Intention-based interface using gaze-tracking, Museums and the Web: Selected Papers from an Int. Conf., Archives & Museum Informatics, 2002.

E. Mollenbach, J. P. Hansen, and M. Lillholm, Eye movements in gaze interaction, Journal of Eye Movement Research, vol.6, issue.2, 2013.

L.-P. Morency and T. Darrell, Head gesture recognition in intelligent interfaces: the role of context in improving recognition, Proc. of the 11th Int. Conf. on Intelligent user interfaces, 2006.

S. Mori, H. Nishida, and H. Yamada, Optical character recognition, 1999.

T. Nukarinen, J. Kangas, O. Špakov, P. Isokoski, D. Akkil et al., Evaluation of HeadTurn: An interaction technique using the gaze and head turns, Proc. of the 9th Nordic Conf. on Human-Computer Interaction, 2016.

I. Oakley and S. O'Modhrain, Tilt to scroll: Evaluating a motion based vibrotactile mobile interface, Eurohaptics Conf. and Symp. on Haptic Interfaces for Virtual Environment and Teleoperator Systems, pp.40-49, IEEE, 2005.

I. Oakley and J. Park, Designing eyes-free interaction, Int. Workshop on Haptic and Audio Interaction Design, 2007.

A. Oh, H. Fox, M. Van Kleek, A. Adler, and K. Gajos, Evaluating look-to-talk: a gaze-aware interface in a collaborative environment, CHI'02 Extended Abstracts on Human Factors in Computing Systems, pp.650-651, 2002.

T. Ohno, Features of eye gaze interface for selection tasks, Proc. of the 3rd Asia Pacific Conf. on Computer Human Interaction, pp.176-181, IEEE, 1998.

C. Patel, A. Patel, and D. Patel, Optical character recognition by open source OCR tool Tesseract: A case study, Int. Journal of Computer Applications, vol.55, issue.10, 2012.

M. Porta, S. Ricotti, and C. Perez, Emotional e-learning through eye tracking, Proc. of the 2012 IEEE Global Engineering Education Conf. (EDUCON), pp.1-6, 2012.
DOI : 10.1109/educon.2012.6201145


M. Porta and M. Turina, Eye-S: a full-screen input modality for pure eye-based communication, Proc. of the 2008 Symp. on Eye tracking research & applications, 2008.

M. Rahman, S. Gustafson, P. Irani, and S. Subramanian, Tilt techniques: investigating the dexterity of wrist-based input, Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, 2009.

C. Rashbass, The relationship between saccadic and smooth tracking eye movements, The Journal of Physiology, vol.159, issue.2, pp.326-338, 1961.

K. Rayner, Eye movements in reading and information processing: 20 years of research, Psychological Bulletin, vol.124, issue.3, 1998.

A. Rebelo, I. Fujinaga, F. Paszkiewicz, A. R. S. Marcal, C. Guedes, and J. S. Cardoso, Optical music recognition: state-of-the-art and open issues, Int. Journal of Multimedia Information Retrieval, vol.1, issue.3, 2012.

A. Reetz, C. Gutwin, T. Stach, M. Nacenta, and S. Subramanian, Superflick: a natural and efficient technique for long-distance object placement on digital tables, Proc. of Graphics Interface, 2006.

K. Ruhland, S. Andrist, J. Badler, C. Peters, N. Badler et al., Look me in the eyes : A survey of eye and gaze animation for virtual agents and artificial systems, Eurographics state-of-the-art report, pp.69-91, 2014.
URL : https://hal.archives-ouvertes.fr/hal-01025241

J. Ruminski, A. Bujnowski, J. Wtorek, A. Andrushevich, M. Biallas et al., Interactions with recognized objects, 7th Int. Conf. on Human System Interactions (HSI), pp.101-105, IEEE, 2014.

J. Schöning, F. Daiber, A. Krüger, and M. Rohs, Using hands and feet to navigate and manipulate spatial data, CHI'09 Extended Abstracts on Human Factors in Computing Systems, 2009.

J. Scott, D. Dearman, K. Yatani, and K. Truong, Sensing foot gestures from the pocket, Proc. of the 23rd annual ACM Symp. on User interface software and technology, 2010.

J. S. Shell, T. Selker, and R. Vertegaal, Interacting with groups of computers, Communications of the ACM, vol.46, issue.3, 2003.

F. Shi, A. G. Gale, and K. Purdy, Direct gaze based environmental controls, The 2nd Conf. on Communication by Gaze Interaction, 2006.

F. Shi, A. G. Gale, and K. Purdy, Eye-centric ICT control, Contemporary Ergonomics: Proc. of the Ergonomics Society Annual Conf., 2006.

L. E. Sibert and R. J. K. Jacob, Evaluation of eye gaze interaction, Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, 2000.

O. Špakov and P. Majaranta, Enhanced gaze interaction using simple head gestures, Proc. of the 2012 ACM Conf. on Ubiquitous Computing, 2012.

D. M. Stampe and E. M. Reingold, Selection by looking: A novel computer interface and its application to psychological research, Studies in Visual Information Processing, vol.6, 1995.

S. Stellmach, S. Stober, A. Nürnberger, and R. Dachselt, Designing gaze-supported multimodal interactions for the exploration of large image collections, Proc. of the 1st Conf. on novel gaze-controlled applications, 2011.

T. Taketomi, H. Uchiyama, and S. Ikeda, Visual SLAM algorithms: A survey from 2010 to 2016, IPSJ Transactions on Computer Vision and Applications, vol.9, issue.1, 2017.


M. Tall, NeoVisus: Gaze driven interface components, Proc. of the 4th Conf. on Communication by Gaze Interaction (COGAIN 2008), 2008.

D. Tan and A. Nijholt, Brain-computer interfaces and human-computer interaction, Brain-Computer Interfaces, 2010.

J. H. ten Kate, E. E. E. Frietman, W. Willems, B. M. ter Haar Romeny, et al., Eye-switch controlled communication aids, Proc. of the 12th Int. Conf. on Medical & Biological Engineering, 1979.

H. M. Tong and R. A. Fisher, Progress report on an eye-slaved area-of-interest visual display, 1984.

T. Toyama, T. Kieninger, F. Shafait, and A. Dengel, Museum Guide 2.0: an eye-tracking based personal assistant for museums and exhibits, Proc. of Int. Conf. on Re-Thinking Technology in Museums, 2011.

K. Tsukada and M. Yasumura, Ubi-Finger: A simple gesture input device for mobile and ubiquitous environment, Journal of Asian Information, vol.2, issue.2, 2004.

J. Turner, A. Bulling, and H. Gellersen, Combining gaze with manual interaction to extend physical reach, Proc. of the 1st Int. workshop on pervasive eye tracking & mobile eye-based interaction, 2011.

R. Vatavu, A comparative study of user-defined handheld vs. freehand gestures for home entertainment environments, Journal of Ambient Intelligence and Smart Environments, vol.5, issue.2, 2013.

E. Velloso, J. Turner, J. Alexander, A. Bulling, and H. Gellersen, An empirical investigation of gaze selection in mid-air gestural 3D manipulation, Human-Computer Interaction, 2015.
URL : https://hal.archives-ouvertes.fr/hal-01599875


E. Velloso, M. Wirth, C. Weichel, A. Esteves, and H. Gellersen, AmbiGaze: Direct control of ambient devices by gaze, Proc. of the 2016 ACM Conf. on Designing Interactive Systems, 2016.

R. Vertegaal, A. Mamuji, C. Sohn, and D. Cheng, Media EyePliances: using eye tracking for remote control focus selection of appliances, CHI'05 Extended Abstracts on Human Factors in Computing Systems, pp.1861-1864, 2005.

M. Vidal, K. Pfeuffer, A. Bulling, and H. Gellersen, Pursuits: eye-based interaction with moving targets, CHI'13 Extended Abstracts on Human Factors in Computing Systems, 2013.

C. Ware and H. H. Mikaelian, An evaluation of an eye tracker as a device for computer input, ACM SIGCHI Bulletin, vol.17, 1987.

J. O. Wobbrock, B. A. Myers, and J. A. Kembel, EdgeWrite: a stylus-based text entry method designed for high accuracy and stability of motion, Proc. of the 16th annual ACM Symp. on User interface software and technology, 2003.

J. O. Wobbrock, J. Rubinstein, M. W. Sawyer, and A. T. Duchowski, Longitudinal evaluation of discrete consecutive gaze gestures for text entry, Proc. of the 2008 Symp. on Eye tracking research & applications, 2008.

T. Yamamoto, M. Tsukamoto, and T. Yoshihisa, Foot-step input method for operating information devices while jogging, Int. Symp. on Applications and the Internet, 2008.

A. L. Yarbus, Eye movements and vision, 1967.

L. R. Young and D. Sheena, Survey of eye movement recording methods, Behavior Research Methods & Instrumentation, vol.7, pp.397-429, 1975.


S. Zhai, C. Morimoto, and S. Ihde, Manual and gaze input cascaded (MAGIC) pointing, Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, 1999.

Y. Zhang, A. Bulling, and H. Gellersen, SideWays: a gaze interface for spontaneous interaction with situated displays, Proc. of the SIGCHI Conf. on Human Factors in Computing Systems, pp.851-860, 2013.

Y. Zhou, T. Xu, B. David, and R. Chalon, Innovative wearable interfaces: an exploratory analysis of paper-based interfaces with camera-glasses device unit, Personal and Ubiquitous Computing, vol.18, 2014.
URL : https://hal.archives-ouvertes.fr/hal-01267053