Embodied Interaction for Data Manipulation Tasks on Wall-sized Displays

Abstract: Large data sets are used increasingly in various professional domains, such as medicine and business. This raises challenges in managing and using them, typically including sense-making, searching and classifying. These tasks not only require advanced algorithms to process the data sets automatically, but also need users' direct interaction to make initial judgments or to correct mistakes made by the machine. This dissertation explores this problem domain and studies users' direct interaction with large, scattered data sets. The human body is made for interacting with the physical world, from the microscopic to very large scales. We naturally coordinate ourselves to see, hear, touch and move in order to interact with the environment at various scales. Beyond the individual, humans collaborate with each other through communication and coordination. Based on Dourish's definition [Dourish, 2001], Embodied Interaction encourages interaction designers to take advantage of users' existing skills in the physical world when designing interaction with digital artefacts. I argue that large interactive spaces enable embodied user interaction with data spread over space, by leveraging users' physical abilities such as walking, approaching and orienting. Beyond single users, co-located environments provide multiple users with physical awareness and verbal and gestural communication. While single users' physical actions have been turned into various input modalities in existing research, the augmentation of between-user resources has been less explored. In this dissertation, I first present an experiment that formally evaluates the advantage of single users performing a data manipulation task on a wall-sized display, compared to a desktop computer. It shows that using users' physical movements to navigate a large data surface outperforms existing digital navigation techniques on a desktop computer, such as Focus+Context. With the same experimental task, I then study the interaction efficiency of collaborative data manipulation on a wall-sized display, in loosely and closely coupled collaboration styles. The experiment measures the effect of providing a Shared Interaction Technique, in which collaborators each perform part of an action to issue a command. The results show its benefits in terms of efficiency, user engagement and physical fatigue. Finally, I explore the concept of augmenting human-to-human interaction with shared interaction techniques, and illustrate a design space of such techniques for supporting collaborative data manipulation. I report the design, implementation and evaluation of a set of these techniques and discuss future work.
Document type :
Theses

Cited literature: 113 references

https://tel.archives-ouvertes.fr/tel-01264670
Contributor : Abes Star
Submitted on : Monday, February 1, 2016 - 2:33:10 PM
Last modification on : Tuesday, April 16, 2019 - 8:57:05 AM
Long-term archiving on : Saturday, November 12, 2016 - 12:02:19 AM

File

73885_LIU_2015_diffusion.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id : tel-01264670, version 2

Citation

Can Liu. Embodied Interaction for Data Manipulation Tasks on Wall-sized Displays. Human-Computer Interaction [cs.HC]. Université Paris-Saclay, 2015. English. ⟨NNT : 2015SACLS207⟩. ⟨tel-01264670v2⟩
