
Augmented reality environments for the interactive exploration of 3D data

Xiyao Wang
AVIZ - Analysis and Visualization, Inria Saclay - Ile de France, LRI - Laboratoire de Recherche en Informatique
Abstract: Exploratory visualization of 3D data is fundamental in many scientific domains. Traditionally, experts use a PC workstation and rely on the mouse and keyboard to interactively adjust the view and observe the data. This setup provides immersion through interaction: users can precisely control the view and the parameters, but it offers no depth cues, which can limit the comprehension of large and complex 3D data. Virtual or augmented reality (V/AR) setups, in contrast, provide visual immersion with stereoscopic views. Although their benefits have been proven, several limitations restrict their application in existing workflows, including high setup/maintenance needs, difficulties of precise control, and, more importantly, the separation from traditional analysis tools. To benefit from both sides, we therefore investigated a hybrid setting that combines an AR environment with a traditional PC to provide both interactive and visual immersion for 3D data exploration. We closely collaborated with particle physicists to understand their general working process and visualization requirements, which motivated our design.
First, building on our observations and discussions with the physicists, we built a prototype that supports fundamental tasks for exploring their datasets. This prototype treated the AR space as an extension of the PC screen and allowed users to freely interact with either using the mouse, so experts could benefit from the visual immersion while still using their analysis tools on the PC. An observational study with 7 physicists at CERN validated the feasibility of such a hybrid setting and confirmed its benefits. We also found that the large canvas of the AR environment and walking around to observe the data in AR had great potential for data exploration. However, the design of mouse interaction in AR and the use of PC widgets in AR needed improvement.
Second, based on the results of the first study, we decided against the intensive use of flat widgets in AR. We then wondered whether using the mouse to navigate in AR is problematic compared to high-degrees-of-freedom (DOF) input, and investigated whether the match or mismatch of dimensionality between input and output devices plays an important role in users' performance. Results of user studies, which compared a mouse, a space mouse, and a tangible tablet, each paired with either the screen or the AR space, did not show that this (mis)match was important. We thus concluded that dimensionality is not a critical factor, suggesting that users are free to choose any input that suits a specific task. Moreover, our results indicated that the mouse remains an efficient tool compared to high-DOF input. This validates our design decision to keep the mouse as the primary input for the hybrid setting, with other modalities serving only as additions for specific use cases.
Next, to support interaction and to preserve contextual information while users walk around to observe the data in AR, we proposed adding a mobile device. We introduced a novel approach that augments tactile interaction with pressure sensing for 3D object manipulation and view navigation. Results showed that this method improved accuracy with limited impact on completion time. We therefore believe it is useful for visualization purposes, where high accuracy is usually required.
Finally, we summarized all of our findings and proposed an envisioned setup for a realistic data exploration scenario that makes use of a PC workstation, an AR headset, and a mobile device. The work presented in this thesis shows the potential of combining a PC workstation with AR environments to improve the process of 3D data exploration and confirms its feasibility, which we hope will inspire future designs that seamlessly bring immersive visualization to existing scientific workflows.
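The abstract does not detail how the pressure-augmented tactile technique works; as a purely illustrative sketch (hypothetical function names and mapping, not the method described in the thesis), one common way pressure sensing can improve manipulation accuracy is to lower the control-display gain as the user presses harder, trading movement speed for precision:

    # Purely illustrative sketch (hypothetical, not taken from the thesis):
    # a firmer press lowers the control-display gain, so the same finger
    # drag produces a smaller, more precise 3D translation.

    def pressure_to_gain(pressure, min_gain=0.1, max_gain=1.0):
        """Map a normalized pressure reading (0..1) to a translation gain."""
        pressure = max(0.0, min(1.0, pressure))
        return max_gain - (max_gain - min_gain) * pressure

    def translate(position, drag_delta, pressure):
        """Apply a touch-drag delta to a 3D position, scaled by the pressure-based gain."""
        gain = pressure_to_gain(pressure)
        return tuple(p + gain * d for p, d in zip(position, drag_delta))

    # A light touch (0.1) moves the object ~0.91 units; a firm press (0.9) only ~0.19 units.
    print(translate((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), pressure=0.1))
    print(translate((0.0, 0.0, 0.0), (1.0, 0.0, 0.0), pressure=0.9))

In this hypothetical mapping, a light touch keeps coarse, fast movement while a firm press slows the object for fine adjustment, which is one plausible way pressure input can raise accuracy without greatly affecting completion time.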
Document type: Theses

https://tel.archives-ouvertes.fr/tel-03170289
Contributor: Abes Star
Submitted on: Tuesday, March 16, 2021 - 9:41:14 AM
Last modification on: Wednesday, April 14, 2021 - 3:40:04 AM

File

94693_WANG_2020_archivage.pdf
Version validated by the jury (STAR)

Identifiers

  • HAL Id: tel-03170289, version 1

Citation

Xiyao Wang. Augmented reality environments for the interactive exploration of 3D data. Human-Computer Interaction [cs.HC]. Université Paris-Saclay, 2020. English. ⟨NNT : 2020UPASG052⟩. ⟨tel-03170289⟩
