Comparing Non-Visual and Visual Guidance Methods for Narrow Field of View Augmented Reality Displays

IEEE Trans Vis Comput Graph. 2020 Dec;26(12):3389-3401. doi: 10.1109/TVCG.2020.3023605. Epub 2020 Nov 10.

Abstract

Current augmented reality displays still have a very limited field of view compared to human vision. To localize out-of-view objects, researchers have predominantly explored visual guidance approaches that visualize information in the limited (in-view) screen space. Unfortunately, visual conflicts such as clutter or occlusion of information often arise, which can lead to reduced search performance and decreased awareness of the physical environment. In this paper, we compare a novel non-visual guidance approach based on audio-tactile cues with the state-of-the-art visual guidance technique EyeSee360 for localizing out-of-view objects in augmented reality displays with a limited field of view. In our user study, we evaluate both guidance methods in terms of search performance and situation awareness. We show that although audio-tactile guidance is generally slower than the well-performing EyeSee360 in terms of search times, it is on a par regarding hit rate. Moreover, the audio-tactile method provides a significant improvement in situation awareness compared to the visual approach.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Augmented Reality*
  • Awareness / physiology
  • Computer Graphics
  • Cues*
  • Equipment Design
  • Female
  • Humans
  • Male
  • Middle Aged
  • Task Performance and Analysis
  • User-Computer Interface
  • Virtual Reality*
  • Young Adult