Task-Dependent Visual Behavior in Immersive Environments: A Comparative Study of Free Exploration, Memory and Visual Search

IEEE Trans Vis Comput Graph. 2023 Nov;29(11):4417-4425. doi: 10.1109/TVCG.2023.3320259. Epub 2023 Nov 2.

Abstract

Visual behavior depends on both bottom-up mechanisms, where gaze is driven by the visual conspicuity of the stimuli, and top-down mechanisms, which guide attention toward relevant areas based on the viewer's task or goal. Although this interplay is well known, visual attention models often focus on bottom-up mechanisms alone. Existing works have analyzed the effect of high-level cognitive tasks such as memory or visual search on visual behavior; however, they have often done so with different stimuli, methodologies, metrics, and participants, which makes drawing conclusions and comparisons between tasks particularly difficult. In this work we present a systematic study of how different cognitive tasks affect visual behavior, using a novel within-subjects design. Participants performed free exploration, memory, and visual search tasks in three different scenes while their eye and head movements were recorded. We found significant, consistent differences between tasks in the distributions of fixations, saccades, and head movements. Our findings can provide insights for practitioners and content creators designing task-oriented immersive applications.
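The abstract does not specify how fixations and saccades were separated from the raw gaze recordings. A common approach in eye-tracking analysis (not necessarily the one used in this paper) is velocity-threshold identification (I-VT), which labels an inter-sample interval as a saccade when angular gaze velocity exceeds a threshold, typically around 30 deg/s. A minimal sketch, assuming a one-dimensional trace of gaze angles with timestamps:

```python
def classify_ivt(angles_deg, timestamps_s, velocity_threshold=30.0):
    """Velocity-threshold (I-VT) classification of gaze samples.

    angles_deg: gaze angles in degrees (1-D trace, for illustration)
    timestamps_s: sample times in seconds, same length as angles_deg
    Returns one label per inter-sample interval: "fixation" or "saccade".
    """
    labels = []
    for i in range(1, len(angles_deg)):
        dt = timestamps_s[i] - timestamps_s[i - 1]
        dtheta = abs(angles_deg[i] - angles_deg[i - 1])  # angular distance, deg
        velocity = dtheta / dt  # angular velocity, deg/s
        labels.append("saccade" if velocity > velocity_threshold else "fixation")
    return labels


# Hypothetical 100 Hz trace: a small drift, a rapid 5-degree jump, then drift.
labels = classify_ivt(
    angles_deg=[0.0, 0.1, 5.1, 5.2],
    timestamps_s=[0.00, 0.01, 0.02, 0.03],
)
print(labels)  # → ['fixation', 'saccade', 'fixation']
```

In practice, real immersive setups use two-dimensional (or 3-D) gaze directions, and head movement must be compensated or analyzed separately, as the study's head-movement metrics suggest.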

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Computer Graphics*
  • Eye Movements*
  • Fixation, Ocular
  • Head Movements
  • Humans
  • Saccades
  • Visual Perception