Identification of Language-Induced Mental Load from Eye Behaviors in Virtual Reality

Sensors (Basel). 2023 Jul 25;23(15):6667. doi: 10.3390/s23156667.

Abstract

Experiences of virtual reality (VR) can easily break if the method of evaluating subjective user states is intrusive. Behavioral measures are increasingly used to avoid this problem. One such measure is eye tracking, which has recently become more standard in VR and is often used for content-dependent analyses. This research explores content-independent eye metrics, such as pupil size and blinks, for identifying mental load in VR users. We generated mental load independently of visuals through auditory stimuli. We also defined and measured a new eye metric, focus offset, which seeks to capture the phenomenon of "staring into the distance" without focusing on a specific surface. In the experiment, VR-experienced participants listened to two native and two foreign language stimuli inside a virtual phone booth. The results show that with increasing mental load, relative pupil size on average increased 0.512 SDs (0.118 mm), with 57% reduced variance. To a lesser extent, mental load led to fewer fixations, less voluntary gazing at distracting content, and a larger focus offset, as if looking through surfaces (about 0.343 SDs, 5.10 cm). These results agree with previous studies. Overall, we encourage further research on content-independent eye metrics, and we hope that future hardware and algorithms will further increase tracking stability.
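The two metrics the abstract reports in SD units can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the function names and the exact baseline choice are assumptions. Relative pupil size is assumed here to be a per-participant z-score of pupil diameter, and focus offset is assumed to be the distance between the eyes' convergence point along the gaze ray and the nearest gazed surface (positive when looking "through" the surface).

```python
import statistics

def relative_pupil_size(diameters_mm):
    """Z-score pupil diameters against a participant's own recording,
    yielding a content-independent 'relative pupil size' in SD units.
    (Illustrative baseline choice; the paper's normalization may differ.)"""
    mean = statistics.fmean(diameters_mm)
    sd = statistics.stdev(diameters_mm)
    return [(d - mean) / sd for d in diameters_mm]

def focus_offset(gaze_depth_m, surface_depth_m):
    """Focus offset: how far the eyes' convergence point lies behind
    (positive) or in front of (negative) the nearest gazed surface,
    in meters along the gaze ray. (Hypothetical definition for illustration.)"""
    return gaze_depth_m - surface_depth_m
```

Under this sketch, a participant whose convergence point settles several centimeters behind the virtual phone booth's walls would show the positive focus offset the abstract associates with higher mental load.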

Keywords: cognitive load; eye tracking; flow state; language task; listening comprehension; mental load; virtual reality.

MeSH terms

  • Auditory Perception
  • Humans
  • Language
  • Surveys and Questionnaires
  • User-Computer Interface
  • Virtual Reality*

Grants and funding

This research received no external funding.