Eye-gaze information input based on pupillary response to visual stimulus with luminance modulation

PLoS One. 2020 Jan 9;15(1):e0226991. doi: 10.1371/journal.pone.0226991. eCollection 2020.

Abstract

This study develops an information-input interface in which the visual stimulus targeted by a user's eye gaze is identified from the pupillary light reflex to periodic luminance modulation of the object. Experiment 1 examines how pupil size changes in response to periodic luminance modulation of visual stimuli, and the results are used to develop an algorithm for information input. Experiment 2a examines the effectiveness of interfaces with two objects. The results demonstrate that the gaze-targeted object can be identified with 98% accuracy if the luminance modulation frequencies of the two objects differ by at least 0.12 Hz. Experiment 2b examines the accuracy of a gaze-directed information-input method based on a keyboard configuration with twelve responses. The results reveal that keyboard input is possible with an average accuracy of 85% for luminance modulation frequencies from 0.75 to 2.75 Hz. The proposed pupillometry-based information-input interface offers several advantages: a low burden on users, minimal invasiveness, no need for training or experience, high theoretical validity, and no need to calibrate the eye's position, which makes it well suited to practical use. Additionally, this method has potential for the design of interfaces that allow patients with severely limited motor function to communicate with others.
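To make the detection principle concrete, the following is a minimal sketch (not the authors' published algorithm) of how a gaze target could be identified from a pupil-diameter trace when each candidate object flickers at its own luminance-modulation frequency. The function name identify_target, the 60 Hz sampling rate, the sinusoidal modulation, and the lock-in (Fourier-projection) detector are all illustrative assumptions; the abstract only states that targets are identified from the pupillary light reflex to periodic luminance modulation.

import numpy as np

def identify_target(pupil, fs, freqs):
    """Return the index of the candidate frequency with the strongest
    pupil oscillation, estimated by lock-in (Fourier) amplitude.
    pupil: 1-D pupil-diameter samples; fs: sampling rate in Hz;
    freqs: candidate luminance-modulation frequencies in Hz."""
    t = np.arange(len(pupil)) / fs
    x = pupil - np.mean(pupil)  # remove DC component (baseline pupil size)
    amps = []
    for f in freqs:
        # Project the trace onto sine and cosine at f; the magnitude is
        # the amplitude of the pupil oscillation at that frequency,
        # regardless of phase (the light reflex is antiphase to luminance).
        c = np.dot(x, np.cos(2 * np.pi * f * t))
        s = np.dot(x, np.sin(2 * np.pi * f * t))
        amps.append(np.hypot(c, s))
    return int(np.argmax(amps))

# Toy demo: a simulated pupil trace entrained at 1.25 Hz with noise is
# distinguished from a hypothetical competing object 0.12 Hz away.
fs, dur = 60.0, 10.0  # assumed 60 Hz eye tracker, 10 s analysis window
t = np.arange(int(fs * dur)) / fs
pupil = 4.0 - 0.1 * np.sin(2 * np.pi * 1.25 * t) \
        + 0.02 * np.random.randn(t.size)
print(identify_target(pupil, fs, [1.13, 1.25]))  # expected output: 1

The same scheme extends to the twelve-key configuration by assigning each key one frequency in the 0.75 to 2.75 Hz range reported in the abstract and taking the argmax over all twelve candidates.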

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Communication
  • Female
  • Fixation, Ocular / physiology*
  • Humans
  • Light
  • Male
  • Photic Stimulation / methods*
  • Reflex, Pupillary*
  • User-Computer Interface
  • Young Adult

Grants and funding

This research was supported by the Adaptable and Seamless Technology Transfer Program through Target-driven R&D (A-STEP) from the Japan Science and Technology Agency (JST).