An investigation on the feasibility of uncalibrated and unconstrained gaze tracking for human assistive applications by using head pose estimation

Sensors (Basel). 2014 May 12;14(5):8363-79. doi: 10.3390/s140508363.

Abstract

This paper investigates the possibility of accurately detecting and tracking human gaze using an unconstrained, noninvasive approach based on head pose information extracted by an RGB-D device. The main advantages of the proposed solution are that it operates in a totally unconstrained environment, requires no initial calibration and works in real time. These features make it suitable for assisting humans in everyday life (e.g., remote device control) or in specific activities (e.g., rehabilitation), and in general in all applications where user cooperation cannot be assumed (e.g., when users with neurological impairments are involved). To evaluate gaze estimation accuracy, the proposed approach was extensively tested and the results were compared with those of leading state-of-the-art methods, which generally rely on strong constraints on people's movements, invasive or additional hardware, and supervised pattern recognition modules. Experimental tests demonstrated that, in most cases, the gaze estimation errors are comparable to those of state-of-the-art methods, even though the proposed approach works without additional constraints, calibration or supervised learning.
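
The underlying idea, approximating gaze by the head orientation reported by an RGB-D head tracker and intersecting that direction with a target surface, can be illustrated with a minimal sketch. The code below is not the authors' implementation; the coordinate convention, the yaw/pitch parametrization and the fixed screen plane are assumptions made purely for illustration.

    # Minimal illustrative sketch (assumed geometry, not the paper's method):
    # treat the head-pose direction as the gaze ray and intersect it with a
    # known screen plane to obtain a point of regard.
    import numpy as np

    def head_pose_to_ray(head_position_mm, yaw_deg, pitch_deg):
        """Build a gaze ray from head position and yaw/pitch angles.

        Assumed convention: camera-centered coordinates in mm, x right,
        y down, z from the camera toward the user; the screen lies in the
        camera plane z = 0, so a user facing the screen looks along -z.
        """
        yaw, pitch = np.radians([yaw_deg, pitch_deg])
        direction = np.array([
            np.sin(yaw) * np.cos(pitch),    # left/right component
            -np.sin(pitch),                 # up/down component (y points down)
            -np.cos(yaw) * np.cos(pitch),   # toward the screen plane
        ])
        return np.asarray(head_position_mm, dtype=float), direction

    def intersect_screen(origin, direction, screen_z_mm=0.0):
        """Intersect the gaze ray with the screen plane z = screen_z_mm."""
        if abs(direction[2]) < 1e-6:
            return None  # ray parallel to the screen plane
        t = (screen_z_mm - origin[2]) / direction[2]
        if t < 0:
            return None  # screen is behind the user
        hit = origin + t * direction
        return hit[:2]   # (x, y) point of regard on the screen, in mm

    if __name__ == "__main__":
        # Example: head 600 mm from the sensor, turned 10 deg right, 5 deg down.
        origin, direction = head_pose_to_ray([0.0, 0.0, 600.0], yaw_deg=10.0, pitch_deg=-5.0)
        print(intersect_screen(origin, direction))

Because only head orientation is used, no per-user calibration is needed; the trade-off is that eye-in-head rotation is ignored, which is consistent with the paper's focus on head pose as a gaze proxy.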

Publication types

  • Evaluation Study

MeSH terms

  • Colorimetry / instrumentation
  • Colorimetry / methods
  • Equipment Design
  • Equipment Failure Analysis
  • Eye Movements / physiology*
  • Feasibility Studies
  • Fixation, Ocular / physiology*
  • Head Movements / physiology*
  • Humans
  • Image Interpretation, Computer-Assisted / instrumentation
  • Image Interpretation, Computer-Assisted / methods*
  • Imaging, Three-Dimensional / instrumentation
  • Imaging, Three-Dimensional / methods*
  • Monitoring, Ambulatory / instrumentation
  • Monitoring, Ambulatory / methods*
  • Pattern Recognition, Automated / methods
  • Posture / physiology
  • Reproducibility of Results
  • Self-Help Devices*
  • Sensitivity and Specificity