Uncertainty visualization of gaze estimation to support operator-controlled calibration

J Eye Mov Res. 2018 Jan 25;10(5):10.16910/jemr.10.5.6. doi: 10.16910/jemr.10.5.6.

Abstract

In this paper, we investigate how visualization assets can support the qualitative evaluation of gaze estimation uncertainty. Although eye tracking data are commonly available, little has been done to visually investigate the uncertainty of recorded gaze information. This paper aims to fill this gap through innovative uncertainty computation and visualization. Given a gaze processing pipeline, we estimate the location of the gaze position in the world camera image. To do so, we developed our own gaze data processing, which gives us access to every stage of the data transformation and thus to the uncertainty computation. To validate our gaze estimation pipeline, we designed an experiment with 12 participants and showed that the correction methods we propose reduced the Mean Angular Error by about 1.32 cm when aggregating all 12 participants' results. After correction of the estimated gaze, the Mean Angular Error is 0.25° (SD = 0.15°). Finally, to support the qualitative assessment of these data, we provide a map that encodes the computed uncertainty in the user's point of view.
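To make the reported metric concrete, the sketch below shows one common way to compute a Mean Angular Error between estimated and reference gaze directions represented as 3D vectors. This is a minimal illustration, not the paper's actual pipeline; the function name, the use of NumPy, and the synthetic perturbation used in the example are all assumptions for demonstration purposes.

import numpy as np

def mean_angular_error(estimated, reference):
    """Mean angular error (degrees) between estimated and reference gaze directions.

    Both arguments are (N, 3) arrays of 3D gaze direction vectors; they are
    normalized here so only the direction matters.
    """
    est = estimated / np.linalg.norm(estimated, axis=1, keepdims=True)
    ref = reference / np.linalg.norm(reference, axis=1, keepdims=True)
    # Clip the dot product to avoid NaNs caused by floating-point round-off.
    cos_angles = np.clip(np.sum(est * ref, axis=1), -1.0, 1.0)
    return np.degrees(np.arccos(cos_angles)).mean()

# Illustrative usage with synthetic data: a small random perturbation of the
# reference directions stands in for gaze estimation noise.
rng = np.random.default_rng(0)
reference = rng.normal(size=(100, 3))
estimated = reference + rng.normal(scale=0.005, size=(100, 3))
print(f"Mean Angular Error: {mean_angular_error(estimated, reference):.2f} deg")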

Keywords: accuracy; accuracy improvement; eye movement; eye tracking; gaze estimation; head movement; smooth pursuit; uncertainty; usability.