Visual Analytics for Mobile Eye Tracking

IEEE Trans Vis Comput Graph. 2017 Jan;23(1):301-310. doi: 10.1109/TVCG.2016.2598695.

Abstract

The analysis of eye tracking data often requires the annotation of areas of interest (AOIs) to derive semantic interpretations of human viewing behavior during experiments. This annotation is typically the most time-consuming step of the analysis process. For data from wearable eye tracking glasses in particular, every independently recorded video has to be annotated individually, and corresponding AOIs have to be identified across videos. We provide a novel visual analytics approach that eases this annotation process through image-based, automatic clustering of eye tracking data integrated into an interactive labeling and analysis system. Annotation and analysis are tightly coupled by multiple linked views that allow for a direct interpretation of the labeled data in the context of the recorded video stimuli. The components of our analytics environment were developed with a user-centered design approach in close cooperation with an eye tracking expert. We demonstrate our approach on eye tracking data from a real experiment and compare it to an analysis of the same data by manual annotation of dynamic AOIs. Furthermore, we conducted an expert user study with six external eye tracking researchers to collect feedback and identify the analysis strategies they used while working with our application.
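The core idea of image-based clustering for AOI annotation, as described in the abstract, can be illustrated with a minimal sketch. The paper does not specify the clustering algorithm here, so the following is an assumption: each fixation is represented by a feature vector computed from the image patch around the gaze point (e.g., a tiny color histogram), and similar fixations are grouped by a simple greedy online clustering so the analyst only needs to label each cluster once instead of every fixation. All names (`cluster_fixations`, the example feature vectors) are hypothetical, not from the paper.

```python
import math

def euclid(a, b):
    # Euclidean distance between two feature vectors
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def cluster_fixations(features, threshold=0.2):
    """Greedy online clustering: assign each fixation's feature vector to
    the nearest existing cluster if its centroid is within `threshold`,
    otherwise start a new cluster. Returns one cluster label per fixation.
    (Hypothetical stand-in for the paper's image-based clustering step.)"""
    centroids, counts, labels = [], [], []
    for f in features:
        best, best_d = None, threshold
        for i, c in enumerate(centroids):
            d = euclid(f, c)
            if d < best_d:
                best, best_d = i, d
        if best is None:
            # no cluster close enough: open a new one
            centroids.append(list(f))
            counts.append(1)
            labels.append(len(centroids) - 1)
        else:
            # fold the fixation into the cluster's running-mean centroid
            counts[best] += 1
            centroids[best] = [(c * (counts[best] - 1) + x) / counts[best]
                               for c, x in zip(centroids[best], f)]
            labels.append(best)
    return labels

# Hypothetical per-fixation image features (e.g. 2-bin colour histograms):
feats = [(0.9, 0.1), (0.88, 0.12), (0.1, 0.9), (0.12, 0.88), (0.89, 0.11)]
print(cluster_fixations(feats))  # → [0, 0, 1, 1, 0]
```

In an actual pipeline the feature vectors would come from the video frames of the eye tracking glasses, and the analyst would then attach one semantic AOI label per cluster in the interactive labeling view.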

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Computer Graphics*
  • Eye Movements / physiology*
  • Humans
  • Image Processing, Computer-Assisted / methods*
  • Video Recording*