Eye-Tracking Analysis for Emotion Recognition

Comput Intell Neurosci. 2020 Aug 27:2020:2909267. doi: 10.1155/2020/2909267. eCollection 2020.

Abstract

This article reports the results of a study on emotion recognition using eye tracking. Emotions were evoked by presenting dynamic movie material in the form of 21 video fragments. Eye-tracking signals recorded from 30 participants were used to calculate 18 features associated with eye movements (fixations and saccades) and pupil diameter. To ensure that the features were related to emotions, we investigated the influence of the luminance and dynamics of the presented movies. Three classes of emotions were considered: high arousal and low valence, low arousal and moderate valence, and high arousal and high valence. A maximum classification accuracy of 80% was obtained using a support vector machine (SVM) classifier with leave-one-subject-out validation.
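The evaluation protocol described above (an SVM classifier scored with leave-one-subject-out cross-validation over 30 participants) can be sketched as follows. This is a minimal illustration using scikit-learn on synthetic stand-in data; the feature values, trial counts per subject, and SVM hyperparameters are assumptions, not the authors' actual settings.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: 30 subjects x 21 video fragments,
# 18 eye-movement/pupil features per trial, 3 emotion classes.
n_subjects, trials_per_subject, n_features = 30, 21, 18
n_trials = n_subjects * trials_per_subject
X = rng.normal(size=(n_trials, n_features))
y = rng.integers(0, 3, size=n_trials)
groups = np.repeat(np.arange(n_subjects), trials_per_subject)

# RBF kernel and feature standardization are common defaults,
# not necessarily what the study used.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))

# Leave-one-subject-out: each fold holds out all trials of one subject.
scores = cross_val_score(clf, X, y, groups=groups, cv=LeaveOneGroupOut())
print(f"folds: {len(scores)}, mean accuracy: {scores.mean():.3f}")
```

Grouping the folds by subject (rather than plain k-fold) is what makes the estimate subject-independent: no participant's trials appear in both the training and test sets of any fold.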

MeSH terms

  • Arousal
  • Emotions*
  • Eye Movements*
  • Eye-Tracking Technology*
  • Humans
  • Reproducibility of Results
  • Support Vector Machine*