Emognition dataset: emotion recognition with self-reports, facial expressions, and physiology using wearables

Sci Data. 2022 Apr 7;9(1):158. doi: 10.1038/s41597-022-01262-0.

Abstract

The Emognition dataset is dedicated to testing methods for emotion recognition (ER) from physiological responses and facial expressions. We collected data from 43 participants who watched short film clips eliciting nine discrete emotions: amusement, awe, enthusiasm, liking, surprise, anger, disgust, fear, and sadness. Three wearables recorded physiological signals in parallel with upper-body video: EEG, BVP (2x), HR, EDA, SKT, ACC (3x), and GYRO (2x). After each film clip, participants completed two types of self-reports: (1) ratings of the nine discrete emotions and (2) ratings of three affective dimensions: valence, arousal, and motivation. The data facilitate various ER approaches, e.g., multimodal ER, EEG- vs. cardiovascular-based ER, and transitions from discrete to dimensional emotion representations. Technical validation indicated that the film clips elicited the targeted emotions and that the recorded signals were of high quality.
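To make the dataset's per-trial structure concrete, here is a minimal sketch of how one trial (one participant watching one film clip) could be represented. The signal names and the nine-emotion / three-dimension self-report split come from the abstract; the class name, field names, and rating scale are assumptions, not the dataset's actual file format.

```python
from dataclasses import dataclass

# Labels taken from the abstract; everything else below is a hypothetical layout.
DISCRETE_EMOTIONS = ["amusement", "awe", "enthusiasm", "liking", "surprise",
                     "anger", "disgust", "fear", "sadness"]
AFFECTIVE_DIMENSIONS = ["valence", "arousal", "motivation"]


@dataclass
class TrialRecord:
    """One participant's data for one emotion-eliciting film clip (assumed layout)."""
    participant_id: int
    target_emotion: str          # emotion the film clip was chosen to elicit
    signals: dict                # e.g. {"HR": [...], "EDA": [...], "EEG": [...]}
    discrete_ratings: dict       # one self-reported rating per discrete emotion
    dimensional_ratings: dict    # valence, arousal, motivation ratings

    def dominant_emotion(self) -> str:
        """Discrete emotion with the highest self-reported rating."""
        return max(self.discrete_ratings, key=self.discrete_ratings.get)


trial = TrialRecord(
    participant_id=1,
    target_emotion="amusement",
    signals={"HR": [72, 75, 80]},
    discrete_ratings={**{e: 1 for e in DISCRETE_EMOTIONS}, "amusement": 5},
    dimensional_ratings={"valence": 4, "arousal": 3, "motivation": 3},
)
print(trial.dominant_emotion())  # amusement
```

A structure like this supports the validation check described in the abstract: comparing each clip's target emotion against the participant's highest-rated discrete emotion.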

Publication types

  • Dataset

MeSH terms

  • Anger
  • Emotions* / physiology
  • Facial Expression*
  • Humans
  • Sadness / psychology
  • Self Report