Multimodal emotion recognition by combining physiological signals and facial expressions: a preliminary study

Annu Int Conf IEEE Eng Med Biol Soc. 2012;2012:5238-41. doi: 10.1109/EMBC.2012.6347175.

Abstract

Lately, multimodal approaches to automatic emotion recognition have gained significant scientific interest. In this paper, emotion recognition by combining physiological signals and facial expressions was studied. Heart rate variability parameters, respiration frequency, and facial expressions were used to classify a person's emotions while watching pictures with emotional content. Three classes were used for both valence and arousal. The preliminary results show that, over the proposed channels, detecting arousal seems to be easier than detecting valence. While a classification accuracy of 54.5% was attained for arousal, only 38.0% of the samples were classified correctly in terms of valence. In the future, additional modalities as well as feature selection will be utilized to improve the results.
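
The sketch below illustrates the kind of feature-level fusion the abstract describes: heart rate variability (HRV) parameters, respiration frequency, and facial-expression features are concatenated and passed to a three-class classifier for arousal (or, equivalently, valence). The abstract does not name the classifier, the exact feature set, or the dataset, so the synthetic data, feature dimensionalities, and the linear SVM used here are illustrative assumptions only, not the authors' method.

```python
# Hypothetical sketch of multimodal feature fusion for 3-class arousal
# classification. All data and model choices are assumptions for illustration.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_samples = 90  # e.g., 30 picture presentations per class (assumption)

# Synthetic stand-ins for the three modalities (assumed dimensionalities):
hrv = rng.normal(size=(n_samples, 5))    # e.g., SDNN, RMSSD, LF, HF, LF/HF power
resp = rng.normal(size=(n_samples, 1))   # respiration frequency
face = rng.normal(size=(n_samples, 10))  # facial-expression descriptors

X = np.hstack([hrv, resp, face])           # feature-level fusion of modalities
y = rng.integers(0, 3, size=n_samples)     # three classes: low / medium / high arousal

# Standardize features, then classify with a linear SVM (assumed classifier)
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated accuracy: {scores.mean():.3f}")
```

With random synthetic features the accuracy hovers around chance (about 33% for three classes); the point of the sketch is only to show how the three channels can be combined into a single feature vector before classification.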

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Electrocardiography / methods*
  • Emotions / physiology*
  • Facial Expression*
  • Heart Rate / physiology*
  • Humans
  • Image Interpretation, Computer-Assisted / methods*
  • Pattern Recognition, Automated / methods*
  • Pilot Projects
  • Psychometrics / methods*
  • Reproducibility of Results
  • Sensitivity and Specificity