Building and validation of a set of facial expression images to detect emotions: a transcultural study

Psychol Res. 2022 Sep;86(6):1996-2006. doi: 10.1007/s00426-021-01605-3. Epub 2021 Oct 15.

Abstract

Automatic emotion recognition from facial expressions has become an exceptional tool in research involving human subjects, making it possible to obtain objective measurements of participants' emotional states. Several software packages and commercial solutions are available for this task. However, adaptation to cultural context and the recognition of complex expressions and/or emotions are two of the main challenges these solutions face. Here, we describe the construction and validation of a set of facial expression images suitable for training a recognition system. Our datasets consist of images of people with no acting experience who were recorded with a webcam as they performed a computer-assisted task in a room with a light background and overhead illumination. The six basic emotions and mockery were included, and a combination of the OpenCV, Dlib, and Scikit-learn Python libraries was used to develop a support vector machine classifier. The code is available on GitHub and the images will be provided upon request. Since this study used transcultural facial expressions to evaluate complex emotions and relied on open-source solutions, we strongly believe that our dataset will be useful in different research contexts.
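The abstract names the libraries (OpenCV, Dlib, Scikit-learn) but not the implementation; the authors' actual code is on GitHub. The following is only a minimal, hypothetical sketch of such a pipeline, assuming Dlib's 68-point landmark model is used for feature extraction and a linear-kernel SVM for classification. The model file name, feature normalization, and helper names are illustrative assumptions, not the authors' method.

```python
# Hypothetical sketch: Dlib landmark features + scikit-learn SVM classifier.
# File names, feature design, and kernel choice are assumptions.
import cv2
import dlib
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split

detector = dlib.get_frontal_face_detector()
# Pre-trained 68-point landmark model distributed with Dlib (assumed here).
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def landmark_features(image_path):
    """Return a flat vector of 68 (x, y) landmarks, or None if no face is found."""
    img = cv2.imread(image_path)
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    faces = detector(gray, 1)
    if not faces:
        return None
    shape = predictor(gray, faces[0])
    coords = np.array([[shape.part(i).x, shape.part(i).y] for i in range(68)],
                      dtype=float)
    # Normalize coordinates to the face bounding box for rough scale invariance.
    rect = faces[0]
    coords[:, 0] = (coords[:, 0] - rect.left()) / max(rect.width(), 1)
    coords[:, 1] = (coords[:, 1] - rect.top()) / max(rect.height(), 1)
    return coords.flatten()

def train_classifier(image_paths, labels):
    """image_paths and labels (six basic emotions plus 'mockery') are placeholders."""
    X, y = [], []
    for path, label in zip(image_paths, labels):
        feats = landmark_features(path)
        if feats is not None:
            X.append(feats)
            y.append(label)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=0)
    clf = SVC(kernel="linear")  # kernel choice is an assumption
    clf.fit(X_train, y_train)
    print("Held-out accuracy:", clf.score(X_test, y_test))
    return clf
```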

MeSH terms

  • Cross-Cultural Comparison*
  • Emotions
  • Facial Expression*
  • Humans
  • Photic Stimulation / methods
  • Recognition, Psychology