Development of a Universal Validation Protocol and an Open-Source Database for Multi-Contextual Facial Expression Recognition

Sensors (Basel). 2023 Oct 10;23(20):8376. doi: 10.3390/s23208376.

Abstract

Facial expression recognition (FER) poses a complex challenge due to factors such as variations in facial morphology, lighting conditions, and cultural nuances in how emotions are expressed. To address these hurdles, FER algorithms leverage advanced data analysis to infer emotional states from facial expressions. In this study, we introduce a universal validation methodology that assesses the performance of any FER algorithm through a web application in which subjects respond to emotive images. We also present FeelPix, a labelled database of facial landmark coordinates generated during the validation of a FER algorithm. FeelPix is available for training and testing generic FER algorithms that identify users' facial expressions. To verify the database's reliability, we built a test algorithm that classifies emotions from FeelPix data; designed as a computationally lightweight solution, it is well suited to online systems. Our contribution improves facial expression recognition by enabling the identification and interpretation of the emotions associated with facial expressions, offering insight into individuals' emotional reactions, with implications for healthcare, security, human-computer interaction, and entertainment.
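As a rough illustration of the pipeline the abstract describes, the sketch below trains a computationally lightweight classifier on landmark coordinates and evaluates it on a held-out split. The file name feelpix.csv, the column layout (coordinate columns plus an "emotion" label column), and the k-nearest-neighbours model are illustrative assumptions, not the authors' published code or the actual FeelPix schema.

```python
# Minimal sketch, assuming a CSV of per-face landmark coordinates with an
# "emotion" label column. File name, columns, and classifier choice are
# hypothetical; the paper's actual data format and model may differ.
import pandas as pd
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import classification_report

# One row per face: landmark (x, y) coordinate columns + an emotion label.
data = pd.read_csv("feelpix.csv")
X = data.drop(columns=["emotion"]).values  # landmark coordinates as features
y = data["emotion"].values                 # e.g. "happy", "sad", "angry", ...

# Stratified split so every emotion class appears in both train and test sets.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=42
)

# A small, fast model keeps inference cheap enough for online systems.
clf = make_pipeline(StandardScaler(), KNeighborsClassifier(n_neighbors=5))
clf.fit(X_train, y_train)

# Per-class precision/recall gives a quick reliability check of the labels.
print(classification_report(y_test, clf.predict(X_test)))
```

A shallow model over precomputed landmarks, rather than a deep network over raw pixels, is one plausible way to meet the "computationally lightweight" requirement the abstract mentions for online use.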

Keywords: affective computing; facial expression recognition; facial landmarks; labelled data database; machine-learning algorithms.

MeSH terms

  • Emotions
  • Face
  • Facial Expression
  • Facial Recognition*
  • Humans
  • Reproducibility of Results

Grants and funding

This research received no external funding.