Validation of dynamic virtual faces for facial affect recognition

PLoS One. 2021 Jan 25;16(1):e0246001. doi: 10.1371/journal.pone.0246001. eCollection 2021.

Abstract

The ability to recognise facial emotions is essential for successful social interaction. The most common stimuli used when evaluating this ability are photographs. Although such stimuli have proved valid, they do not offer the level of realism that virtual humans have achieved. The objective of the present paper is to validate a new set of dynamic virtual faces (DVFs) that mimic the six basic emotions plus the neutral expression. The faces can be displayed with low or high dynamism, and from front or side views. For this purpose, 204 healthy participants, stratified by gender, age and education level, were recruited, and their facial affect recognition was assessed with the set of DVFs. Response accuracy was compared with that obtained on the already validated Penn Emotion Recognition Test (ER-40). Overall accuracy in identifying emotions was higher for the DVFs (88.25%) than for the ER-40 faces (82.60%). The hit rate for each DVF emotion was high, especially for the neutral expression and happiness. No statistically significant differences were found by gender, or between younger adults and adults over 60 years. Moreover, hit rates increased for avatar faces displayed with greater dynamism, and for front views of the DVFs compared with their profile presentations. These results indicate that DVFs are as valid as standardised natural faces for accurately recreating human-like facial expressions of emotions.
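The accuracy measure reported above is a per-emotion hit rate (the fraction of trials in which the response matches the target emotion) plus an overall accuracy pooled across emotions. The sketch below illustrates how such rates could be computed; the data layout and names such as `dvf_trials` are hypothetical illustrations, not the authors' actual analysis pipeline.

```python
# Minimal sketch of the hit-rate computation described in the abstract.
# Assumes per-trial records of (target emotion, participant's response);
# this layout and all variable names are illustrative assumptions.
from collections import defaultdict

def hit_rates(trials):
    """Per-emotion hit rate: fraction of trials where the response
    matches the target emotion."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for target, response in trials:
        totals[target] += 1
        if response == target:
            hits[target] += 1
    return {emo: hits[emo] / totals[emo] for emo in totals}

def overall_accuracy(trials):
    """Hits over total trials, pooled across all emotions."""
    return sum(resp == tgt for tgt, resp in trials) / len(trials)

# Toy example trials (not real study data):
dvf_trials = [("happiness", "happiness"), ("fear", "surprise"),
              ("neutral", "neutral"), ("anger", "anger")]
er40_trials = [("happiness", "happiness"), ("fear", "fear"),
               ("neutral", "sadness"), ("anger", "anger")]

print("DVF per-emotion:  ", hit_rates(dvf_trials))
print("ER-40 per-emotion:", hit_rates(er40_trials))
print(f"Overall DVF: {overall_accuracy(dvf_trials):.2%}, "
      f"ER-40: {overall_accuracy(er40_trials):.2%}")
```

On real data, the overall figures would correspond to the 88.25% (DVF) versus 82.60% (ER-40) reported in the abstract; the group comparisons (gender, age) would additionally require a significance test, which is omitted here.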

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Aged
  • Emotions / physiology
  • Facial Expression*
  • Facial Recognition / physiology*
  • Female
  • Humans
  • Male
  • Middle Aged
  • Photic Stimulation
  • Reaction Time / physiology
  • Social Interaction*
  • Social Perception*
  • Young Adult

Grants and funding

This work was partially supported by grants from the Spanish Ministerio de Ciencia, Innovación y Universidades, Agencia Estatal de Investigación / European Regional Development Funds (PID2020-115220RB-C21 and EQC2019-006063-P), the Biomedical Research Networking Center in Mental Health (CIBERSAM) of the Instituto de Salud Carlos III, the Instituto de Salud Carlos III (PI16/00359), and the Madrid Regional Government (S2017/BMD-3740). The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.