An event-related potential comparison of facial expression processing between cartoon and real faces

PLoS One. 2019 Jan 10;14(1):e0198868. doi: 10.1371/journal.pone.0198868. eCollection 2019.

Abstract

Faces play an important role in the social lives of humans. Besides real faces, people also encounter numerous cartoon faces in daily life, which convey basic emotional states through facial expressions. Using event-related potentials (ERPs), we conducted a facial expression recognition experiment with 17 university students to compare the processing of cartoon faces with that of real faces. The study used face type (real vs. cartoon), emotion valence (happy vs. angry), and participant gender (male vs. female) as independent variables. Reaction time, recognition accuracy, and the amplitudes and latencies of emotion processing-related ERP components, namely the N170, the vertex positive potential (VPP), and the late positive potential (LPP), served as dependent variables. The ERP results revealed that cartoon faces elicited larger N170 and VPP amplitudes and a shorter N170 latency than real faces, whereas real faces elicited larger LPP amplitudes than cartoon faces. In addition, the results showed a significant difference between brain regions, reflected in a right-hemispheric advantage. The behavioral results showed that reaction times for happy faces were shorter than those for angry faces, that females were more accurate than males, and that males recognized angry faces more accurately than happy faces. Given the limited sample size, these results suggest, but do not rigorously demonstrate, differences in facial expression recognition and neural processing between cartoon and real faces. Cartoon faces were processed with greater intensity and speed than real faces during the early processing stage, whereas more attentional resources were allocated to real faces during the late processing stage.

Publication types

  • Clinical Trial
  • Comparative Study
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Attention / physiology*
  • Emotions / physiology*
  • Evoked Potentials / physiology*
  • Facial Expression*
  • Facial Recognition / physiology*
  • Female
  • Humans
  • Male
  • Reaction Time / physiology*

Grants and funding

This research was supported by a grant from the National Natural Science Foundation of China [31371058 to Yifang Wang]. The funders had a role in data collection and in preparation of the manuscript.