Visual perception of facial expressions of emotion

Curr Opin Psychol. 2017 Oct;17:27-33. doi: 10.1016/j.copsyc.2017.06.009. Epub 2017 Jun 21.

Abstract

Facial expressions of emotion are produced by contracting and relaxing facial muscles. I hypothesize that the human visual system solves the inverse problem of production: to interpret an emotion, it attempts to identify the underlying muscle activations. I show converging computational, behavioral, and imaging evidence in favor of this hypothesis. I detail the computations the human visual system performs to decode these facial actions and identify a brain region where these computations likely take place. The resulting computational model explains how humans readily classify emotions both into categories and along continuous variables. This model also predicts the existence of a large number of previously unknown facial expressions, including compound emotions, affect attributes, and mental states that people use regularly. I provide evidence in favor of this prediction.
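To make the two-stage idea in the abstract concrete, here is a minimal sketch, not the author's published model: a hypothetical action-unit (AU) detector returns the set of active AUs for a face image, and a category is read out by matching that set against prototypical AU combinations. The AU prototypes and the `classify_emotion` read-out below are illustrative assumptions loosely inspired by the compound-emotion literature, not the paper's exact specification.

```python
from typing import Dict, Set

# Illustrative AU prototypes (assumed, not the published definitions):
# each emotion category is characterized by a set of active action units.
PROTOTYPES: Dict[str, Set[int]] = {
    "happiness":         {12, 25},
    "sadness":           {4, 15},
    "surprise":          {1, 2, 25, 26},
    "happily_surprised": {1, 2, 12, 25},  # a compound emotion
}

def classify_emotion(active_aus: Set[int]) -> str:
    """Return the category whose AU prototype best matches the detected AUs.

    Jaccard similarity stands in for the model's category read-out; a graded
    similarity score like this also yields the continuous variables mentioned
    in the abstract.
    """
    def jaccard(a: Set[int], b: Set[int]) -> float:
        return len(a & b) / len(a | b) if (a | b) else 0.0

    return max(PROTOTYPES, key=lambda name: jaccard(active_aus, PROTOTYPES[name]))

if __name__ == "__main__":
    # Pretend an upstream AU detector returned these activations for one image.
    detected = {1, 2, 12, 25}
    print(classify_emotion(detected))  # -> happily_surprised
```

In this sketch the "inverse problem" is solved upstream by the assumed AU detector; the point of the read-out is that compound categories such as happily surprised fall out naturally once recognition operates on muscle activations rather than on holistic templates.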

Publication types

  • Review

MeSH terms

  • Brain / diagnostic imaging
  • Brain / physiology*
  • Computer Simulation
  • Emotions* / physiology
  • Facial Expression
  • Facial Recognition / physiology*
  • Humans
  • Models, Neurological
  • Models, Psychological