Faces in context: modulation of expression processing by situational information

Soc Neurosci. 2013;8(6):601-20. doi: 10.1080/17470919.2013.834842. Epub 2013 Sep 23.

Abstract

Numerous studies using the event-related potential (ERP) technique have found that emotional expressions modulate ERP components that appear at different post-stimulus latencies and index different stages of face processing. With the aim of studying the time course of the integration of context and facial expression information, we investigated whether these modulations are sensitive to the situational context in which emotional expressions are perceived. Participants were asked to identify the expression of target faces presented immediately after reading short sentences that described happiness- or anger-inducing situations. The main manipulation was the congruency between the emotional content of the sentences and the target expression. Context-independent amplitude modulation of the N170 and N400 components by emotional expression was observed. In contrast, context effects appeared on a later component (the late positive potential, or LPP), with enhanced amplitudes on incongruent trials. These results show that the early stages of face processing, at which emotional expressions are encoded, are not sensitive to verbal information about the situation in which the expressions appear. The timing of the context congruency effects suggests that the integration of facial expressions with situational information occurs at a later stage, probably related to the detection of affective congruency.
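To make the congruency analysis described above concrete, the sketch below shows one way such an LPP comparison could be computed with MNE-Python. It is purely illustrative and not the authors' analysis pipeline: the file name, the event labels "congruent" and "incongruent", the Pz electrode, and the 400-600 ms window are all assumptions, not details taken from the abstract.

```python
# Illustrative sketch only; the paper does not publish analysis code.
# Assumes epoched EEG data with hypothetical event labels
# "congruent" and "incongruent" and a posterior electrode of interest.
import mne

# Load epoched EEG data (hypothetical file name).
epochs = mne.read_epochs("faces_in_context-epo.fif")

# Average separately for context-congruent and context-incongruent target faces.
evoked_cong = epochs["congruent"].average()
evoked_incong = epochs["incongruent"].average()

def mean_amplitude(evoked, tmin=0.4, tmax=0.6, ch="Pz"):
    """Mean amplitude (in microvolts) at one channel within a time window.

    The 400-600 ms window and the Pz site are assumed here as a typical
    LPP measurement choice, not values reported in the abstract.
    """
    picked = evoked.copy().pick([ch]).crop(tmin, tmax)
    return picked.data.mean() * 1e6  # convert volts to microvolts

print("LPP congruent   (uV):", mean_amplitude(evoked_cong))
print("LPP incongruent (uV):", mean_amplitude(evoked_incong))
```

Under this kind of analysis, the reported result corresponds to a larger LPP value on incongruent than on congruent trials, whereas an analogous measurement in earlier N170 and N400 windows would show expression effects that do not differ by congruency.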

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adolescent
  • Brain / physiology*
  • Electroencephalography
  • Evoked Potentials, Visual / physiology*
  • Face
  • Facial Expression*
  • Female
  • Humans
  • Male
  • Pattern Recognition, Visual / physiology*
  • Photic Stimulation
  • Signal Processing, Computer-Assisted
  • Young Adult