Spatial representation of multidimensional information in emotional faces revealed by fMRI

Neuroimage. 2024 Apr 15:290:120578. doi: 10.1016/j.neuroimage.2024.120578. Epub 2024 Mar 16.

Abstract

Face perception is a complex process involving highly specialized procedures and mechanisms. Investigating face perception can help us better understand how the brain processes fine-grained, multidimensional information. This study examined in depth how different dimensions of facial information are represented in specific brain regions, or through inter-regional connections, during an implicit face recognition task. To capture the representation of various facial information in the brain, we applied support vector machine decoding, functional connectivity analysis, and model-based representational similarity analysis to fMRI data, yielding three key findings. First, despite the implicit nature of the task, emotion was still represented in the brain, in contrast to all other facial information. Second, the connection between the medial amygdala and the parahippocampal gyrus proved essential for the representation of facial emotion in implicit tasks. Third, in implicit tasks, arousal was represented in the parahippocampal gyrus, whereas valence depended on the connection between the primary visual cortex and the parahippocampal gyrus. In conclusion, these findings dissociate the neural mechanisms of emotional valence and arousal, revealing the precise spatial patterns of multidimensional information processing in faces.
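The model-based representational similarity analysis named above can be sketched in a few lines: a neural representational dissimilarity matrix (RDM) computed from condition-wise activity patterns is rank-correlated with a model RDM. This is a minimal illustration of the general technique only; the condition count, ROI size, and model values below are synthetic assumptions, not the study's actual stimuli, brain regions, or models.

```python
# Minimal model-based RSA sketch with synthetic data (not the study's data).
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)

n_conditions, n_voxels = 8, 50  # e.g., 8 face conditions, a 50-voxel ROI (hypothetical)

# Hypothetical model RDM: pairwise distances between scalar model values
# (e.g., rated arousal per condition), vectorized upper triangle.
model_values = rng.random((n_conditions, 1))
model_rdm = pdist(model_values)

# Simulated ROI activity patterns: one voxel pattern per condition.
patterns = rng.standard_normal((n_conditions, n_voxels))

# Neural RDM: pairwise correlation distance between condition patterns.
neural_rdm = pdist(patterns, metric="correlation")

# Model-based RSA: Spearman rank correlation between the two RDM vectors.
rho, p = spearmanr(neural_rdm, model_rdm)
```

With 8 conditions, each RDM vector has 8·7/2 = 28 entries; in a real analysis the neural RDM would come from beta estimates per condition within an ROI or searchlight, and the correlation would be tested against a permutation null.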

Keywords: Arousal; Emotion; Face perception; Functional connectivity; RSA; Valence; fMRI.

MeSH terms

  • Brain / diagnostic imaging
  • Brain Mapping / methods
  • Emotions*
  • Facial Expression
  • Humans
  • Magnetic Resonance Imaging*
  • Parahippocampal Gyrus / diagnostic imaging