Dissecting abstract, modality-specific and experience-dependent coding of affect in the human brain

Sci Adv. 2024 Mar 8;10(10):eadk6840. doi: 10.1126/sciadv.adk6840. Epub 2024 Mar 8.

Abstract

Emotion and perception are tightly intertwined, as affective experiences often arise from the appraisal of sensory information. Nonetheless, whether the brain encodes emotional instances using a sensory-specific code or in a more abstract manner is unclear. Here, we answer this question by measuring the association between emotion ratings collected during a unisensory or multisensory presentation of a full-length movie and brain activity recorded in typically developed, congenitally blind, and congenitally deaf participants. Emotional instances are encoded in a vast network encompassing sensory, prefrontal, and temporal cortices. Within this network, the ventromedial prefrontal cortex stores a categorical representation of emotion independent of modality and previous sensory experience, and the posterior superior temporal cortex maps the valence dimension using an abstract code. Sensory experience, more than modality, affects how the brain organizes emotional information outside supramodal regions, suggesting the existence of a scaffold for representing emotional states whose functioning is shaped by sensory input during development.

MeSH terms

  • Brain Mapping / methods
  • Brain*
  • Emotions*
  • Humans
  • Magnetic Resonance Imaging
  • Photic Stimulation
  • Prefrontal Cortex