Temporal decoding of vocal and musical emotions: Same code, different timecourse?

Brain Res. 2020 Aug 15;1741:146887. doi: 10.1016/j.brainres.2020.146887. Epub 2020 May 15.

Abstract

From a baby's cry to a piece of music, we perceive emotions from our auditory environment every day. Many theories propose common neural substrates for the perception of vocal and musical emotions. It has been proposed that, for us to perceive emotions, music recruits emotional circuits that evolved for the processing of biologically relevant vocalizations (e.g., screams, laughs). Although some studies have found similarities between voice and instrumental music in terms of acoustic cues and neural correlates, little is known about their processing timecourse. To further understand how vocal and instrumental emotional sounds are perceived, we used EEG to compare the neural processing timecourse of both stimulus types, expressed at varying degrees of complexity (vocal/musical affect bursts and emotion-embedded speech/music). Vocal stimuli in general, as well as vocal and musical affect bursts, were associated with a more concise sensory trace at initial stages of analysis (smaller N1), although vocal bursts had shorter latencies than musical ones. As for the P2, vocal affect bursts and emotion-embedded musical stimuli were associated with earlier peaks. These results support the idea that emotional vocal stimuli are differentiated early from other sound sources and provide insight into the common neurobiological underpinnings of auditory emotions.
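For readers unfamiliar with how ERP components such as the N1 and P2 are quantified, the sketch below shows one common approach using MNE-Python on a synthetic evoked waveform. The abstract does not report the authors' analysis pipeline, so the channel name, component time windows (N1 ~80-150 ms, P2 ~150-280 ms), and waveform here are illustrative assumptions, not the paper's method.

```python
# Minimal sketch: extracting N1/P2 peak amplitude and latency with MNE-Python.
# All values (channel, windows, waveform) are hypothetical placeholders.
import numpy as np
import mne

# Synthetic single-channel "evoked" response standing in for one condition
# average (e.g., vocal affect bursts): a negative deflection near 100 ms (N1)
# followed by a positive deflection near 200 ms (P2).
sfreq = 500.0  # sampling rate in Hz
times = np.arange(-0.2, 0.6, 1.0 / sfreq)
signal = (-4e-6 * np.exp(-((times - 0.10) ** 2) / (2 * 0.015 ** 2))
          + 6e-6 * np.exp(-((times - 0.20) ** 2) / (2 * 0.025 ** 2)))

info = mne.create_info(['Cz'], sfreq, ch_types='eeg')
evoked = mne.EvokedArray(signal[np.newaxis, :], info, tmin=times[0])

# Peak latency and amplitude within the assumed component windows.
_, n1_lat, n1_amp = evoked.get_peak(tmin=0.08, tmax=0.15, mode='neg',
                                    return_amplitude=True)
_, p2_lat, p2_amp = evoked.get_peak(tmin=0.15, tmax=0.28, mode='pos',
                                    return_amplitude=True)

print(f"N1: {n1_amp * 1e6:.1f} uV at {n1_lat * 1e3:.0f} ms")
print(f"P2: {p2_amp * 1e6:.1f} uV at {p2_lat * 1e3:.0f} ms")
```

Comparing such per-condition peak measures across stimulus types (e.g., vocal vs. musical bursts) is one standard way the amplitude and latency differences described above could be tested statistically.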

Keywords: ERPs; Emotion; Music; Timecourse; Voice.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation / methods*
  • Adult
  • Auditory Perception / physiology*
  • Electroencephalography / methods
  • Emotions / physiology*
  • Female
  • Humans
  • Male
  • Music / psychology*
  • Reaction Time / physiology*
  • Time Factors
  • Voice / physiology*
  • Young Adult