Decoding temporal structure in music and speech relies on shared brain resources but elicits different fine-scale spatial patterns

Cereb Cortex. 2011 Jul;21(7):1507-18. doi: 10.1093/cercor/bhq198. Epub 2010 Nov 11.

Abstract

Music and speech are complex sound streams with hierarchical rules of temporal organization that become elaborated over time. Here, we use functional magnetic resonance imaging to measure brain activity patterns in 20 right-handed nonmusicians as they listened to natural and temporally reordered musical and speech stimuli matched for familiarity, emotion, and valence. Heart rate variability and mean respiration rates were measured simultaneously and were found not to differ between musical and speech stimuli. Although the same manipulation of temporal structure elicited activation-level differences of similar magnitude for both music and speech stimuli, multivariate classification analysis revealed distinct spatial patterns of brain responses in the 2 domains. Distributed neuronal populations that included the inferior frontal cortex, the posterior and anterior superior and middle temporal gyri, and the auditory brainstem classified temporal structure manipulations in music and speech with significant levels of accuracy. While agreeing with previous findings that music and speech processing share neural substrates, this work shows that temporal structure in the 2 domains is encoded differently, highlighting a fundamental dissimilarity in how the same neural resources are deployed.

Publication types

  • Randomized Controlled Trial
  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Acoustic Stimulation / methods*
  • Auditory Perception / physiology
  • Brain Mapping / methods
  • Brain Stem / physiology*
  • Frontal Lobe / physiology*
  • Music*
  • Speech / physiology
  • Speech Perception / physiology*
  • Temporal Lobe / physiology*
  • Time Factors