Acoustic richness modulates the neural networks supporting intelligible speech processing

Hear Res. 2016 Mar;333:108-117. doi: 10.1016/j.heares.2015.12.008. Epub 2015 Dec 23.

Abstract

The information contained in a sensory signal plays a critical role in determining what neural processes are engaged. Here we used interleaved silent steady-state (ISSS) functional magnetic resonance imaging (fMRI) to explore how human listeners cope with different degrees of acoustic richness during auditory sentence comprehension. Twenty-six healthy young adults underwent scanning while hearing sentences that varied in acoustic richness (high vs. low spectral detail) and syntactic complexity (subject-relative vs. object-relative center-embedded clause structures). We manipulated acoustic richness by presenting the stimuli either as unprocessed full-spectrum speech or as 24-channel noise-vocoded speech. Importantly, although the vocoded sentences were spectrally impoverished, all sentences were highly intelligible. These manipulations allowed us to test how intelligible speech processing was affected by orthogonal linguistic and acoustic demands. Acoustically rich speech showed stronger activation than acoustically less-detailed speech in a bilateral temporoparietal network, with more pronounced activity in the right hemisphere. By contrast, listening to sentences with greater syntactic complexity resulted in increased activation of a left-lateralized network including left posterior lateral temporal cortex, left inferior frontal gyrus, and left dorsolateral prefrontal cortex. Significant interactions between acoustic richness and syntactic complexity occurred in left supramarginal gyrus, right superior temporal gyrus, and right inferior frontal gyrus, indicating that the regions recruited for syntactic challenge differed as a function of the acoustic properties of the speech. Our findings suggest that the neural systems involved in speech perception are finely tuned to the type of information available, and that reducing the richness of the acoustic signal dramatically alters the brain's response to spoken language, even when intelligibility is high.
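The noise-vocoding manipulation mentioned in the abstract can be sketched in code. The following is an illustrative reconstruction of a standard channel vocoder, not the authors' actual stimulus-processing pipeline; the filter order, frequency range (100-7000 Hz), and envelope-extraction method are assumptions chosen for the sketch.

```python
import numpy as np
from scipy.signal import butter, sosfiltfilt, hilbert

def noise_vocode(signal, fs, n_channels=24, f_lo=100.0, f_hi=7000.0):
    """Noise-vocode a 1-D speech signal.

    The signal is split into log-spaced frequency bands; each band's
    amplitude envelope is extracted (Hilbert transform) and used to
    modulate band-limited noise. Summing the modulated bands preserves
    the temporal envelope in each channel while discarding spectral
    fine structure -- the classic spectral-degradation manipulation.
    """
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)  # band cutoffs
    rng = np.random.default_rng(0)
    noise = rng.standard_normal(len(signal))
    out = np.zeros(len(signal), dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(sos, signal)        # analysis band of the speech
        env = np.abs(hilbert(band))            # amplitude envelope
        carrier = sosfiltfilt(sos, noise)      # band-limited noise carrier
        out += env * carrier
    # match the overall RMS level of the input
    return out * (np.sqrt(np.mean(signal**2)) / np.sqrt(np.mean(out**2)))
```

With 24 channels the envelope information is fine-grained enough that sentence intelligibility remains near ceiling, which is what lets acoustic richness be manipulated orthogonally to intelligibility.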

Keywords: Acoustic; Executive function; Hearing; Language; Listening effort; Speech; Vocoding; fMRI.

Publication types

  • Research Support, N.I.H., Extramural
  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation / methods
  • Acoustics
  • Adult
  • Audiometry, Speech
  • Auditory Pathways / physiology*
  • Brain Mapping / methods
  • Female
  • Humans
  • Magnetic Resonance Imaging
  • Male
  • Nerve Net / physiology*
  • Noise / adverse effects
  • Perceptual Masking
  • Sound Spectrography
  • Speech Acoustics*
  • Speech Intelligibility*
  • Speech Perception*
  • Voice Quality*
  • Young Adult