Cortical encoding of acoustic and linguistic rhythms in spoken narratives

eLife. 2020 Dec 21;9:e60433. doi: 10.7554/eLife.60433.

Abstract

Speech contains rich acoustic and linguistic information. Using highly controlled speech materials, previous studies have demonstrated that cortical activity is synchronized to the rhythms of perceived linguistic units, for example, words and phrases, on top of basic acoustic features, for example, the speech envelope. It remains unclear, however, how cortical activity jointly encodes acoustic and linguistic information when listeners hear natural speech. Here we investigate the neural encoding of words using electroencephalography and observe neural activity synchronized to multi-syllabic words when participants naturally listen to narratives. An amplitude modulation (AM) cue for word rhythm enhances the word-level response, but the effect is observed only during passive listening. Furthermore, words and the AM cue are encoded by spatially separable neural responses that are differentially modulated by attention. These results suggest that bottom-up acoustic cues and top-down linguistic knowledge separately contribute to the cortical encoding of linguistic units in spoken narratives.

Keywords: attention; frequency tagging; human; language; neuroscience; rhythm; speech envelope; spoken narratives.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Adult
  • Cerebral Cortex / physiology*
  • Comprehension / physiology*
  • Electroencephalography
  • Female
  • Humans
  • Language
  • Male
  • Speech Perception / physiology*
  • Young Adult