Frequency-specific directed interactions between whole-brain regions during sentence processing using multimodal stimulus

Neurosci Lett. 2023 Aug 24:812:137409. doi: 10.1016/j.neulet.2023.137409. Epub 2023 Jul 23.

Abstract

Neural oscillations subserve a broad range of speech-processing and language-comprehension functions. Using electroencephalography (EEG), we investigated frequency-specific directed interactions between whole-brain regions while participants processed Chinese sentences presented in different modalities (auditory, visual, and audio-visual). The results indicate that low-frequency responses reflect the aggregation of information flow into the primary sensory cortices of the respective modalities. Information flow dominated by high-frequency responses showed a bottom-up pattern, from left posterior temporal to left frontal regions. The network pattern of top-down information flow out of the left frontal lobe was jointly dominated by low- and high-frequency rhythms. Overall, our results suggest that the brain may be modality-independent when processing higher-order linguistic information.
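The analysis summarized above rests on frequency-resolved directed connectivity (see the Granger causality keyword). The Python sketch below is a minimal, illustrative reimplementation of one standard way to obtain such measures: Geweke's frequency-domain Granger causality derived from a fitted bivariate vector autoregressive (VAR) model. It is not the authors' pipeline; the sampling rate, model order, frequency bands, and the surrogate "one channel drives the other" coupling are assumptions made purely for illustration.

    # Hedged sketch: Geweke frequency-domain Granger causality between two
    # EEG channels, estimated from a bivariate VAR model. Illustrative only;
    # parameters and band definitions are assumptions, not from the paper.
    import numpy as np
    from statsmodels.tsa.api import VAR

    def spectral_granger(data, fs, order=10, n_freqs=256):
        """Frequency-domain Granger causality for a 2-channel signal.

        data : (n_samples, 2) array, columns = [x, y]
        Returns freqs, gc_y_to_x, gc_x_to_y (each of length n_freqs).
        """
        res = VAR(data).fit(order)
        A = res.coefs          # (order, 2, 2); A[k] is the lag-(k+1) coefficient matrix
        Sigma = res.sigma_u    # (2, 2) residual covariance

        freqs = np.linspace(0.0, fs / 2.0, n_freqs)
        gc_yx = np.zeros(n_freqs)
        gc_xy = np.zeros(n_freqs)
        I2 = np.eye(2)

        for i, f in enumerate(freqs):
            # Transfer function H(f) = (I - sum_k A_k exp(-2*pi*i*f*k/fs))^(-1)
            Af = I2 - sum(A[k] * np.exp(-2j * np.pi * f * (k + 1) / fs)
                          for k in range(A.shape[0]))
            H = np.linalg.inv(Af)
            S = H @ Sigma @ H.conj().T            # spectral matrix (scale cancels in the ratio)

            # Geweke's normalisation: remove the instantaneous noise correlation
            H00_t = H[0, 0] + (Sigma[0, 1] / Sigma[0, 0]) * H[0, 1]
            H11_t = H[1, 1] + (Sigma[0, 1] / Sigma[1, 1]) * H[1, 0]

            gc_yx[i] = np.log(S[0, 0].real / (np.abs(H00_t) ** 2 * Sigma[0, 0]))
            gc_xy[i] = np.log(S[1, 1].real / (np.abs(H11_t) ** 2 * Sigma[1, 1]))
        return freqs, gc_yx, gc_xy

    if __name__ == "__main__":
        # Toy surrogate data: channel y (a ~40 Hz AR(2) oscillator) drives channel x
        # with a one-sample delay, standing in for two EEG electrodes.
        rng = np.random.default_rng(0)
        fs, n = 250.0, 5000
        x = np.zeros(n)
        y = np.zeros(n)
        for t in range(2, n):
            y[t] = 0.96 * y[t - 1] - 0.81 * y[t - 2] + rng.standard_normal()
            x[t] = 0.50 * x[t - 1] + 0.40 * y[t - 1] + rng.standard_normal()
        freqs, gc_yx, gc_xy = spectral_granger(np.column_stack([x, y]), fs, order=5)

        # Band-averaged directed influence in example low/high bands (assumed bands).
        for name, lo, hi in [("theta (4-8 Hz)", 4, 8), ("gamma (30-45 Hz)", 30, 45)]:
            band = (freqs >= lo) & (freqs <= hi)
            print(f"{name}: y->x = {gc_yx[band].mean():.3f}, x->y = {gc_xy[band].mean():.3f}")

In this toy run the y->x values should clearly exceed x->y, indicating a dominant directed influence from y to x. Applying such pairwise, band-averaged estimates across all electrode or source pairs yields the kind of frequency-specific, whole-brain directed network examined in the study.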

Keywords: Audio-visual integration; Brain network; EEG; Granger causality; Neural oscillations.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Brain / physiology
  • Brain Mapping / methods
  • Comprehension* / physiology
  • Frontal Lobe / physiology
  • Humans
  • Language
  • Magnetic Resonance Imaging
  • Speech Perception* / physiology