A Visual Cortical Network for Deriving Phonological Information from Intelligible Lip Movements

Curr Biol. 2018 May 7;28(9):1453-1459.e3. doi: 10.1016/j.cub.2018.03.044. Epub 2018 Apr 19.

Abstract

Successful lip-reading requires a mapping from visual to phonological information [1]. Recently, visual and motor cortices have been implicated in tracking lip movements (e.g., [2]). It remains unclear, however, whether visuo-phonological mapping already occurs at the level of the visual cortex, that is, whether this structure tracks the acoustic signal in a functionally relevant manner. To elucidate this, we investigated how the cortex tracks (i.e., entrains to) absent acoustic speech signals carried by silent lip movements. Crucially, we contrasted entrainment to unheard forward (intelligible) and backward (unintelligible) acoustic speech. We observed that the visual cortex exhibited stronger entrainment to the unheard forward acoustic speech envelope than to the unheard backward acoustic speech envelope. Supporting the notion of a visuo-phonological mapping process, this forward-backward difference in occipital entrainment was not present for the actually observed lip movements. Importantly, the respective occipital region received more top-down input, especially from left premotor, primary motor, and somatosensory regions and, to a lesser extent, from posterior temporal cortex. Strikingly, across participants, the extent of top-down modulation of the visual cortex stemming from these regions partially correlated with the strength of entrainment to the absent forward acoustic speech envelope, but not to the actually present forward lip movements. Our findings demonstrate that a distributed cortical network, including key dorsal-stream auditory regions [3-5], influences the visual cortex's sensitivity to the intelligibility of speech while it tracks silent lip movements.

Keywords: dual-stream speech processing; lip-brain coherence; magnetoencephalography; speech entrainment; speech envelope; speech-brain coherence.
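The entrainment measures named in the abstract and keywords (speech-brain and lip-brain coherence) are typically quantified as spectral coherence between the acoustic speech envelope (or the lip-aperture time course) and a brain signal. The following is a minimal illustrative sketch, not the authors' actual analysis pipeline: the signal names, sampling rates, and frequency band are assumptions chosen for illustration, and the data are synthetic placeholders.

```python
# Sketch of speech-brain coherence (assumed names/parameters, synthetic data):
#  - `audio`: mono speech waveform at fs_audio (assumed 44100 Hz)
#  - `meg`: one MEG sensor/source time course at the analysis rate fs (assumed 150 Hz)
import numpy as np
from scipy.signal import hilbert, resample, coherence

fs_audio = 44100   # audio sampling rate (assumption)
fs = 150           # common analysis sampling rate (assumption)

def speech_envelope(audio, fs_audio, fs):
    """Broadband amplitude envelope via the Hilbert transform, downsampled to fs."""
    env = np.abs(hilbert(audio))
    n_out = int(len(env) * fs / fs_audio)
    return resample(env, n_out)

def speech_brain_coherence(envelope, brain, fs, fmin=1.0, fmax=7.0):
    """Coherence between envelope and brain signal within a low-frequency band
    (roughly the syllable/phrase-rate range often used in entrainment studies)."""
    n = min(len(envelope), len(brain))
    f, coh = coherence(envelope[:n], brain[:n], fs=fs, nperseg=int(2 * fs))
    band = (f >= fmin) & (f <= fmax)
    return f[band], coh[band]

# Usage with synthetic placeholders standing in for real recordings:
rng = np.random.default_rng(0)
audio = rng.standard_normal(fs_audio * 60)   # 60 s of "speech"
meg = rng.standard_normal(fs * 60)           # 60 s of one brain time course
env = speech_envelope(audio, fs_audio, fs)
freqs, coh = speech_brain_coherence(env, meg, fs)
print(coh.mean())
```

The same coherence computation can be applied with a lip-aperture time course in place of the acoustic envelope to obtain lip-brain coherence.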

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Adult
  • Auditory Cortex / physiology
  • Brain Mapping
  • Female
  • Humans
  • Lip
  • Lipreading
  • Magnetoencephalography / methods
  • Male
  • Motor Cortex / physiology
  • Movement
  • Phonetics
  • Speech / physiology*
  • Speech Intelligibility / physiology
  • Speech Perception / physiology*
  • Visual Cortex / physiology*