Inside Speech: Multisensory and Modality-specific Processing of Tongue and Lip Speech Actions

J Cogn Neurosci. 2017 Mar;29(3):448-466. doi: 10.1162/jocn_a_01057. Epub 2016 Oct 19.

Abstract

Action recognition has been found to rely not only on sensory brain areas but also partly on the observer's motor system. However, whether distinct auditory and visual experiences of an action modulate sensorimotor activity remains largely unknown. In the present sparse-sampling fMRI study, we determined to what extent sensory and motor representations interact during the perception of tongue and lip speech actions. Tongue and lip speech actions were selected because our interlocutor's tongue movements are accessible through their impact on speech acoustics but are not visible, given the tongue's position inside the vocal tract, whereas lip movements are both "audible" and visible. Participants were presented with auditory, visual, and audiovisual speech actions, with the visual inputs showing either a sagittal view of a speaker's tongue movements or a facial view of the speaker's lip movements, previously recorded with an ultrasound imaging system and a video camera, respectively. Although the neural networks involved in visuolingual and visuofacial perception largely overlapped, stronger motor and somatosensory activations were observed during visuolingual perception. In contrast, stronger activity was found in auditory and visual cortices during visuofacial perception. Complementing these findings, activity in the left premotor cortex and in visual brain areas correlated with visual recognition scores for visuolingual and visuofacial speech stimuli, respectively, whereas visual activity correlated with reaction times for both types of stimuli. These results suggest that unimodal and multimodal processing of lip and tongue speech actions rely on common sensorimotor brain areas. They also suggest that visual processing of audible but not visible movements induces motor and visual mental simulation of the perceived actions to facilitate recognition and/or to learn the association between auditory and visual signals.

MeSH terms

  • Acoustic Stimulation / methods
  • Adolescent
  • Adult
  • Brain / diagnostic imaging
  • Brain / physiology*
  • Brain Mapping
  • Facial Recognition / physiology*
  • Female
  • Humans
  • Magnetic Resonance Imaging
  • Male
  • Motion Perception / physiology*
  • Neuropsychological Tests
  • Photic Stimulation / methods
  • Reaction Time
  • Social Perception
  • Speech Perception / physiology*
  • Young Adult