Haptic and visual information speed up the neural processing of auditory speech in live dyadic interactions

Neuropsychologia. 2014 May;57:71-7. doi: 10.1016/j.neuropsychologia.2014.02.004. Epub 2014 Feb 11.

Abstract

Speech can be perceived not only by the ear and by the eye but also by the hand, with speech gestures felt from manual tactile contact with the speaker's face. In the present electro-encephalographic study, early cross-modal interactions were investigated by comparing auditory evoked potentials during auditory, audio-visual and audio-haptic speech perception in dyadic interactions between a listener and a speaker. In line with previous studies, early auditory evoked responses were attenuated and speeded up during audio-visual compared to auditory speech perception. Crucially, shortened latencies of early auditory evoked potentials were also observed during audio-haptic speech perception. Altogether, these results suggest early bimodal interactions during live face-to-face and hand-to-face speech perception in dyadic interactions.

Keywords: Audio–haptic speech perception; Audio–visual speech perception; EEG; Multisensory interactions.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Acoustics
  • Adult
  • Analysis of Variance
  • Brain / physiology*
  • Brain Mapping*
  • Electroencephalography
  • Evoked Potentials / physiology
  • Female
  • Humans
  • Male
  • Middle Aged
  • Photic Stimulation
  • Reaction Time
  • Speech Perception / physiology*
  • Touch Perception / physiology*
  • Touch*
  • Visual Perception / physiology*
  • Young Adult