Mismatch negativity (MMN) to speech sounds is modulated systematically by manual grip execution

Neurosci Lett. 2017 Jun 9;651:237-241. doi: 10.1016/j.neulet.2017.05.024. Epub 2017 May 11.

Abstract

Manual actions and speech are connected: for example, grip execution can influence simultaneous vocalizations and vice versa. Our previous studies show that the consonant [k] is associated with the power grip and the consonant [t] with the precision grip. Here we studied whether the interaction between speech sounds and grips could already operate at a pre-attentive stage of auditory processing, as reflected by the mismatch negativity (MMN) component of the event-related potential (ERP). Participants executed power and precision grips according to visual cues while listening to syllable sequences consisting of [ke] and [te] utterances. The grips modulated the MMN amplitudes to these syllables systematically: when the deviant was [ke], the MMN response was larger with a precision grip than with a power grip, and there was a converse trend when the deviant was [te]. These results suggest that manual gestures and speech can interact already at a pre-attentive level of auditory processing, and show, for the first time, that manual actions can systematically modulate the MMN.
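For readers unfamiliar with the MMN, it is conventionally obtained as the deviant-minus-standard difference wave of the trial-averaged ERPs, with amplitude measured in a post-stimulus latency window. The abstract does not specify the authors' analysis parameters, so the following is only a minimal NumPy sketch of that generic difference-wave logic; the simulated epoch arrays, single fronto-central channel, sampling rate, and 150-250 ms window are illustrative assumptions, not details from the study.

```python
import numpy as np

# Hypothetical epoched EEG data for one fronto-central channel (e.g., Fz).
# Shapes: (n_trials, n_samples); values in volts.
sfreq = 500.0                                  # assumed sampling rate in Hz
times = np.arange(-0.1, 0.5, 1.0 / sfreq)      # epoch from -100 ms to +500 ms

rng = np.random.default_rng(0)
standard_epochs = rng.normal(0, 1e-6, (400, times.size))  # frequent standard trials
deviant_epochs = rng.normal(0, 1e-6, (60, times.size))    # rare deviant trials

# ERPs are the trial averages; the MMN is the deviant-minus-standard difference wave.
erp_standard = standard_epochs.mean(axis=0)
erp_deviant = deviant_epochs.mean(axis=0)
mmn_wave = erp_deviant - erp_standard

# Mean MMN amplitude in an assumed 150-250 ms latency window.
window = (times >= 0.150) & (times <= 0.250)
mmn_amplitude = mmn_wave[window].mean()
print(f"Mean MMN amplitude (150-250 ms): {mmn_amplitude * 1e6:.2f} µV")
```

Comparing such amplitudes between the power-grip and precision-grip conditions, separately for [ke]- and [te]-deviant sequences, corresponds to the contrast described in the abstract.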

Keywords: Action; Action-perception; Gestures; MMN; Speech.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Adult
  • Cerebral Cortex / physiology*
  • Electroencephalography
  • Evoked Potentials*
  • Female
  • Hand Strength*
  • Humans
  • Male
  • Phonetics*
  • Psychomotor Performance*
  • Speech Perception / physiology*
  • Young Adult