Electrophysiological Evidence for Top-Down Lexical Influences on Early Speech Perception

Psychol Sci. 2019 Jun;30(6):830-841. doi: 10.1177/0956797619841813. Epub 2019 Apr 24.

Abstract

An unresolved issue in speech perception concerns whether top-down linguistic information influences perceptual responses. We addressed this issue using the event-related-potential technique in two experiments that measured cross-modal sequential-semantic priming effects on the auditory N1, an index of acoustic-cue encoding. Participants heard auditory targets (e.g., "potatoes") following associated visual primes (e.g., "MASHED"), neutral visual primes (e.g., "FACE"), or a visual mask (e.g., "XXXX"). Auditory targets began with voiced (/b/, /d/, /g/) or voiceless (/p/, /t/, /k/) stop consonants, an acoustic difference known to yield differences in N1 amplitude. In Experiment 1 (N = 21), semantic context modulated responses to upcoming targets: N1 amplitudes were smaller when targets followed semantically associated primes. In Experiment 2 (N = 29), semantic context changed how listeners encoded the sounds themselves: Targets with ambiguous voice-onset times were encoded similarly to the voicing end point consistent with the associated prime. These results are consistent with an interactive model of spoken-word recognition that includes top-down effects on early perception.

Keywords: event-related potentials; perceptual encoding; semantic priming; speech perception; top-down processing.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Auditory Perception / physiology*
  • Electrophysiological Phenomena
  • Evoked Potentials
  • Female
  • Humans
  • Male
  • Models, Neurological
  • Phonetics
  • Reaction Time
  • Semantics*
  • Speech Perception / physiology*
  • Young Adult