Semantic integration of differently asynchronous audio-visual information in videos of real-world events in cognitive processing: an ERP study

Neurosci Lett. 2011 Jul 1;498(1):84-8. doi: 10.1016/j.neulet.2011.04.068. Epub 2011 May 5.

Abstract

In the real world, much of the auditory and visual information received by the human brain is temporally asynchronous. How is such information integrated during cognitive processing in the brain? In this study, we used the event-related potential (ERP) method to investigate the semantic integration of temporally asynchronous audio-visual information. Subjects were presented with videos of real-world events in which the auditory and visual information were temporally asynchronous. When the critical action preceded the sound, sounds incongruous with the preceding critical action elicited an N400 effect relative to the congruous condition. This result demonstrates that the semantic contextual integration indexed by the N400 also applies to the cognitive processing of multisensory information. In addition, the N400 effect occurred earlier than those reported in studies of visually induced N400s, suggesting that the processing of cross-modal information is temporally facilitated relative to visual information presented in isolation. When the sound preceded the critical action, a larger late positive wave (P600) was observed in the incongruous condition than in the congruous condition. This P600 may reflect a reanalysis process in which the mismatch between the critical action and the preceding sound was evaluated, indicating that environmental sounds can affect the cognitive processing of a visual event.
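For readers unfamiliar with the condition-contrast logic behind the N400 and P600 effects described above, the sketch below illustrates how such a contrast is typically computed from epoched EEG data using the open-source MNE-Python library. The paper does not report its analysis pipeline, so the file name, trigger codes, epoch limits, and time window here are illustrative assumptions, not the authors' actual parameters.

```python
# Minimal sketch of an ERP condition contrast (incongruous vs. congruous),
# assuming a hypothetical recording and trigger scheme.
import mne

# Hypothetical continuous EEG file; events mark sound onset in each clip.
raw = mne.io.read_raw_fif("sub01_raw.fif", preload=True)
events = mne.find_events(raw)

# Assumed trigger codes: 1 = sound congruous with the critical action,
# 2 = sound incongruous with it.
event_id = {"congruous": 1, "incongruous": 2}
epochs = mne.Epochs(
    raw, events, event_id=event_id,
    tmin=-0.2, tmax=1.0,            # 200 ms baseline, 1 s post-onset
    baseline=(None, 0), preload=True,
)

# Average across trials within each condition to obtain the ERPs.
evoked_cong = epochs["congruous"].average()
evoked_incong = epochs["incongruous"].average()

# The N400 (and P600) effects are difference waves:
# incongruous minus congruous.
diff = mne.combine_evoked([evoked_incong, evoked_cong], weights=[1, -1])

# Mean amplitude in an assumed N400 window (300-500 ms); a more negative
# difference indexes the N400 effect, while a later positive-going
# difference would index a P600-like effect.
n400_window = diff.copy().crop(tmin=0.30, tmax=0.50)
print("Mean N400-window difference (V):", n400_window.data.mean())
```

In practice, such difference waves are usually evaluated statistically across subjects and electrode sites; the single-subject mean amplitude printed here is only meant to show where the reported effects come from in the data.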

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Acoustic Stimulation
  • Auditory Perception / physiology*
  • Brain / physiology*
  • Brain Mapping*
  • Electroencephalography
  • Evoked Potentials / physiology*
  • Female
  • Humans
  • Male
  • Photic Stimulation
  • Visual Perception / physiology*
  • Young Adult