Semantically congruent audiovisual integration with modal-based attention accelerates auditory short-term memory retrieval

Atten Percept Psychophys. 2022 Jul;84(5):1625-1634. doi: 10.3758/s13414-021-02437-4. Epub 2022 May 31.

Abstract

Evidence has shown that the benefits of multisensory integration for unisensory perception performance are asymmetric: auditory perception gains more from multisensory input, especially when the attention focus is directed toward a task-irrelevant visual stimulus. At present, it remains unclear whether the benefits of semantically (in)congruent multisensory integration with modal-based attention for subsequent unisensory short-term memory (STM) retrieval are also asymmetric. Using a delayed matching-to-sample paradigm, the present study investigated this issue by manipulating the attention focus during multisensory memory encoding. The results revealed that both visual and auditory STM retrieval reaction times were faster under semantically congruent multisensory memory encoding conditions than under unisensory ones. We suggest that the formation of a coherent multisensory representation may be optimized by restricted multisensory encoding and can be rapidly triggered by subsequent unisensory memory retrieval demands. Crucially, auditory STM retrieval was exclusively accelerated by semantically congruent multisensory memory encoding, indicating that the less effective sensory modality of memory retrieval relies more heavily on the prior formation of a coherent multisensory representation optimized by modal-based attention.

Keywords: Audiovisual integration; Modal-based attention; Semantic congruency; Short-term memory.

MeSH terms

  • Acoustic Stimulation / methods
  • Attention
  • Auditory Perception
  • Humans
  • Memory, Short-Term*
  • Photic Stimulation / methods
  • Reaction Time
  • Visual Perception*