The effect of topic familiarity and volatility of auditory scene on selective auditory attention

Hear Res. 2023 Jun;433:108770. doi: 10.1016/j.heares.2023.108770. Epub 2023 Apr 16.

Abstract

Selective auditory attention has been shown to modulate the cortical representation of speech, an effect that is well documented in acoustically challenging environments. However, the influence of top-down factors, in particular topic familiarity, on this process remains unclear, despite evidence that semantic information can promote speech-in-noise perception. Moreover, beyond the individual features that form a static listening condition, dynamic and irregular changes of the auditory scene (volatile listening environments) have received little study. To address these gaps, we used electroencephalography to examine how topic familiarity and listening volatility influence selective auditory attention during dichotic listening. When stories with unfamiliar topics were presented, participants' comprehension was severely degraded; nevertheless, their cortical activity still selectively tracked the speech of the target story. This implies that topic familiarity has little influence on the speech-tracking neural index, at least when bottom-up information is sufficient. In contrast, when the listening environment was volatile and listeners had to re-engage with new speech whenever the auditory scene changed, the neural correlates of the attended speech were degraded. In particular, the cortical response to the attended speech and the spatial asymmetry of the responses to left and right attention were significantly attenuated around 100-200 ms after speech onset. These findings suggest that volatile listening environments can weaken the modulatory effect of selective attention, possibly by hampering attentional engagement through increased perceptual load.

Keywords: Auditory attention detection (AAD); Electroencephalography (EEG); Listening volatility; Neural decoding; Selective auditory attention; Topic familiarity.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Attention / physiology
  • Auditory Perception*
  • Electroencephalography
  • Hearing
  • Humans
  • Speech Perception* / physiology