High trait anxiety enhances optimal integration of auditory and visual threat cues

J Behav Ther Exp Psychiatry. 2022 Mar:74:101693. doi: 10.1016/j.jbtep.2021.101693. Epub 2021 Sep 15.

Abstract

Background: Emotion perception is essential to human interaction and relies on effective integration of emotional cues across sensory modalities. Despite initial evidence for anxiety-related biases in multisensory processing of emotional information, there is no research to date that directly addresses whether the mechanism of multisensory integration is altered by anxiety. Here, we compared audiovisual integration of emotional cues between individuals with low vs. high trait anxiety.

Methods: Participants were 62 young adults who were assessed on their ability to quickly and accurately identify happy, angry and sad emotions from dynamic visual-only, audio-only and audiovisual face and voice displays.

Results: The results revealed that individuals in the high anxiety group were more likely than low anxiety individuals to integrate angry faces and voices in a statistically optimal fashion, as predicted by the Maximum Likelihood Estimation (MLE) model. That is, high anxiety individuals recognised anger more precisely from angry audiovisual stimuli than from angry face-only or voice-only stimuli, and more precisely than low anxiety individuals did.
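The MLE model referenced above predicts that an optimal observer weights each unimodal cue by its reliability (inverse variance), so the predicted audiovisual variance is the product of the unimodal variances divided by their sum. A minimal sketch of that textbook prediction, with hypothetical threshold values (this is illustrative, not the authors' analysis code):

```python
# Standard MLE (optimal cue integration) prediction:
#   var_AV = (var_A * var_V) / (var_A + var_V)
# i.e. bimodal precision (1/variance) is the sum of unimodal precisions,
# so the predicted audiovisual threshold is never worse than the better
# unimodal threshold.

def mle_predicted_sd(sd_audio: float, sd_visual: float) -> float:
    """Predicted audiovisual discrimination threshold (SD) under optimal integration."""
    var_a, var_v = sd_audio ** 2, sd_visual ** 2
    return (var_a * var_v / (var_a + var_v)) ** 0.5

# Hypothetical unimodal thresholds (arbitrary units).
sd_a, sd_v = 2.0, 2.0
predicted = mle_predicted_sd(sd_a, sd_v)
# With equally reliable cues, optimal integration improves the
# threshold by a factor of sqrt(2): 2.0 -> ~1.414.
assert predicted <= min(sd_a, sd_v)
print(round(predicted, 3))  # 1.414
```

Empirically, integration is deemed "optimal" when the measured audiovisual threshold matches this prediction rather than merely matching the best unimodal threshold.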

Limitations: We tested a higher proportion of females than males, and although this does reflect the higher prevalence of clinical anxiety among females in the general population, potential sex differences in anxiety-related multisensory mechanisms should be examined in future studies.

Conclusions: Individuals with high trait anxiety have multisensory mechanisms that are especially fine-tuned for processing threat-related emotions. This bias may exhaust capacity for processing other emotional stimuli and lead to overly negative evaluations of social interactions.

Keywords: Anxiety; Emotion; Maximum likelihood estimation; Multisensory processing; Negative bias; Race model.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Anger
  • Anxiety / psychology
  • Cues*
  • Emotions
  • Facial Expression
  • Female
  • Humans
  • Male
  • Voice*
  • Young Adult