Social-affective features drive human representations of observed actions

eLife. 2022 May 24;11:e75027. doi: 10.7554/eLife.75027.

Abstract

Humans observe actions performed by others in many different visual and social settings. What features do we extract and attend to when we view such complex scenes, and how are they processed in the brain? To answer these questions, we curated two large-scale sets of naturalistic videos of everyday actions and estimated their perceived similarity in two behavioral experiments. We normed and quantified a large range of visual, action-related, and social-affective features across the stimulus sets. Using a cross-validated variance partitioning analysis, we found that social-affective features predicted similarity judgments better than, and independently of, visual and action features in both behavioral experiments. Next, we conducted an electroencephalography experiment, which revealed a sustained correlation between neural responses to videos and their behavioral similarity. Visual, action, and social-affective features predicted neural patterns at early, intermediate, and late stages, respectively, during this behaviorally relevant time window. Together, these findings show that social-affective features are important for perceiving naturalistic actions and are extracted at the final stage of a temporal gradient in the brain.

Keywords: action perception; behavioral similarity; human; neuroscience; temporal dynamics.

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Brain Mapping*
  • Brain* / physiology
  • Electroencephalography
  • Humans
  • Judgment / physiology
  • Photic Stimulation
  • Visual Perception / physiology

Grants and funding

The funders had no role in study design, data collection and interpretation, or the decision to submit the work for publication.