Decoding spatiotemporal features of emotional body language in social interactions

Sci Rep. 2022 Sep 5;12(1):15088. doi: 10.1038/s41598-022-19267-5.

Abstract

How are emotions perceived through human body language in social interactions? This study used point-light displays of human interactions portraying emotional scenes (1) to examine quantitative intrapersonal kinematic and postural body configurations, (2) to calculate interaction-specific parameters of these interactions, and (3) to analyze to what extent both contribute to the perception of an emotion category (i.e. anger, sadness, happiness, or affection) as well as to the perception of emotional valence. Using ANOVA and classification trees, we investigated emotion-specific differences in the calculated parameters. We further applied representational similarity analyses to determine how perceptual ratings relate to intra- and interpersonal features of the observed scene. Results showed that within an interaction, intrapersonal kinematic cues corresponded to emotion category ratings, whereas postural cues reflected valence ratings. Perception of emotion category was also driven by interpersonal orientation, proxemics, the time spent in the personal space of the counterpart, and the motion-energy balance between the interacting people. Furthermore, motion-energy balance and orientation were related to valence ratings. Thus, features of emotional body language are connected with the emotional content of an observed scene, and observers make use of emotionally expressive body language and interpersonal coordination to infer the emotional content of interactions.
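The representational similarity analysis mentioned above can be illustrated with a minimal sketch: build a representational dissimilarity matrix (RDM) from pairwise distances between scenes in rating space and another from distances in feature space, then rank-correlate the two. The data below are randomly generated placeholders, not the study's data, and the dimensions (20 scenes, 4 rating scales, 6 kinematic features) are illustrative assumptions.

```python
import numpy as np
from scipy.spatial.distance import pdist
from scipy.stats import spearmanr

rng = np.random.default_rng(0)
# Hypothetical data: 20 scenes rated on 4 emotion scales,
# and described by 6 intrapersonal kinematic features.
ratings = rng.random((20, 4))
features = rng.random((20, 6))

# Representational dissimilarity matrices, stored as condensed
# vectors of pairwise Euclidean distances between scenes.
rdm_ratings = pdist(ratings, metric="euclidean")
rdm_features = pdist(features, metric="euclidean")

# RSA: Spearman rank correlation between the two RDMs measures how
# similarly the rating space and the feature space organize the scenes.
rho, p = spearmanr(rdm_ratings, rdm_features)
print(f"RSA correlation: rho = {rho:.3f}")
```

With real data, a high positive rho would indicate that scenes perceived as emotionally similar also have similar kinematic profiles, which is the logic behind the rating-feature correspondences reported in the abstract.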

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Anger
  • Emotions*
  • Facial Expression
  • Happiness
  • Humans
  • Kinesics
  • Social Interaction*