Perception of Human Interaction Based on Motion Trajectories: From Aerial Videos to Decontextualized Animations

Top Cogn Sci. 2018 Jan;10(1):225-241. doi: 10.1111/tops.12313. Epub 2017 Dec 7.

Abstract

People are adept at perceiving interactions from movements of simple shapes, but the underlying mechanism remains unknown. Previous studies have typically used object movements defined by experimenters. The present study used aerial videos recorded by drones in a real-life environment to generate decontextualized motion stimuli, so that the motion trajectories of the displayed elements were the only visual input. We measured human judgments of interactiveness between two moving elements and the dynamic change in such judgments over time. A hierarchical model was developed to account for human performance in this task. The model represents interactivity using latent variables and learns the distribution of critical movement features that signal potential interactivity. The model provides a good fit to human judgments and generalizes to the original Heider and Simmel (1944) animations. It can also synthesize decontextualized animations with a controlled degree of interactiveness, providing a viable tool for studying animacy and social perception.
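The abstract does not specify the model's features or inference procedure. As a loose illustration of the general idea only (a latent "interactive vs. non-interactive" state inferred over time from pairwise motion features), the following Python sketch is offered under stated assumptions: the feature set (inter-agent distance, relative speed, heading alignment), the two-state hidden Markov structure, and all parameter values are hypothetical placeholders, not the authors' model.

```python
import numpy as np

def motion_features(traj_a, traj_b):
    """Per-frame pairwise features from two (T, 2) position arrays.

    The specific features (distance, relative speed, heading alignment)
    are illustrative assumptions, not the features reported in the paper.
    """
    vel_a = np.diff(traj_a, axis=0)        # frame-to-frame velocity of A
    vel_b = np.diff(traj_b, axis=0)        # frame-to-frame velocity of B
    rel = traj_b[1:] - traj_a[1:]          # vector pointing from A to B
    dist = np.linalg.norm(rel, axis=1)     # inter-agent distance
    rel_speed = np.linalg.norm(vel_a - vel_b, axis=1)
    eps = 1e-8
    # Heading alignment: cosine between A's velocity and the A->B direction
    align = np.sum(vel_a * rel, axis=1) / (
        np.linalg.norm(vel_a, axis=1) * dist + eps)
    return np.stack([dist, rel_speed, align], axis=1)   # shape (T-1, 3)

def interactivity_posterior(features, means, variances, trans, prior):
    """Forward-algorithm posterior over a binary latent 'interactive' state.

    A toy two-state HMM with diagonal-Gaussian emissions; all parameters
    are hypothetical. Returns P(interactive | features up to t) per frame.
    """
    def emit(x):
        # Unnormalized diagonal-Gaussian likelihood for each latent state
        return np.exp(-0.5 * np.sum((x - means) ** 2 / variances, axis=1))

    alpha = prior * emit(features[0])
    alpha /= alpha.sum()
    posterior = [alpha[1]]
    for x in features[1:]:
        alpha = emit(x) * (trans.T @ alpha)   # predict, then update
        alpha /= alpha.sum()
        posterior.append(alpha[1])
    return np.array(posterior)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Agent A wanders; agent B loosely tracks A (a crude "following" pattern)
    traj_a = np.cumsum(rng.normal(0, 1, (100, 2)), axis=0)
    traj_b = traj_a * 0.8 + rng.normal(0, 2, (100, 2))
    feats = motion_features(traj_a, traj_b)

    means = np.array([[20.0, 3.0, 0.0],      # state 0: non-interactive
                      [5.0, 1.0, 0.8]])      # state 1: interactive
    variances = np.array([[100.0, 4.0, 0.5],
                          [25.0, 1.0, 0.2]])
    trans = np.array([[0.95, 0.05],
                      [0.05, 0.95]])         # sticky state transitions
    prior = np.array([0.5, 0.5])

    p_interactive = interactivity_posterior(feats, means, variances, trans, prior)
    print(p_interactive[:10].round(3))
```

The frame-by-frame posterior mirrors the dynamic judgments measured in the study, but the actual hierarchical model learns its feature distributions from data rather than using the fixed parameters assumed here.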

Keywords: Action understanding; Decontextualized animation; Hierarchical model; Motion; Social interaction.

Publication types

  • Research Support, Non-U.S. Gov't
  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Adult
  • Female
  • Humans
  • Interpersonal Relations*
  • Male
  • Models, Theoretical*
  • Motion Perception / physiology*
  • Pattern Recognition, Visual / physiology*
  • Psychomotor Performance / physiology*
  • Young Adult