Markerless Mouse Tracking for Social Experiments

eNeuro. 2024 Feb 27;11(2):ENEURO.0154-22.2023. doi: 10.1523/ENEURO.0154-22.2023. Print 2024 Feb.

Abstract

Automated quantification of behavior in socially interacting animals requires accurate tracking. Although many existing methods are successful and generalize well across settings, mistaken identities and lost information on key anatomical features remain common problems, and alleviating them typically demands additional human effort in training or post-processing. We propose a markerless, video-based tool that simultaneously tracks two interacting mice of identical appearance in controlled settings, improving tracking accuracy without added human effort and enabling quantification of behaviors such as different types of sniffing, touching, and locomotion. The tool combines conventional handcrafted tracking with deep-learning-based techniques. It is trained on a small number of manually annotated images from a basic experimental setup and outputs body masks and coordinates of the snout and tail base for each mouse. We tested the method under several commonly used experimental conditions, including bedding in the cage and fiber-optic or headstage implants on the mice. Without any human corrections after the automated analysis, results showed a near elimination of identity switches and a ∼15% improvement in tracking accuracy over purely deep-learning-based pose estimation approaches. Our approach can optionally be ensembled with such techniques for further improvement. Finally, we demonstrated an application of this approach to studies of social behavior by quantifying and comparing interactions between pairs of mice, some of which lacked olfaction. Together, these results suggest that our approach could be valuable for studying group behaviors in rodents, such as social interactions.
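For readers unfamiliar with the identity-switch problem mentioned above, the sketch below illustrates one generic way such switches can be reduced: matching current-frame detections to previous-frame identities with the Hungarian algorithm on centroid distance. This is only an illustrative assumption for two same-appearance animals, not the authors' published pipeline, and the function name is hypothetical.

```python
# Illustrative sketch (not the paper's method): keep two mouse identities
# consistent across frames by assigning current detections to previous
# tracks via minimum-cost matching on centroid distance.
import numpy as np
from scipy.optimize import linear_sum_assignment

def assign_identities(prev_centroids, curr_centroids):
    """prev_centroids, curr_centroids: arrays of shape (2, 2) -> (mouse, xy)."""
    # Cost matrix: pairwise Euclidean distance between previous and current centroids.
    cost = np.linalg.norm(
        prev_centroids[:, None, :] - curr_centroids[None, :, :], axis=-1
    )
    rows, cols = linear_sum_assignment(cost)
    # cols[i] is the index of the current detection assigned to previous identity i.
    return curr_centroids[cols]

# Example: two mice nearly swap positions between frames, and the detector
# returns them in arbitrary order; the assignment keeps identities stable.
prev = np.array([[10.0, 10.0], [50.0, 50.0]])
curr = np.array([[48.0, 49.0], [12.0, 11.0]])
print(assign_identities(prev, curr))  # identity 0 -> (12, 11), identity 1 -> (48, 49)
```

In practice, a distance-based assignment like this would be only one component; the abstract indicates the tool also uses body masks and snout/tail-base keypoints, which provide additional cues for resolving identities during close contact.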

Keywords: computer vision; deep learning; mouse tracking; social behavior.

MeSH terms

  • Algorithms*
  • Animals
  • Humans
  • Rodentia
  • Social Behavior*