DeepLabStream enables closed-loop behavioral experiments using deep learning-based markerless, real-time posture detection

Commun Biol. 2021 Jan 29;4(1):130. doi: 10.1038/s42003-021-01654-9.

Abstract

In general, animal behavior can be described as the neuronally driven sequence of recurring postures through time. Most currently available technologies focus on offline pose estimation with high spatiotemporal resolution. However, to correlate behavior with neuronal activity it is often necessary to detect and react online to behavioral expressions. Here we present DeepLabStream, a versatile closed-loop tool providing real-time pose estimation to deliver posture-dependent stimulations. DeepLabStream has a temporal resolution in the millisecond range, can utilize different input and output devices, and can be tailored to multiple experimental designs. We employ DeepLabStream to semi-autonomously run a second-order olfactory conditioning task with freely moving mice and to optogenetically label neuronal ensembles active during specific head directions.
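To illustrate the closed-loop principle described in the abstract (estimate posture frame by frame, then stimulate only when a posture criterion such as a target head direction is met), the sketch below shows a minimal, self-contained Python loop. It is not DeepLabStream's actual API; the functions estimate_pose, head_direction_deg, trigger_stimulation, and closed_loop are illustrative stand-ins for the pose-estimation network, the posture criterion, and the output device used in the paper.

```python
"""Minimal closed-loop sketch (illustrative only, not DeepLabStream's API)."""
import math
import time


def estimate_pose(frame):
    # Stand-in for a real-time markerless pose estimator; it returns
    # fixed nose and neck keypoints here purely for illustration.
    return {"nose": (120.0, 100.0), "neck": (100.0, 100.0)}


def head_direction_deg(pose):
    # Head direction as the angle of the neck-to-nose vector, in degrees.
    (nx, ny), (cx, cy) = pose["nose"], pose["neck"]
    return math.degrees(math.atan2(ny - cy, nx - cx)) % 360.0


def trigger_stimulation():
    # Stand-in for an output-device call (e.g. a TTL pulse to a light source).
    print("stimulation triggered")


def closed_loop(frames, target_deg=90.0, tolerance_deg=15.0):
    # Core closed-loop idea: estimate posture for each incoming frame and
    # stimulate only while the head direction falls inside the target window.
    for frame in frames:
        t0 = time.perf_counter()
        pose = estimate_pose(frame)
        angle = head_direction_deg(pose)
        circular_error = abs((angle - target_deg + 180.0) % 360.0 - 180.0)
        if circular_error <= tolerance_deg:
            trigger_stimulation()
        latency_ms = (time.perf_counter() - t0) * 1000.0
        print(f"head direction {angle:.1f} deg, loop latency {latency_ms:.2f} ms")


if __name__ == "__main__":
    closed_loop(frames=[None] * 3)  # placeholder frames standing in for a camera stream
```

In a real setup the per-frame latency printed above corresponds to the millisecond-range temporal resolution reported for the tool, and the stand-in functions would be replaced by the actual camera input, trained pose-estimation network, and stimulation hardware.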

Publication types

  • Research Support, Non-U.S. Gov't
  • Video-Audio Media

MeSH terms

  • Animals
  • Behavior, Animal*
  • Conditioning, Classical
  • Deep Learning*
  • Head Movements
  • Image Processing, Computer-Assisted
  • Light
  • Mice
  • Mice, Inbred C57BL
  • Odorants
  • Olfactory Perception
  • Optogenetics*
  • Photic Stimulation
  • Posture*
  • Smell
  • Thalamus / metabolism
  • Thalamus / physiology*
  • Thalamus / radiation effects
  • Time Factors
  • Video Recording*