A Multi-Modal Person Perception Framework for Socially Interactive Mobile Service Robots

Sensors (Basel). 2020 Jan 28;20(3):722. doi: 10.3390/s20030722.

Abstract

In order to meet the increasing demands of mobile service robot applications, a dedicated perception module is an essential requirement for interaction with users in real-world scenarios. In particular, multi-sensor fusion and human re-identification are recognized as active research fronts. In this paper, we contribute to the topic and present a modular detection and tracking system that models the position and additional properties of persons in the surroundings of a mobile robot. The proposed system introduces a probability-based data association method that, besides position, can incorporate face and color-based appearance features in order to re-identify persons when tracking is interrupted. The system combines the results of various state-of-the-art image-based detection systems for person recognition, person identification, and attribute estimation. This allows a stable estimate of a mobile robot's user, even in complex, cluttered environments with long-lasting occlusions. In our benchmark, we introduce a new measure for tracking consistency and show the improvement when face- and appearance-based re-identification are combined. The tracking system was applied in a real-world application with a mobile rehabilitation assistant robot in a public hospital. The estimated states of persons are used for user-centered navigation behaviors, e.g., guiding or approaching a person, but also for realizing socially acceptable navigation in public environments.
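The abstract's core idea, probability-based data association that fuses position with appearance cues for re-identification, can be illustrated with a minimal sketch. The paper's actual formulation is not given here, so the multiplicative fusion of a Gaussian position likelihood with a cosine appearance similarity, the greedy assignment, and the `gate` threshold below are all simplifying assumptions of this illustration, not the authors' method.

```python
import math

def position_likelihood(track_pos, det_pos, sigma=0.5):
    """Gaussian likelihood of a detection given the track's predicted position."""
    d2 = sum((t - d) ** 2 for t, d in zip(track_pos, det_pos))
    return math.exp(-d2 / (2 * sigma ** 2))

def appearance_similarity(track_feat, det_feat):
    """Cosine similarity of appearance feature vectors, mapped to [0, 1]."""
    dot = sum(a * b for a, b in zip(track_feat, det_feat))
    na = math.sqrt(sum(a * a for a in track_feat))
    nb = math.sqrt(sum(b * b for b in det_feat))
    return 0.5 * (1.0 + dot / (na * nb))

def associate(tracks, detections, gate=0.1):
    """Greedy probability-based association of detections to tracks.

    Each track/detection is a dict with 'pos' (x, y) and 'feat' (an
    appearance vector, e.g. a face or color histogram embedding).  The
    association score is the product of position likelihood and appearance
    similarity; pairs scoring below `gate` stay unassigned.
    """
    scores = []
    for ti, tr in enumerate(tracks):
        for di, de in enumerate(detections):
            s = (position_likelihood(tr["pos"], de["pos"])
                 * appearance_similarity(tr["feat"], de["feat"]))
            scores.append((s, ti, di))
    scores.sort(reverse=True)  # best-scoring pairs claim tracks first
    assigned_t, assigned_d, pairs = set(), set(), {}
    for s, ti, di in scores:
        if s < gate or ti in assigned_t or di in assigned_d:
            continue
        assigned_t.add(ti)
        assigned_d.add(di)
        pairs[ti] = di
    return pairs
```

In a full system, the appearance term is what permits re-identification after a long occlusion (when the position prediction alone is no longer informative), and a globally optimal assignment (e.g. the Hungarian algorithm) would typically replace the greedy loop shown here.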

Keywords: multi-modal person tracking; sensor fusion; user-centered robot navigation.

MeSH terms

  • Benchmarking
  • Color
  • Exercise
  • Face
  • Hospitals, Public
  • Humans
  • Image Processing, Computer-Assisted / methods*
  • Normal Distribution
  • Posture
  • Probability
  • Rehabilitation / instrumentation*
  • Robotics / instrumentation*
  • Robotics / methods*
  • User-Computer Interface
  • Walking