Human Collaborative Localization and Mapping in Indoor Environments with Non-Continuous Stereo

Sensors (Basel). 2016 Feb 24;16(3):275. doi: 10.3390/s16030275.

Abstract

A new approach to the monocular simultaneous localization and mapping (SLAM) problem is presented in this work. Data obtained from additional bearing-only sensors deployed as wearable devices are fully fused into an Extended Kalman Filter (EKF). The wearable device is introduced in the context of a collaborative task, within a human-robot interaction (HRI) paradigm, that includes the SLAM problem. Thus, building on delayed inverse-depth feature initialization (DI-D) SLAM, data from the camera worn by the human, capturing his/her field of view, are used to enhance the depth estimation of the robot's monocular sensor, which performs the mapping and localization. Overlap between the views of the two cameras is predicted through geometric modelling, activating a pseudo-stereo methodology that allows depth to be measured instantly by stochastic triangulation of matched points found through SIFT/SURF. Experimental validation is provided with real data captured as synchronized sequences of video and the relative pose of the secondary camera, processed off-line. The sequences capture indoor trajectories representing the main challenges for a monocular SLAM approach, namely singular trajectories and tight turns with high angular velocity relative to linear velocity.
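The pseudo-stereo step can be pictured as follows: when the views overlap, keypoints are matched between the robot's image and the wearable camera's image, and the known relative pose between the two cameras turns each match into an instant depth measurement. The sketch below, in Python with OpenCV, illustrates that idea under simplifying assumptions; it uses plain linear triangulation rather than the paper's stochastic triangulation fused into the EKF, and the function name `pseudo_stereo_depths` and the inputs `K_robot`, `K_wearable`, `R`, `t` are illustrative, not from the original work.

```python
# Hypothetical sketch of a pseudo-stereo depth measurement between the
# robot camera and the wearable camera, assuming calibrated intrinsics
# and a known relative pose of the wearable camera.
import cv2
import numpy as np

def pseudo_stereo_depths(img_robot, img_wearable, K_robot, K_wearable, R, t):
    """Triangulate matched keypoints between two overlapping views.

    K_robot, K_wearable : 3x3 intrinsic matrices (assumed known).
    R, t                : rotation and translation of the wearable camera
                          with respect to the robot camera.
    Returns Nx3 points expressed in the robot-camera frame.
    """
    sift = cv2.SIFT_create()            # SURF could be used instead
    kp1, des1 = sift.detectAndCompute(img_robot, None)
    kp2, des2 = sift.detectAndCompute(img_wearable, None)

    # Ratio-test matching to keep only distinctive correspondences.
    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des1, des2, k=2)
    good = []
    for pair in matches:
        if len(pair) == 2 and pair[0].distance < 0.75 * pair[1].distance:
            good.append(pair[0])
    if not good:
        return np.empty((0, 3))

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good]).T  # 2xN
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good]).T  # 2xN

    # Projection matrices: robot camera at the origin, wearable at (R, t).
    P1 = K_robot @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K_wearable @ np.hstack([R, t.reshape(3, 1)])

    # Linear triangulation; homogeneous 4xN output converted to 3D points.
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
    return (X_h[:3] / X_h[3]).T
```

In the paper's pipeline the resulting depths would not be used directly as map points; they would serve as stochastic measurements that initialize or refine inverse-depth features inside the EKF, replacing the delay otherwise required by the DI-D initialization.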

Keywords: HRI; collaborative robotics; indoor mapping; monocular SLAM.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Artificial Intelligence*
  • Humans
  • Imaging, Three-Dimensional / methods*
  • Photography / methods*
  • Robotics*