Adaptive Absolute Ego-Motion Estimation Using Wearable Visual-Inertial Sensors for Indoor Positioning

Micromachines (Basel). 2018 Mar 6;9(3):113. doi: 10.3390/mi9030113.

Abstract

This paper proposes an adaptive absolute ego-motion estimation method using wearable visual-inertial sensors for indoor positioning. We introduce a wearable visual-inertial device that estimates not only the camera's ego-motion but also the 3D motion of moving objects in dynamic environments. First, a novel dynamic scene segmentation method is proposed that exploits two visual geometry constraints with the help of inertial sensors. Moreover, this paper introduces the concept of a "virtual camera", which treats the motion area associated with each moving object as if a static object were being viewed by a "virtual camera". Because the virtual camera's motion is the combined motion of the real camera and the moving object, the 3D motion of the moving object can be derived from the motions of the real and virtual cameras. In addition, a multi-rate linear Kalman filter (MR-LKF) from our previous work is adopted to solve both the scale ambiguity problem in monocular camera tracking and the problem of different sampling frequencies between the visual and inertial sensors. The performance of the proposed method is evaluated in simulation studies and in practical experiments performed in both static and dynamic environments. The results demonstrate the method's robustness and effectiveness compared against ground truth provided by a Pioneer robot.
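The virtual-camera idea can be illustrated with a minimal sketch. Assuming rigid-body motions represented as 4×4 homogeneous transforms and one particular composition convention (illustrative assumptions, not necessarily the paper's exact formulation), the virtual camera's motion is the composition of the moving object's motion and the real camera's motion, so the object's motion can be recovered once both camera motions are known:

```python
import numpy as np

def se3(R, t):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

def rot_z(theta):
    """Rotation about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# Hypothetical motions for illustration only.
T_camera = se3(rot_z(0.1), np.array([0.5, 0.0, 0.0]))    # real camera ego-motion
T_object = se3(rot_z(-0.2), np.array([0.0, 0.3, 0.0]))   # moving object's 3D motion

# The "virtual camera" views the moving object as if it were static,
# so its motion combines the object's motion and the real camera's motion
# (under the composition convention assumed here):
T_virtual = T_object @ T_camera

# Given the real and virtual camera motions, the object's motion is recovered
# by removing the real camera's contribution:
T_object_est = T_virtual @ np.linalg.inv(T_camera)
```

Here `T_object_est` equals `T_object` up to numerical precision, mirroring the paper's claim that the moving object's 3D motion follows from the real and virtual camera motions.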

Keywords: ego-motion estimation; indoor navigation; monocular camera; scale ambiguity; wearable sensors.