Hybrid Orientation Based Human Limbs Motion Tracking Method

Sensors (Basel). 2017 Dec 9;17(12):2857. doi: 10.3390/s17122857.

Abstract

One of the key technologies behind human-machine interaction and human motion diagnosis is limb motion tracking. To be effective, limb tracking must estimate a precise and unambiguous position of each tracked human joint and the resulting body part pose. In recent years, body pose estimation has become very popular and broadly available to home users thanks to easy access to inexpensive tracking devices. Their robustness can be improved by fusing data from different tracking modes. This paper defines a novel orientation-based data fusion approach, in contrast to the position-based approach that dominates the literature, for two classes of tracking devices: depth sensors (i.e., Microsoft Kinect) and inertial measurement units (IMUs). A detailed analysis of their working characteristics made it possible to elaborate a new method that fuses limb orientation data from both devices more precisely and compensates for their imprecisions. The paper presents a series of experiments that verified the method's accuracy. The novel approach outperformed the precision of position-based joint tracking, the dominant approach in the literature, by up to 18%.
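To illustrate the general idea of orientation-based fusion of a depth-sensor (Kinect) limb orientation with an IMU orientation, the following is a minimal sketch that blends two unit quaternions with spherical linear interpolation (SLERP). The fixed weight, the (w, x, y, z) quaternion convention, and the function names fuse_orientations and slerp are illustrative assumptions, not the fusion rule actually proposed in the paper.

    # Minimal sketch: blend a Kinect-derived limb orientation with an IMU
    # orientation via quaternion SLERP. Illustrative only; the fixed weight
    # and quaternion order (w, x, y, z) are assumptions, not the paper's rule.
    import numpy as np

    def slerp(q0, q1, t):
        """Spherical linear interpolation between unit quaternions q0 and q1."""
        q0, q1 = np.asarray(q0, float), np.asarray(q1, float)
        dot = np.dot(q0, q1)
        if dot < 0.0:                 # take the shorter arc on the quaternion sphere
            q1, dot = -q1, -dot
        if dot > 0.9995:              # nearly parallel: fall back to normalized lerp
            q = q0 + t * (q1 - q0)
            return q / np.linalg.norm(q)
        theta = np.arccos(np.clip(dot, -1.0, 1.0))
        return (np.sin((1 - t) * theta) * q0 + np.sin(t * theta) * q1) / np.sin(theta)

    def fuse_orientations(q_kinect, q_imu, imu_weight=0.7):
        """Blend the two orientation estimates, weighting the IMU more heavily."""
        return slerp(q_kinect, q_imu, imu_weight)

    # Example: Kinect reports ~10 deg about Z, the IMU reports ~14 deg about Z.
    q_kinect = np.array([np.cos(np.radians(5)), 0.0, 0.0, np.sin(np.radians(5))])
    q_imu    = np.array([np.cos(np.radians(7)), 0.0, 0.0, np.sin(np.radians(7))])
    print(fuse_orientations(q_kinect, q_imu))

Weighting the blend toward the IMU reflects the common assumption that inertial sensors give smoother short-term orientation, while the depth sensor provides an absolute, drift-free reference; the actual weighting in the paper depends on its analysis of both devices' working characteristics.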

Keywords: IMU; Microsoft Kinect; data fusion; depth sensor; motion tracking.

MeSH terms

  • Algorithms
  • Extremities
  • Human Body
  • Humans
  • Motion*