Feasibility of Tracking Human Kinematics with Simultaneous Localization and Mapping (SLAM)

Sensors (Basel). 2022 Dec 1;22(23):9378. doi: 10.3390/s22239378.

Abstract

We evaluated a new wearable technology that fuses inertial sensors and cameras for tracking human kinematics. These devices use on-board simultaneous localization and mapping (SLAM) algorithms to localize the camera within the environment. The significance of this technology lies in its potential to overcome many of the limitations of other dominant motion-capture technologies. Our results demonstrate that this system often attains an estimated orientation error of less than 1° and a position error of less than 4 cm when compared with a robotic arm. This demonstrates that the accuracy of SLAM is adequate for many practical applications in tracking human kinematics.
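The abstract reports pose accuracy relative to a robotic-arm reference. As a minimal sketch (not the authors' code), the snippet below shows one common way such per-sample position and orientation errors could be computed from time-synchronized estimated and reference poses; the array names, quaternion convention, and alignment are assumptions for illustration only.

```python
# Hypothetical illustration of pose-error computation against a ground-truth
# reference (e.g., a robotic arm); not the method used in the paper.
import numpy as np
from scipy.spatial.transform import Rotation as R

def pose_errors(est_positions, est_quats, ref_positions, ref_quats):
    """Return per-sample position error (input units) and orientation error
    (degrees) between estimated and reference poses.

    est_positions, ref_positions: (N, 3) arrays of xyz positions.
    est_quats, ref_quats: (N, 4) quaternions in (x, y, z, w) order,
    assumed time-synchronized and expressed in a common frame.
    """
    # Euclidean distance between estimated and reference positions.
    pos_err = np.linalg.norm(est_positions - ref_positions, axis=1)

    # The angle of the relative rotation (reference -> estimate) is the
    # orientation error for each sample.
    rel = R.from_quat(ref_quats).inv() * R.from_quat(est_quats)
    ori_err_deg = np.degrees(rel.magnitude())

    return pos_err, ori_err_deg
```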

Keywords: computer vision; kinematics; motion capture; simultaneous localization and mapping; wearable cameras.

MeSH terms

  • Algorithms*
  • Biomechanical Phenomena
  • Humans

Grants and funding

This research received no external funding.