A Tightly Coupled LiDAR-Inertial SLAM for Perceptually Degraded Scenes

Sensors (Basel). 2022 Apr 15;22(8):3063. doi: 10.3390/s22083063.

Abstract

Realizing robust six-degree-of-freedom (6DOF) state estimation and high-performance simultaneous localization and mapping (SLAM) in perceptually degraded scenes (such as underground tunnels, corridors, and roadways) is a challenge in robotics. To address this challenge, we propose a SLAM algorithm based on tightly coupled LiDAR-IMU fusion, which consists of two parts: front-end iterated Kalman filtering and back-end pose-graph optimization. First, on the front end, an iterated Kalman filter is established to construct a tightly coupled LiDAR-inertial odometry (LIO). The state propagation step supplies the a priori position and attitude of the robot, and fusing this prediction with the LiDAR observations in the iterated update increases the attitude accuracy and enhances system robustness. Second, on the back end, we deploy a keyframe selection strategy to meet the real-time requirements of large-scale scenes. Moreover, loop detection and ground constraints are added to the tightly coupled framework, further improving the overall accuracy of the 6DOF state estimation. Finally, the performance of the algorithm is verified on a public dataset and on a dataset we collected ourselves. The experimental results show that, for perceptually degraded scenes, the proposed algorithm achieves higher accuracy, better real-time performance, and greater robustness than existing LiDAR-SLAM algorithms, effectively reducing the cumulative error of the system and ensuring the global consistency of the constructed maps.
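
The sketch below illustrates the two building blocks the abstract names: an iterated Kalman filter measurement update for the front-end LIO and a keyframe gate for the back-end pose graph. It is a minimal sketch, assuming Eigen, a hypothetical 9-dimensional error state (position, attitude, velocity), and placeholder measurement model h, Jacobian H, and thresholds; it shows the general technique, not the paper's actual implementation.

```cpp
#include <Eigen/Dense>
#include <cmath>
#include <functional>

constexpr int kStateDim = 9;  // hypothetical error state: position, attitude, velocity
using Vec = Eigen::Matrix<double, kStateDim, 1>;
using Mat = Eigen::Matrix<double, kStateDim, kStateDim>;

// Placeholder LiDAR measurement: stacked point-to-plane residual targets z,
// measurement model h(x), its Jacobian H(x), and noise covariance R.
struct Measurement {
    Eigen::VectorXd z;
    std::function<Eigen::VectorXd(const Vec&)> h;
    std::function<Eigen::MatrixXd(const Vec&)> H;
    Eigen::MatrixXd R;
};

// Iterated Kalman update: relinearize h around the refreshed estimate and
// repeat the correction until it is small, then update the covariance once
// at the converged estimate. x enters as the IMU-propagated prior.
void iteratedUpdate(Vec& x, Mat& P, const Measurement& m,
                    int max_iters = 5, double eps = 1e-6) {
    const Vec x0 = x;  // prior mean from state propagation
    for (int i = 0; i < max_iters; ++i) {
        Eigen::MatrixXd H = m.H(x);
        Eigen::VectorXd r = m.z - m.h(x);
        Eigen::MatrixXd S = H * P * H.transpose() + m.R;
        Eigen::MatrixXd K = P * H.transpose() * S.inverse();
        // Classic IEKF step: x_new = x0 + K * (z - h(x) - H * (x0 - x)).
        Vec dx = K * (r + H * (x - x0)) - (x - x0);
        x += dx;
        if (dx.norm() < eps) break;  // converged
    }
    Eigen::MatrixXd H = m.H(x);
    Eigen::MatrixXd S = H * P * H.transpose() + m.R;
    Eigen::MatrixXd K = P * H.transpose() * S.inverse();
    P = (Mat::Identity() - K * H) * P;
}

// Hypothetical keyframe gate: admit a new node into the pose graph only when
// the robot has translated or rotated enough since the last keyframe, which
// keeps back-end optimization tractable in large-scale scenes.
// (Yaw wrap-around is ignored here for brevity.)
bool isKeyframe(const Eigen::Vector3d& p, const Eigen::Vector3d& p_last,
                double yaw, double yaw_last,
                double dist_thresh = 1.0, double yaw_thresh = 0.2) {
    return (p - p_last).norm() > dist_thresh ||
           std::abs(yaw - yaw_last) > yaw_thresh;
}
```

In a pipeline of this shape, each converged LIO pose that passes the keyframe gate would become a node in the back-end pose graph, with loop-closure and ground-plane constraints added as extra edges; the repeated relinearization in the update is what keeps the attitude estimate accurate when degraded geometry makes the LiDAR constraints weak.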

Keywords: IMU; LiDAR; SLAM; perceptually degraded scenes; state estimation.

MeSH terms

  • Algorithms*
  • Robotics* / methods