OL-SLAM: A Robust and Versatile System of Object Localization and SLAM

Sensors (Basel). 2023 Jan 10;23(2):801. doi: 10.3390/s23020801.

Abstract

This paper proposes a real-time, versatile Simultaneous Localization and Mapping (SLAM) and object localization system that fuses measurements from LiDAR, camera, an Inertial Measurement Unit (IMU), and the Global Positioning System (GPS). Our system can localize itself in an unknown environment and build a scene map, based on which it can also track objects of interest and obtain their global locations. Specifically, our SLAM subsystem consists of four parts: LiDAR-inertial odometry, visual-inertial odometry, GPS-inertial odometry, and global pose-graph optimization. The target tracking and positioning subsystem is built on YOLOv4. Because the SLAM system incorporates a GPS sensor, we can obtain global positioning information for the target, which makes the system highly useful in military operations, rescue and disaster relief, and other scenarios.
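The abstract does not give implementation details, but the global pose-graph optimization it mentions, fusing relative odometry constraints with absolute GPS constraints, can be illustrated with a minimal sketch. Everything below (1-D poses, the measurement values, the least-squares solver) is an illustrative assumption, not the paper's actual method:

```python
import numpy as np

# Minimal 1-D pose-graph sketch (illustrative only; the paper's actual
# optimizer, state representation, and noise models are not specified
# in the abstract). Unknowns: robot positions x0..x3 along a line.
# Odometry (e.g. LiDAR-/visual-/GPS-inertial front ends) contributes
# relative constraints x[i+1] - x[i]; GPS anchors individual poses
# in the global frame.

# Each row of A encodes one measurement; b holds the measured values.
A = np.array([
    [-1, 1, 0, 0],   # odometry: x1 - x0 = 1.0
    [0, -1, 1, 0],   # odometry: x2 - x1 = 1.1
    [0, 0, -1, 1],   # odometry: x3 - x2 = 0.9
    [1, 0, 0, 0],    # GPS: x0 = 0.0 (fixes global position)
    [0, 0, 0, 1],    # GPS: x3 = 3.0
], dtype=float)
b = np.array([1.0, 1.1, 0.9, 0.0, 3.0])

# Least-squares fusion of all constraints: a linear stand-in for the
# nonlinear pose-graph optimization a real SLAM back end performs.
x, *_ = np.linalg.lstsq(A, b, rcond=None)
print(x)  # globally consistent trajectory estimate
```

A real back end would use SE(3) poses, per-sensor information (weight) matrices, and an iterative nonlinear solver, but the structure, relative edges from odometry plus absolute edges from GPS, is the same.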

Keywords: SLAM; multi-sensor fusion; object tracking and localization.

MeSH terms

  • Disasters*

Grants and funding

This research received no external funding.