A LiDAR-Camera-Inertial-GNSS Apparatus for 3D Multimodal Dataset Collection in Woodland Scenarios

Sensors (Basel). 2023 Jul 26;23(15):6676. doi: 10.3390/s23156676.

Abstract

Forestry operations have become increasingly important to environmental sustainability over the past few decades, owing to the growing toll of rural abandonment and climate change. Robotics presents a promising solution to this problem; however, gathering the data needed to develop and test algorithms can be challenging. This work proposes a portable multi-sensor apparatus to collect relevant data generated by several onboard sensors. The system incorporates Laser Imaging, Detection and Ranging (LiDAR), two stereo depth cameras and a dedicated inertial measurement unit (IMU) to obtain environmental data, coupled with an Android app that extracts Global Navigation Satellite System (GNSS) information from a cell phone. The acquired data can then be used for a variety of perception-based applications, such as localization and mapping, flammable material identification, traversability analysis, path planning and semantic segmentation toward (semi-)automated forestry actuation. The proposed modular architecture is built on the Robot Operating System (ROS) and Docker to facilitate data collection and the upgradability of the system. We validate the apparatus's effectiveness in collecting datasets, and its flexibility, by carrying out a case study on Simultaneous Localization and Mapping (SLAM) in a challenging woodland environment, which allows us to compare fundamentally different methods using the proposed multimodal system.
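A recurring step when fusing streams from sensors running at different rates (e.g., a 10 Hz LiDAR, 30 Hz cameras, a 100+ Hz IMU and 1 Hz phone GNSS) is associating each reference measurement with the nearest-in-time sample from every other stream. The sketch below is purely illustrative and is not the paper's implementation (ROS provides its own synchronization tooling); all timestamps and the `max_skew` tolerance are hypothetical:

```python
from bisect import bisect_left

def nearest(stamps, t):
    """Index of the timestamp in sorted `stamps` closest to t."""
    i = bisect_left(stamps, t)
    if i == 0:
        return 0
    if i == len(stamps):
        return len(stamps) - 1
    # Pick whichever neighbour is closer to t.
    return i - 1 if t - stamps[i - 1] <= stamps[i] - t else i

def sync_streams(reference, others, max_skew=0.05):
    """For each reference stamp, pick the closest stamp from every
    other stream; drop the tuple if any match exceeds `max_skew` s."""
    matched = []
    for t in reference:
        picks = []
        for stamps in others:
            j = nearest(stamps, t)
            if abs(stamps[j] - t) > max_skew:
                break  # no acceptable partner in this stream
            picks.append(stamps[j])
        else:
            matched.append((t, *picks))
    return matched

# Hypothetical timestamps (seconds): 10 Hz LiDAR as the reference,
# a 30 Hz camera and a 1 Hz GNSS fix.
lidar = [0.00, 0.10, 0.20]
cam = [i / 30 for i in range(10)]
gnss = [0.0]
print(sync_streams(lidar, [cam, gnss], max_skew=0.5))
```

Using the lowest-rate dense stream (here, the LiDAR) as the reference avoids duplicating the same scan across several output tuples.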

Keywords: GNSS; LiDAR; depth cameras; forestry robotics; inertial measurement unit; multi-sensor apparatus; multimodal dataset collection.