Automatically Annotated Dataset of a Ground Mobile Robot in Natural Environments via Gazebo Simulations

Sensors (Basel). 2022 Jul 26;22(15):5599. doi: 10.3390/s22155599.

Abstract

This paper presents a new synthetic dataset obtained from Gazebo simulations of an Unmanned Ground Vehicle (UGV) moving through different natural environments. To this end, a Husky mobile robot equipped with a tridimensional (3D) Light Detection and Ranging (LiDAR) sensor, a stereo camera, a Global Navigation Satellite System (GNSS) receiver, an Inertial Measurement Unit (IMU) and wheel tachometers has followed several paths using the Robot Operating System (ROS). Both points from LiDAR scans and pixels from camera images have been automatically labeled with their corresponding object class. For this purpose, a unique reflectivity value and a flat color have been assigned to each object present in the modeled environments. As a result, a public dataset, which also includes 3D pose ground-truth, is provided as ROS bag files and as human-readable data. Potential applications include supervised learning and benchmarking for UGV navigation in natural environments. Moreover, to allow researchers to easily modify the dataset or to directly use the simulations, the required code has also been released.
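The automatic labeling described above rests on a simple idea: because each simulated object class is given a unique reflectivity value, every LiDAR return can be classified by a table lookup. A minimal sketch of this per-point labeling is shown below; the class names, reflectivity values, and function name are hypothetical illustrations, not taken from the released dataset.

```python
# Hypothetical sketch: label simulated LiDAR points by reflectivity lookup,
# assuming each object class in the Gazebo world has a unique reflectivity.
REFLECTIVITY_TO_CLASS = {  # assumed mapping for illustration only
    10: "ground",
    20: "tree",
    30: "rock",
    40: "bush",
}

def label_points(points):
    """points: iterable of (x, y, z, reflectivity) tuples.

    Returns one class name per point; unmatched reflectivities map to
    "unknown" so noisy or unmodeled returns are still handled.
    """
    return [REFLECTIVITY_TO_CLASS.get(int(r), "unknown")
            for (_x, _y, _z, r) in points]

# Example point cloud: two known classes and one unmodeled return.
cloud = [(1.0, 2.0, 0.0, 10), (5.5, -1.2, 1.8, 20), (3.3, 0.4, 0.2, 99)]
print(label_points(cloud))  # -> ['ground', 'tree', 'unknown']
```

The same lookup pattern applies to camera pixels, with flat RGB colors playing the role of reflectivity values.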

Keywords: 3D LiDAR; Gazebo simulator; UGV navigation; automatic data labeling; natural environments; stereo camera; synthetic dataset.

MeSH terms

  • Benchmarking
  • Environment
  • Humans
  • Robotics*
  • Software