Joint Calibration of a Multimodal Sensor System for Autonomous Vehicles

Sensors (Basel). 2023 Jun 17;23(12):5676. doi: 10.3390/s23125676.

Abstract

Multimodal sensor systems require precise calibration if they are to be used in the field. Because it is difficult to obtain corresponding features across different modalities, the calibration of such systems remains an open problem. We present a systematic approach for calibrating a set of cameras of different modalities (RGB, thermal, polarization, and dual-spectrum near infrared) with respect to a LiDAR sensor using a planar calibration target. First, a method for calibrating a single camera with respect to the LiDAR sensor is proposed. The method is usable with any modality, as long as the calibration pattern can be detected in its images. A methodology for establishing a parallax-aware pixel mapping between different camera modalities is then presented. Such a mapping can be used to transfer annotations, features, and results between highly differing camera modalities, facilitating feature extraction as well as deep-learning-based detection and segmentation.
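The abstract does not detail the single-camera-to-LiDAR step. As a rough illustration of how a planar target can link the two sensors, the sketch below combines OpenCV checkerboard detection with a Zhang/Pless-style plane-correspondence solve for the extrinsics. The checkerboard assumption, the closed-form formulation, and all function and variable names are illustrative assumptions, not the authors' method.

```python
"""Minimal sketch: camera-to-LiDAR extrinsic calibration from planar-target
observations (assumes a checkerboard target, OpenCV, and NumPy)."""
import numpy as np
import cv2


def board_plane_in_camera(image_gray, board_size, square_size, K, dist):
    """Detect the checkerboard and return its plane (n, d) in the camera frame,
    where points X on the plane satisfy n . X = d."""
    ok, corners = cv2.findChessboardCorners(image_gray, board_size)
    if not ok:
        return None
    # 3-D corner coordinates in the board frame (the z = 0 plane).
    objp = np.zeros((board_size[0] * board_size[1], 3), np.float64)
    objp[:, :2] = np.mgrid[0:board_size[0], 0:board_size[1]].T.reshape(-1, 2)
    objp *= square_size
    ok, rvec, tvec = cv2.solvePnP(objp, corners, K, dist)
    R, _ = cv2.Rodrigues(rvec)
    n = R[:, 2]                     # board z-axis = plane normal in camera frame
    d = float(n @ tvec.ravel())     # signed distance of the plane from the origin
    return n, d


def target_plane_in_lidar(points):
    """Fit a plane (n, d) to LiDAR points lying on the calibration target.
    Normals must use a consistent sign convention (e.g. pointing at the sensor)."""
    c = points.mean(axis=0)
    _, _, Vt = np.linalg.svd(points - c)
    n = Vt[-1]
    return n, float(n @ c)


def solve_extrinsics(cam_planes, lidar_planes):
    """Closed-form estimate of (R, t) mapping LiDAR points into the camera
    frame from >= 3 non-parallel plane correspondences."""
    Nc = np.array([n for n, _ in cam_planes])     # camera-frame normals
    Nl = np.array([n for n, _ in lidar_planes])   # LiDAR-frame normals
    # Rotation: align the two normal sets (orthogonal Procrustes / Kabsch).
    U, _, Vt = np.linalg.svd(Nc.T @ Nl)
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(U @ Vt))])
    R = U @ S @ Vt
    # Translation: each correspondence gives n_c . t = d_c - d_l.
    b = np.array([dc - dl for (_, dc), (_, dl) in zip(cam_planes, lidar_planes)])
    t, *_ = np.linalg.lstsq(Nc, b, rcond=None)
    return R, t
```

Given such extrinsics for each camera, a parallax-aware pixel mapping between two modalities could, in principle, be built by projecting LiDAR points into both images and interpolating the resulting correspondences, though the paper's own procedure should be consulted for the details.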

Keywords: USV; annotation; autonomous vehicle; calibration; multimodal system.

MeSH terms

  • Autonomous Vehicles*
  • Awareness*
  • Calibration
  • Refraction, Ocular