Indoor Visual-Based Localization System for Multi-Rotor UAVs

Sensors (Basel). 2022 Aug 3;22(15):5798. doi: 10.3390/s22155798.

Abstract

Industry 4.0, smart homes, and the Internet of Things are boosting the deployment of autonomous aerial vehicles in indoor environments, where localization remains challenging, especially in confined and cluttered areas. In this paper, we propose a Visual Inertial Odometry localization method based on fiducial markers. Our approach enables multi-rotor aerial vehicle navigation in indoor environments and tackles the most challenging aspects of image-based indoor localization. In particular, we focus on accurate and continuous pose estimation, working from take-off to landing and across several flying altitudes. To this end, we designed a map of fiducial markers that is both dense and heterogeneous: closely spaced tags minimize information loss during rapid aerial movements, while four classes of marker size keep the estimate consistent as the camera moves closer to or farther from the ground. We validated our approach by comparing the output of the localization algorithm with ground-truth information collected through an optoelectronic motion capture system, using two different platforms in different flying conditions. The results show that the error mean and standard deviation remain consistently below 0.11 m and do not degrade as the aerial vehicle increases its altitude, thus substantially improving on similar state-of-the-art solutions.
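To make the marker-based pose estimation concrete, the sketch below shows how a camera pose in the map frame can be recovered from a single detected fiducial marker whose pose and size are known from the marker map. This is a minimal illustration under stated assumptions, not the authors' implementation: it assumes ArUco-style tags (the abstract does not specify the tag family), OpenCV 4.7 or later with the aruco module, a calibrated camera, and a hypothetical MARKER_MAP table giving each marker's pose and edge length; all names and values in it are made up for the example. A full system would additionally fuse such per-marker measurements with IMU data in the Visual Inertial Odometry pipeline, which is outside this sketch.

    # Minimal sketch: camera pose in the map frame from one detected fiducial marker.
    # Assumptions (not from the paper): ArUco-style tags, OpenCV >= 4.7 (ArucoDetector
    # API), a calibrated camera, and a map giving each marker's pose as a 4x4
    # homogeneous transform T_map_marker plus its edge length in meters.
    import cv2
    import numpy as np

    # Hypothetical marker map: id -> (T_map_marker, edge length in meters).
    # Different edge lengths mimic the paper's multiple marker-size classes.
    _T1 = np.eye(4)
    _T1[0, 3] = 1.0  # marker 1 sits 1 m along the map x-axis (made-up value)
    MARKER_MAP = {
        0: (np.eye(4), 0.10),  # small tag class
        1: (_T1, 0.30),        # larger tag class
    }

    def camera_pose_from_marker(gray, camera_matrix, dist_coeffs):
        """Return T_map_camera (4x4) from the first mapped marker found, or None."""
        dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_4X4_250)
        detector = cv2.aruco.ArucoDetector(dictionary, cv2.aruco.DetectorParameters())
        corners, ids, _ = detector.detectMarkers(gray)
        if ids is None:
            return None
        for marker_corners, marker_id in zip(corners, ids.flatten()):
            if int(marker_id) not in MARKER_MAP:
                continue
            T_map_marker, edge = MARKER_MAP[int(marker_id)]
            half = edge / 2.0
            # Marker corners in the marker frame (z = 0 plane), in the order
            # expected by SOLVEPNP_IPPE_SQUARE and matching ArUco corner order.
            obj_pts = np.array([[-half,  half, 0.0],
                                [ half,  half, 0.0],
                                [ half, -half, 0.0],
                                [-half, -half, 0.0]], dtype=np.float32)
            img_pts = marker_corners.reshape(4, 2).astype(np.float32)
            ok, rvec, tvec = cv2.solvePnP(obj_pts, img_pts, camera_matrix, dist_coeffs,
                                          flags=cv2.SOLVEPNP_IPPE_SQUARE)
            if not ok:
                continue
            # Build T_camera_marker, then chain transforms:
            # T_map_camera = T_map_marker * inv(T_camera_marker).
            R, _ = cv2.Rodrigues(rvec)
            T_cam_marker = np.eye(4)
            T_cam_marker[:3, :3] = R
            T_cam_marker[:3, 3] = tvec.flatten()
            return T_map_marker @ np.linalg.inv(T_cam_marker)
        return None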

Keywords: Visual Inertial Odometry; aerial vehicles; fiducial markers; indoor localization.

MeSH terms

  • Algorithms*
  • Altitude
  • Internet
  • Motion
  • Movement*

Grants and funding

This work was partly supported by the University of Padova under the BIRD-SEED CAR.