An Unmanned Aerial Vehicle Indoor Low-Computation Navigation Method Based on Vision and Deep Learning

Sensors (Basel). 2023 Dec 28;24(1):190. doi: 10.3390/s24010190.

Abstract

Recently, unmanned aerial vehicles (UAVs) have found extensive indoor applications. In many indoor UAV scenarios, the navigation paths remain fixed. While many indoor positioning methods offer excellent precision, they often demand substantial cost and computational resources, and such high functionality can be superfluous for these applications. To address this issue, we present a cost-effective, computationally efficient solution for path following and obstacle avoidance. The UAV employs a down-looking camera for path following and a front-looking camera for obstacle avoidance. This paper refines the carrot-chasing algorithm for line tracking and introduces our novel line-fitting path-following algorithm (LFPF). Both algorithms competently handle indoor path-following tasks within a constrained field of view; however, the LFPF adapts better to lighting variations and maintains a consistent flight speed, keeping its error within ±40 cm in real flight scenarios. For obstacle avoidance, we use depth images and YOLOv4-tiny to detect obstacles and then apply avoidance strategies suited to each obstacle's type and proximity. Real-world tests indicated minimal computational demands, enabling the Nvidia Jetson Nano, an entry-level computing platform, to operate at 23 FPS.

Keywords: indoor; obstacle avoidance; path following; unmanned aerial vehicles (UAV).
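
To make the path-following idea in the abstract concrete, the sketch below shows one way a line-fitting step from a down-looking camera could produce the cross-track and heading errors that a controller would consume. This is a minimal illustration, not the authors' LFPF implementation: it assumes the path is a dark tape line on a lighter floor, and the function name, thresholds, and minimum pixel count are illustrative assumptions.

```python
# Illustrative sketch of a line-fitting path-following step from a down-looking camera.
# Assumptions (not from the paper): the path is a dark line on a lighter floor;
# thresholds and constants below are placeholders.
import cv2
import numpy as np


def path_following_errors(gray_frame):
    """Return (cross_track_px, heading_rad) of the fitted path line vs. the image center."""
    # Isolate the dark line; Otsu thresholding keeps this step tolerant of moderate light changes.
    _, mask = cv2.threshold(gray_frame, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    pts = cv2.findNonZero(mask)
    if pts is None or len(pts) < 50:  # not enough line pixels detected
        return None

    # Robust least-squares line fit: unit direction (vx, vy) and a point (x0, y0) on the line.
    vx, vy, x0, y0 = cv2.fitLine(pts, cv2.DIST_L2, 0, 0.01, 0.01).ravel()
    if vy > 0:  # fitLine's direction sign is arbitrary; make it point "forward" (image up)
        vx, vy = -vx, -vy

    h, w = gray_frame.shape[:2]
    cx, cy = w / 2.0, h / 2.0

    # Signed perpendicular offset of the image center from the fitted line, in pixels.
    cross_track_px = (cx - x0) * vy - (cy - y0) * vx

    # Heading error: angle between the line direction and the image's forward (up) axis.
    heading_rad = float(np.arctan2(vx, -vy))

    return float(cross_track_px), heading_rad


# Example usage: feed both errors to simple lateral and yaw controllers.
# frame = cv2.cvtColor(cv2.imread("floor.png"), cv2.COLOR_BGR2GRAY)
# print(path_following_errors(frame))
```

In a setup like this, the two errors would typically drive separate lateral-velocity and yaw-rate commands, with gains tuned for the flight speed the abstract describes; the paper's actual LFPF may differ in how it fits the line and defines the errors.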