Autonomous drone hunter operating by deep learning and all-onboard computations in GPS-denied environments

PLoS One. 2019 Nov 18;14(11):e0225092. doi: 10.1371/journal.pone.0225092. eCollection 2019.

Abstract

This paper proposes a UAV platform that autonomously detects, hunts, and takes down other small UAVs in GPS-denied environments. The platform detects, tracks, and follows another drone within its sensor range using a pre-trained machine learning model. We collect and generate a 58,647-image dataset and use it to train a Tiny YOLO detection algorithm. This algorithm, combined with a simple visual-servoing approach, was validated on a physical platform. Our platform was able to successfully track and follow a target drone at an estimated speed of 1.5 m/s. Performance was limited by the detection algorithm's 77% accuracy in cluttered environments, its processing rate of eight frames per second, and the camera's field of view.
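
The abstract pairs a Tiny YOLO detector with a simple visual-servoing controller. The sketch below illustrates, under stated assumptions, how such a loop could be wired together in Python using OpenCV's DNN module; the model file names, confidence threshold, controller gains, and the send_velocity hook are hypothetical placeholders, not the authors' implementation.

```python
# Minimal sketch (not the paper's released code) of a Tiny YOLO detector
# feeding a proportional visual-servoing loop. File names, gains, and the
# flight-controller interface are illustrative assumptions.

import cv2
import numpy as np

CFG = "tiny_yolo_drone.cfg"          # hypothetical Darknet config
WEIGHTS = "tiny_yolo_drone.weights"  # hypothetical weights trained on a drone dataset
CONF_THRESHOLD = 0.5

net = cv2.dnn.readNetFromDarknet(CFG, WEIGHTS)
out_names = net.getUnconnectedOutLayersNames()

def detect_drone(frame):
    """Return the highest-confidence box (cx, cy, w, h) in pixels, or None."""
    h, w = frame.shape[:2]
    blob = cv2.dnn.blobFromImage(frame, 1 / 255.0, (416, 416), swapRB=True, crop=False)
    net.setInput(blob)
    best, best_conf = None, CONF_THRESHOLD
    for output in net.forward(out_names):
        for det in output:  # det = [cx, cy, bw, bh, objectness, class scores...]
            conf = det[4] * det[5:].max()
            if conf > best_conf:
                best_conf = conf
                best = (det[0] * w, det[1] * h, det[2] * w, det[3] * h)
    return best

def servo_command(box, frame_shape, k_yaw=1.0, k_climb=1.0, k_fwd=2.0, target_width=0.25):
    """Proportional visual servoing: steer the box toward the image centre and
    close distance until the box spans roughly target_width of the frame."""
    h, w = frame_shape[:2]
    cx, cy, bw, _ = box
    yaw_rate = k_yaw * (cx / w - 0.5)             # turn toward the target
    climb_rate = -k_climb * (cy / h - 0.5)        # image y grows downward
    forward = k_fwd * (target_width - bw / w)     # approach until the box looks large enough
    return forward, yaw_rate, climb_rate

cap = cv2.VideoCapture(0)  # onboard camera stream
while True:
    ok, frame = cap.read()
    if not ok:
        break
    box = detect_drone(frame)
    if box is not None:
        fwd, yaw, climb = servo_command(box, frame.shape)
        # send_velocity(fwd, yaw, climb)  # flight-controller hook, platform specific
```

A purely proportional controller like this keeps the target centred and at a fixed apparent size; in practice the achievable following speed is bounded by the detector's frame rate and the camera's field of view, consistent with the limitations noted above.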

Publication types

  • Research Support, U.S. Gov't, Non-P.H.S.

MeSH terms

  • Algorithms
  • Color
  • Deep Learning*
  • Geographic Information Systems*
  • Image Processing, Computer-Assisted

Grants and funding

This work was supported by U.S. National Science Foundation National Robotics Initiative grant number 1527232 (M. A. Gore, R. J. Nelson, and H. Lipson) http://www.nsf.gov/. The funders had no role in study design, data collection and analysis, decision to publish, or preparation of the manuscript.