Agricultural plant cataloging and establishment of a data framework from UAV-based crop images by computer vision

Gigascience. 2022 Jun 17;11:giac054. doi: 10.1093/gigascience/giac054.

Abstract

Background: Unmanned aerial vehicle (UAV)-based image retrieval in modern agriculture enables the collection of large amounts of spatially referenced crop image data. In large-scale experiments, however, UAV images contain a multitude of crops in a complex canopy architecture. Especially for the observation of temporal effects, this greatly complicates the recognition of individual plants across several images and the extraction of relevant information.

Results: In this work, we present a hands-on workflow for the automated temporal and spatial identification and individualization of crop images from UAVs, abbreviated as "cataloging", based on comprehensible computer vision methods. We evaluate the workflow on 2 real-world datasets. One dataset was recorded to observe Cercospora leaf spot, a fungal disease, in sugar beet over an entire growing cycle. The other deals with harvest prediction for cauliflower plants. The plant catalog is used to extract single-plant images seen across multiple time points. This yields a large-scale spatiotemporal image dataset that can in turn be used to train further machine learning models incorporating various data layers.
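The per-plant extraction step can be illustrated with a minimal sketch, not taken from the paper: it assumes catalogued plant positions have already been georeferenced to pixel coordinates in each acquisition date's orthomosaic, and that a fixed-size square patch is cut around every plant. The function name extract_patches, the patch size, and the toy data are illustrative assumptions.

    # Minimal sketch of per-plant image extraction from catalogued positions.
    # Assumption (not from the paper): plant centres are given as (row, col)
    # pixel coordinates in each date's orthomosaic.
    import numpy as np

    def extract_patches(orthomosaic: np.ndarray,
                        plant_pixels: list[tuple[int, int]],
                        half_size: int = 32) -> dict[int, np.ndarray]:
        """Cut a (2*half_size, 2*half_size) patch around each plant centre.

        orthomosaic  -- H x W x C image array for one acquisition date
        plant_pixels -- (row, col) pixel centre per catalogued plant ID
        Returns a mapping plant ID -> image patch; plants too close to
        the image border are skipped.
        """
        h, w = orthomosaic.shape[:2]
        patches = {}
        for plant_id, (r, c) in enumerate(plant_pixels):
            if half_size <= r < h - half_size and half_size <= c < w - half_size:
                patches[plant_id] = orthomosaic[r - half_size:r + half_size,
                                                c - half_size:c + half_size]
        return patches

    # Toy usage: applying the same catalogued positions to two acquisition
    # dates yields a per-plant time series of image patches.
    rng = np.random.default_rng(0)
    dates = {"t0": rng.integers(0, 255, (500, 500, 3), dtype=np.uint8),
             "t1": rng.integers(0, 255, (500, 500, 3), dtype=np.uint8)}
    catalog = [(100, 120), (250, 300), (400, 80)]  # (row, col) per plant
    series = {date: extract_patches(img, catalog) for date, img in dates.items()}
    print({date: len(p) for date, p in series.items()})  # -> {'t0': 3, 't1': 3}

In the actual workflow, such patches gathered over all flight dates form the spatiotemporal training dataset mentioned above.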

Conclusion: The presented approach significantly improves the analysis and interpretation of UAV data in agriculture. Validated against reference data, our method shows an accuracy on par with more complex deep learning-based recognition techniques. Our workflow is able to automate plant cataloging and training image extraction, especially for large datasets.

Keywords: UAV imaging; plant identification; plant individualization; precision agriculture; remote sensing.

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Agriculture* / methods
  • Computers
  • Crops, Agricultural
  • Remote Sensing Technology* / methods