Remote sensing for field pea yield estimation: A study of multi-scale data fusion approaches in phenomics

Front Plant Sci. 2023 Mar 3;14:1111575. doi: 10.3389/fpls.2023.1111575. eCollection 2023.

Abstract

Introduction: Remote sensing using unmanned aerial systems (UAS) is prevalent in phenomics and precision agriculture applications. The high-resolution data from these platforms can provide useful spectral characteristics of crops associated with performance traits such as seed yield. With the recent availability of high-resolution satellite imagery, there has been growing interest in using this technology for plot-scale remote sensing applications, particularly those related to breeding programs. This study compared features extracted from high-resolution satellite and UAS multispectral imagery (visible and near-infrared) for predicting seed yield in two diverse plot-scale field pea yield trials (advanced breeding and variety testing) using a random forest model.
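As a rough illustration of this modeling setup (not the authors' code), the sketch below fits a random forest regressor to plot-level spectral features to predict seed yield. The synthetic data stand in for the study's plot-level feature table and are an assumption for illustration only.

```python
# Minimal sketch of the modeling setup: a random forest regressor trained on
# plot-level spectral features to predict seed yield. The synthetic data below
# are placeholders, not the study's data.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n_plots = 200
X = rng.uniform(0.0, 1.0, size=(n_plots, 5))                      # e.g., vegetation indices and band statistics per plot
y = 3.0 * X[:, 0] + 1.5 * X[:, 1] + rng.normal(0.0, 0.1, n_plots)  # synthetic seed yield response

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)
model = RandomForestRegressor(n_estimators=500, random_state=0)
model.fit(X_train, y_train)
print("R2 on held-out plots:", r2_score(y_test, model.predict(X_test)))
```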

Methods: Multi-modal (spectral and textural features) and multi-scale (satellite and UAS) data fusion approaches were evaluated to improve seed yield prediction accuracy across trials and time points. These approaches included image fusion, such as pan-sharpening of the satellite imagery with UAS imagery using intensity-hue-saturation transformation and additive wavelet luminance proportional methods, and feature fusion, which integrated the extracted spectral features. We also compared the image fusion approach against high-definition satellite data with a resolution of 0.15 m/pixel. The effectiveness of each approach was evaluated with data at both individual and combined time points.
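As one concrete interpretation of the image fusion step, the sketch below shows a basic intensity-hue-saturation (component-substitution) pan-sharpening of upsampled satellite bands with a high-resolution intensity layer derived from UAS imagery. The array shapes, band ordering, and inputs are assumptions; the additive wavelet luminance proportional variant is not shown.

```python
# Sketch of IHS-style (component-substitution) pan-sharpening, assuming the
# satellite bands have already been resampled to the UAS ground resolution.
# Inputs and band ordering are assumptions for illustration only.
import numpy as np

def ihs_pansharpen(ms_upsampled: np.ndarray, hires_intensity: np.ndarray) -> np.ndarray:
    """ms_upsampled: (rows, cols, 3) satellite bands resampled to the fine grid.
    hires_intensity: (rows, cols) intensity layer derived from the UAS imagery."""
    intensity = ms_upsampled.mean(axis=2)            # I component of the IHS transform
    detail = hires_intensity - intensity             # high-frequency spatial detail
    return ms_upsampled + detail[..., np.newaxis]    # inject the detail into every band

# Example with synthetic arrays
ms = np.random.rand(100, 100, 3)     # coarse satellite bands, already upsampled
pan = np.random.rand(100, 100)       # UAS-derived intensity at fine resolution
sharpened = ihs_pansharpen(ms, pan)
```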

Results and discussion: The major findings can be summarized as follows: (1) the inclusion of texture features did not improve model performance; (2) spectral features from satellite imagery at its original resolution provided model performance similar to that of UAS imagery, with variation depending on the field pea yield trial under study and the growth stage; (3) model performance improved after applying multi-scale, multi-time point feature fusion; (4) features extracted from satellite imagery pan-sharpened with the intensity-hue-saturation transformation (image fusion) produced better model performance than features from the original satellite imagery or the high-definition imagery; and (5) the green normalized difference vegetation index and the transformed triangular vegetation index were identified as key features contributing to high model performance across trials and time points. These findings demonstrate the potential of high-resolution satellite imagery and data fusion approaches for plot-scale phenomics applications.
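For reference, the green normalized difference vegetation index follows the standard definition shown in the sketch below; the transformed triangular vegetation index depends on the specific band set defined in the study and is therefore not reproduced here. The example reflectance values are illustrative.

```python
# Sketch of computing the green normalized difference vegetation index (GNDVI)
# from band reflectances; input values are illustrative only.
import numpy as np

def gndvi(nir: np.ndarray, green: np.ndarray) -> np.ndarray:
    """GNDVI = (NIR - Green) / (NIR + Green), computed per pixel or per plot."""
    return (nir - green) / (nir + green + 1e-12)     # small epsilon avoids division by zero

nir = np.array([0.45, 0.52, 0.60])       # example NIR reflectances
green = np.array([0.10, 0.12, 0.15])     # example green reflectances
print(gndvi(nir, green))
```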

Keywords: high-resolution satellite; high-throughput field phenotyping; multispectral; pan-sharpening; plant breeding; unmanned aerial system; yield prediction.

Grants and funding

This study was funded by the US Department of Agriculture-National Institute of Food and Agriculture (USDA-NIFA) competitive projects (accession numbers 1011741, 1022033, 1028108) and Hatch project (accession number 1014919). We also thank the Microsoft AI for Earth grant for providing cloud computing resources.