Fusion of Deep Convolution and Shallow Features to Recognize the Severity of Wheat Fusarium Head Blight

Front Plant Sci. 2021 Jan 21:11:599886. doi: 10.3389/fpls.2020.599886. eCollection 2020.

Abstract

A fast and nondestructive method for recognizing the severity of wheat Fusarium head blight (FHB) can effectively reduce fungicide use and associated costs in wheat production. This study proposed a feature fusion method based on deep convolution and shallow features derived from high-resolution digital red-green-blue (RGB) images of wheat FHB at different disease severity levels. To test the robustness of the proposed method, the RGB images were taken under different influencing factors, including light condition, camera shooting angle, image resolution, and crop growth period. All images were preprocessed to remove background noise and improve recognition accuracy. The parameters of an AlexNet model pretrained on the ImageNet 2012 dataset were transferred to extract the deep convolution features of wheat FHB. Next, the color and texture features of wheat ears were extracted as shallow features. The Relief-F algorithm was then used to fuse the deep convolution features and the shallow features into the final FHB features. Finally, a random forest classifier was used to recognize the different FHB severity levels from these features. Results show that the recognition accuracy of the proposed fusion feature model was higher than that of models using other features under all conditions. The highest recognition accuracy of severity levels was obtained when images were taken indoors, at high resolution (12 megapixels), at a 90° shooting angle, during the grain-filling period. The Relief-F algorithm assigned different weights to the features under different influencing factors, which made the fused feature model more robust and improved the ability to recognize wheat FHB severity levels using RGB images.
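To make the described pipeline concrete, the following is a minimal Python sketch, not the authors' code. It assumes torchvision for the ImageNet-pretrained AlexNet, OpenCV and scikit-image for illustrative color and texture (GLCM) descriptors, the skrebate package for Relief-F, and scikit-learn for the random forest; the specific shallow features and the use of Relief-F scores as multiplicative feature weights are assumptions made for illustration only.

import cv2
import numpy as np
import torch
from torchvision import models, transforms
from skimage.feature import graycomatrix, graycoprops
from skrebate import ReliefF
from sklearn.ensemble import RandomForestClassifier

# Deep convolution features: AlexNet pretrained on ImageNet, used as a fixed extractor.
alexnet = models.alexnet(weights=models.AlexNet_Weights.IMAGENET1K_V1)
alexnet.classifier = alexnet.classifier[:-1]  # drop the 1000-way layer -> 4096-D output
alexnet.eval()

preprocess = transforms.Compose([
    transforms.ToTensor(),
    transforms.Resize((224, 224)),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

def deep_features(rgb_img: np.ndarray) -> np.ndarray:
    """4096-D AlexNet feature vector for one background-removed RGB wheat-ear image."""
    with torch.no_grad():
        x = preprocess(rgb_img).unsqueeze(0)
        return alexnet(x).squeeze(0).numpy()

def shallow_features(rgb_img: np.ndarray) -> np.ndarray:
    """Illustrative shallow features: HSV color statistics plus GLCM texture properties."""
    hsv = cv2.cvtColor(rgb_img, cv2.COLOR_RGB2HSV).reshape(-1, 3)
    color = np.concatenate([hsv.mean(axis=0), hsv.std(axis=0)])
    gray = cv2.cvtColor(rgb_img, cv2.COLOR_RGB2GRAY)
    glcm = graycomatrix(gray, distances=[1], angles=[0], levels=256,
                        symmetric=True, normed=True)
    texture = np.array([graycoprops(glcm, p)[0, 0]
                        for p in ("contrast", "homogeneity", "energy", "correlation")])
    return np.concatenate([color, texture])

def fused_severity_model(images, labels):
    """Fuse deep + shallow features, weight them with Relief-F, train a random forest."""
    X = np.array([np.concatenate([deep_features(im), shallow_features(im)]) for im in images])
    y = np.array(labels)  # FHB severity level of each image
    relief = ReliefF(n_neighbors=10, n_features_to_select=X.shape[1])
    relief.fit(X, y)
    X_weighted = X * relief.feature_importances_  # re-weight fused features by Relief-F scores
    clf = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_weighted, y)
    return clf, relief.feature_importances_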

Keywords: Fusarium head blight; Relief-F; fusion feature; random forest; transfer learning.