Fully convolutional network for rice seedling and weed image segmentation at the seedling stage in paddy fields

PLoS One. 2019 Apr 18;14(4):e0215676. doi: 10.1371/journal.pone.0215676. eCollection 2019.

Abstract

To reduce production costs and the environmental pollution caused by the overapplication of herbicide in paddy fields, the locations of rice seedlings and weeds must be detected for site-specific weed management (SSWM). With the development of deep learning, a semantic segmentation method based on SegNet, a fully convolutional network (FCN), is proposed. In this paper, RGB color images of rice seedlings were captured in a paddy field, and ground truth (GT) images were obtained by manually labeling the pixels in the RGB images with three separate categories, namely, rice seedlings, background, and weeds. Class weight coefficients were calculated to address the imbalance in the number of pixels among the categories. The GT and RGB images were used for training and testing: 80% of the samples were randomly selected as the training dataset, and the remaining 20% were used as the test dataset. The proposed method was compared with a classical semantic segmentation model, FCN, and with the U-Net model. The average accuracy rate of the SegNet method was 92.7%, whereas the average accuracy rates of the FCN and U-Net methods were 89.5% and 70.8%, respectively. The proposed SegNet method achieved higher classification accuracy and could effectively classify the pixels of rice seedlings, background, and weeds in paddy field images and locate their regions.
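As a point of reference only (the paper does not publish its code), a minimal sketch of the two preprocessing steps named in the abstract might look like the following. The class weights use median frequency balancing, the weighting scheme introduced with the original SegNet; whether this exact variant was used here is an assumption, as are all function names and the integer encoding of the three classes.

    import numpy as np

    def median_frequency_class_weights(label_images, num_classes=3):
        """Per-class weights via median frequency balancing, to offset
        class imbalance (background pixels vastly outnumber rice-seedling
        and weed pixels). label_images: iterable of 2-D integer arrays
        with values in {0: seedling, 1: background, 2: weed} (assumed)."""
        pixel_counts = np.zeros(num_classes)
        image_counts = np.zeros(num_classes)  # total pixels of images containing each class
        for lbl in label_images:
            for c in range(num_classes):
                n = np.sum(lbl == c)
                if n > 0:
                    pixel_counts[c] += n
                    image_counts[c] += lbl.size
        freq = pixel_counts / image_counts    # per-class pixel frequency
        return np.median(freq) / freq         # rare classes get weights > 1

    def split_train_test(samples, train_fraction=0.8, seed=0):
        """Randomly assign 80% of the samples to training, 20% to testing."""
        rng = np.random.default_rng(seed)
        idx = rng.permutation(len(samples))
        cut = int(train_fraction * len(samples))
        return [samples[i] for i in idx[:cut]], [samples[i] for i in idx[cut:]]

In a typical SegNet training setup, the resulting weight vector would then scale the per-class terms of the cross-entropy loss so that errors on the rare seedling and weed pixels are penalized more heavily than errors on background.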

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Crop Production*
  • Image Processing, Computer-Assisted*
  • Neural Networks, Computer*
  • Oryza / growth & development*
  • Plant Weeds / growth & development*
  • Seedlings / growth & development*

Grants and funding

The research was funded by the Earmarked Fund for Modern Agro-industry Technology Research System (Grant No. CARS-01-43) and the National Science Foundation for Young Scientists of China (Grant No. 31801258).