A comparative study of the effectiveness of using popular DNN object detection algorithms for pith detection in cross-sectional images of parawood

Heliyon. 2020 Feb 28;6(2):e03480. doi: 10.1016/j.heliyon.2020.e03480. eCollection 2020 Feb.

Abstract

The location of the pith in a cross-sectional surface of wood can be used either to evaluate the wood's quality or to guide the removal of soft wood from the stem. There have been many attempts to automate pith detection in images taken with a standard camera. The objective of this study is to compare the effectiveness of two popular deep neural network (DNN) object detection algorithms for parawood pith detection in cross-sectional wood images. In the experiment, a database of 345 cross-sectional images of parawood, taken with a standard camera in a sawmill environment, was quadrupled in size via image augmentation. The images were then manually annotated to label the pith regions. The dataset was used to train two DNN object detection algorithms, SSD (single-shot detector) MobileNet and YOLO (you-only-look-once), via transfer learning. Inference results from the trained models, obtained by minimizing each algorithm's loss function, were computed on a separate dataset of 215 images and compared. The detection rate and the average distance error with respect to the ground-truth pith location were used to evaluate detection effectiveness. Additionally, the average distance errors were compared with those of a state-of-the-art non-DNN algorithm. SSD MobileNet achieved the best detection rate, 87.7%, with an 80:20 training-to-test split and 152,000 training iterations. Its average distance error is comparable to that of YOLO and six times smaller than that of the non-DNN algorithm. Hence, SSD MobileNet is an effective approach to automating parawood pith detection in cross-sectional images.
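The abstract does not state how the detection rate and average distance error are computed, so the following Python sketch shows one plausible formulation: each detection is reduced to a predicted pith center (x, y) in pixels, and a prediction counts as a successful detection when it falls within a chosen pixel radius of the ground-truth annotation. The function name, the max_error_px threshold, and the point-based representation are illustrative assumptions, not taken from the paper.

    import math

    def evaluate_pith_detections(predictions, ground_truths, max_error_px=50):
        """Compute detection rate and average distance error for pith detection.

        predictions   : list of predicted (x, y) pith centers, or None when
                        the detector found no pith in an image (hypothetical
                        representation, not from the paper).
        ground_truths : list of manually annotated (x, y) pith centers.
        max_error_px  : assumed pixel radius within which a prediction counts
                        as a successful detection (threshold not defined in
                        the paper).
        """
        distances = []
        detected = 0
        for pred, gt in zip(predictions, ground_truths):
            if pred is None:          # no pith detected in this image
                continue
            d = math.dist(pred, gt)   # Euclidean distance in pixels
            distances.append(d)
            if d <= max_error_px:
                detected += 1
        detection_rate = detected / len(ground_truths)
        avg_distance_error = (sum(distances) / len(distances)
                              if distances else float("nan"))
        return detection_rate, avg_distance_error

    # Toy test set of three images: the detector misses the second pith.
    preds = [(412, 305), None, (233, 198)]
    gts = [(405, 300), (150, 160), (240, 205)]
    rate, err = evaluate_pith_detections(preds, gts)
    print(f"detection rate: {rate:.1%}, average distance error: {err:.1f} px")

Under these assumptions the detection rate penalizes both missed piths and detections that land too far from the annotation, while the average distance error measures localization accuracy over the detections that were produced.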

Keywords: Computer science; Deep neural network object detection; Parawood pith location; SSD MobileNet; Wood pith detection; You-only-look-once.