DPSSD: Dual-Path Single-Shot Detector

Sensors (Basel). 2022 Jun 18;22(12):4616. doi: 10.3390/s22124616.

Abstract

Object detection is one of the most important and challenging branches of computer vision, with wide applications in everyday life such as surveillance security and autonomous driving. We propose a novel dual-path multi-scale object detection paradigm that extracts richer feature information and mitigates the multi-scale object detection problem, and on this basis we design a single-stage general object detection algorithm called the Dual-Path Single-Shot Detector (DPSSD). The dual path, consisting of a residual path and a concatenation path, makes shallow features easier to exploit and thereby improves detection accuracy. Our improved dual-path network is better adapted to multi-scale object detection tasks; combining it with a feature fusion module yields a multi-scale feature learning paradigm we call the "Dual-Path Feature Pyramid". We trained models on the PASCAL VOC and COCO datasets with 320-pixel and 512-pixel inputs, respectively, and performed inference experiments to validate the structures in the neural network. The experimental results show that our algorithm has an advantage over other anchor-based single-stage object detection algorithms and achieves a competitive level of average precision. Researchers can replicate the reported results of this paper.
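The abstract's dual path, a residual path (element-wise addition, ResNet-style) alongside a concatenation path (channel growth, DenseNet-style), can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the channel split, the `growth` width, and the random 1x1 channel mixing standing in for a convolution are all illustrative assumptions.

```python
import numpy as np

def dual_path_block(x, res_width, growth, rng):
    """Toy dual-path block on a (C, H, W) feature map.

    First `res_width` output channels follow the residual path
    (element-wise add); the remaining channels follow the
    concatenation path, growing the map by `growth` channels.
    The 1x1 "conv" here is just a random linear map over channels
    (an illustrative assumption, not a trained layer).
    """
    c = x.shape[0]
    w = rng.standard_normal((res_width + growth, c)) * 0.01
    y = np.tensordot(w, x, axes=([1], [0]))   # (res_width + growth, H, W)
    res_out = x[:res_width] + y[:res_width]   # residual path: add
    dense_out = np.concatenate([x[res_width:], y[res_width:]], axis=0)  # concat path
    return np.concatenate([res_out, dense_out], axis=0)

rng = np.random.default_rng(0)
x = rng.standard_normal((16, 8, 8))           # 16-channel input map
out = dual_path_block(x, res_width=12, growth=4, rng=rng)
print(out.shape)                              # channels grow by `growth`: (20, 8, 8)
```

Stacking such blocks gives the channel-growth behavior of the concatenation path while the residual path keeps gradients flowing, which is why shallow features become easier to reuse.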

Keywords: convolution neural networks; multi-scale; object detection; single-stage.

MeSH terms

  • Algorithms
  • Automobile Driving*
  • Disease Progression
  • Humans
  • Learning
  • Neural Networks, Computer*

Grants and funding

This research was funded by the Project of Shandong Provincial Major Scientific and Technological Innovation, grant No. 2019JZZY010444, No. 2019TSLH0315; in part by the Project of 20 Policies of Facilitate Scientific Research in Jinan Colleges, grant No. 2019GXRC063; in part by the Project of Shandong Province Higher Educational Science and Technology Program, grant No. J18KA345; and in part by the Natural Science Foundation of Shandong Province of China, grant No. ZR2020MF138.