Siamese network with a depthwise over-parameterized convolutional layer for visual tracking

PLoS One. 2022 Aug 31;17(8):e0273690. doi: 10.1371/journal.pone.0273690. eCollection 2022.

Abstract

Visual tracking is a fundamental research task in computer vision. It has broad application prospects, such as military defense and civil security. In practical applications, visual tracking faces many challenges, such as occlusion, fast motion, and background clutter. Siamese-based trackers achieve superior tracking performance with a good balance between accuracy and tracking speed. Deep feature extraction with a Convolutional Neural Network (CNN) is an essential component of the Siamese tracking framework. Although existing trackers take full advantage of deep feature information, spatial structure and semantic information, which help enhance target representations, are not adequately exploited. The lack of this spatial and semantic information may lead to tracking drift. In this paper, we design a CNN feature extraction subnetwork based on a Depthwise Over-parameterized Convolutional layer (DO-Conv). A joint convolution method is introduced that combines conventional and depthwise convolution. The depthwise convolution kernel exploits per-channel information, effectively extracting shallow spatial information and deep semantic information while discarding background information. Based on DO-Conv, we propose a novel tracking algorithm in the Siamese framework, named DOSiam. Extensive experiments on five benchmarks, including OTB2015, VOT2016, VOT2018, GOT-10k, and VOT2019-RGBT(TIR), show that the proposed DOSiam achieves leading tracking performance against state-of-the-art trackers while running in real time at 60 FPS.
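As a rough illustration of the kernel-composition idea behind DO-Conv described above (not the authors' implementation), the sketch below composes a per-channel depthwise kernel with a conventional kernel into a single effective convolution kernel. The class name DOConv2d, the parameter names W and D, and the depth multiplier d_mul are illustrative assumptions; a PyTorch-style interface is assumed.

import torch
import torch.nn as nn
import torch.nn.functional as F

class DOConv2d(nn.Module):
    # Simplified depthwise over-parameterized convolution (DO-Conv) sketch:
    # a depthwise kernel D (one per input channel) is composed with a
    # conventional kernel W into one effective kernel before convolving.
    def __init__(self, in_ch, out_ch, kernel_size=3, stride=1, padding=1, d_mul=None):
        super().__init__()
        self.k, self.stride, self.padding = kernel_size, stride, padding
        self.d_mul = d_mul or kernel_size * kernel_size  # depth multiplier, assumed >= k*k
        # Conventional kernel W: (out_ch, in_ch, d_mul)
        self.W = nn.Parameter(torch.randn(out_ch, in_ch, self.d_mul) * 0.02)
        # Depthwise kernel D: (in_ch, d_mul, k*k), initialized so that the
        # composed kernel equals W and the layer starts as a plain convolution.
        D = torch.zeros(in_ch, self.d_mul, self.k * self.k)
        D[:, : self.k * self.k, :] = torch.eye(self.k * self.k)
        self.D = nn.Parameter(D)

    def forward(self, x):
        out_ch, in_ch, _ = self.W.shape
        # Kernel composition: W_eff[o, c, s] = sum_d W[o, c, d] * D[c, d, s]
        W_eff = torch.einsum('ocd,cds->ocs', self.W, self.D)
        W_eff = W_eff.view(out_ch, in_ch, self.k, self.k)
        # One conventional convolution with the composed kernel, so inference
        # cost is the same as an ordinary Conv2d.
        return F.conv2d(x, W_eff, stride=self.stride, padding=self.padding)

# Example: feature extraction on a 127x127 Siamese template patch.
layer = DOConv2d(3, 16, kernel_size=3, padding=1)
out = layer(torch.randn(1, 3, 127, 127))
print(out.shape)  # torch.Size([1, 16, 127, 127])

With d_mul equal to the kernel area and D initialized to the identity, the layer starts out equivalent to an ordinary convolution, and because the two kernels collapse into one composed kernel, the extra parameters add no inference-time cost.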

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms*
  • Motion
  • Neural Networks, Computer*
  • Semantics

Grants and funding

Yuanyun Wang, Wenshuang Zhang, and Limin Zhang are funded by the Science and Technology Research Project of the Jiangxi Provincial Department of Education, China (No. GJJ190955) and the National Natural Science Foundation of China (No. 61861032) for the study design, the experiments, and the publication of the paper. Jun Wang is funded by the National Natural Science Foundation of China (No. 61865012) for the study and the publication.