SiamDF: Tracking training data-free siamese tracker

Neural Netw. 2023 Aug:165:705-720. doi: 10.1016/j.neunet.2023.06.012. Epub 2023 Jun 15.

Abstract

Much progress has been made in siamese tracking, driven primarily by increasingly large training datasets. However, little attention has been paid to the role this huge training data actually plays in learning an effective siamese tracker. In this study, we analyze this issue in depth from a novel optimization perspective and observe that training data mainly contributes to background suppression, thereby refining the target representation. Inspired by this insight, we present a data-free siamese tracking algorithm named SiamDF, which requires only a pre-trained backbone and no further fine-tuning on additional training data. In particular, to suppress background distractors, we improve the two branches of siamese tracking separately: we remove the template background so that only the pure target region serves as the template input, and we explore an efficient inverse transformation that keeps the aspect ratio of the target state constant in the search region. In addition, we improve the center displacement prediction of the entire backbone by eliminating the spatial stride deviations caused by convolution-like quantization operations. Experimental results on several popular benchmarks demonstrate that SiamDF, free from both offline fine-tuning and online updating, achieves impressive performance compared with well-established unsupervised and supervised tracking methods.
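The abstract only summarizes these mechanisms, so the sketch below is a hypothetical illustration (not the authors' code) of two of them under common siamese-tracking conventions: cropping only the annotated target box as the template so no background enters the exemplar branch, and cropping the search region with a single scale factor so the target's aspect ratio is never distorted. The function names, the context factor, and the 255-pixel output size are assumptions for illustration.

```python
# Hypothetical sketch of two ideas named in the abstract; conventions are assumed.
import numpy as np
import cv2  # assumed available for resizing


def crop_pure_template(frame: np.ndarray, box):
    """Crop exactly the annotated target box (x, y, w, h in pixels).

    No context padding is added, so no template background reaches the
    exemplar branch -- the 'pure target region' idea from the abstract.
    """
    x, y, w, h = [int(round(v)) for v in box]
    return frame[y:y + h, x:x + w]


def crop_search_keep_aspect(frame: np.ndarray, box, context=2.0, out_size=255):
    """Crop a square search region around the target and resize it uniformly.

    Because width and height are scaled by the same factor, the target's
    aspect ratio inside the resized search patch stays constant. The context
    factor of 2.0 and the 255-px output size are assumptions, not the paper's
    settings.
    """
    x, y, w, h = box
    cx, cy = x + w / 2.0, y + h / 2.0
    side = context * max(w, h)                     # one side -> uniform scaling
    x0, y0 = int(round(cx - side / 2)), int(round(cy - side / 2))
    x1, y1 = int(round(cx + side / 2)), int(round(cy + side / 2))
    H, W = frame.shape[:2]
    # Pad with the mean color where the square crop exceeds the frame border.
    pad = np.full((y1 - y0, x1 - x0, frame.shape[2]),
                  frame.mean(axis=(0, 1)), dtype=frame.dtype)
    sx0, sy0 = max(0, x0), max(0, y0)
    sx1, sy1 = min(W, x1), min(H, y1)
    pad[sy0 - y0:sy1 - y0, sx0 - x0:sx1 - x0] = frame[sy0:sy1, sx0:sx1]
    scale = out_size / side                        # same factor for both axes
    return cv2.resize(pad, (out_size, out_size)), scale
```

Because both crops apply a single, isotropic scale, the target is never stretched anisotropically, which appears to be the property the abstract's "constant aspect ratio" point refers to; how SiamDF actually implements the inverse transformation and the stride-deviation correction is detailed in the paper itself.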

Keywords: Pre-training; Scale estimation; Sharing computation; Siamese tracking; Tracking training data-free.

MeSH terms

  • Algorithms*
  • Benchmarking
  • Learning*