NCSiam: Reliable Matching via Neighborhood Consensus for Siamese-Based Object Tracking

IEEE Trans Image Process. 2023;32:6168-6182. doi: 10.1109/TIP.2023.3329669. Epub 2023 Nov 14.

Abstract

An essential requirement for accurate visual object tracking is capturing reliable correlations between the tracking target and the search region. However, the dominant Siamese-based trackers are limited to producing dense similarity maps in one shot via a cross-correlation operation, failing to remedy the contamination caused by erroneous or ambiguous matches. In this paper, we propose a novel tracker, termed the neighborhood consensus constraint-based Siamese tracker (NCSiam), which adopts the idea of a neighborhood consensus constraint to refine the produced correlation maps. The intuition behind our approach is that nearby erroneous or ambiguous matches can be supported by analyzing a larger context of the scene that contains a unique match. Specifically, we devise a 4D convolution-based multi-level similarity refinement (MLSR) strategy. Taking the primary similarity maps obtained from cross-correlation as input, MLSR acquires reliable matches by analyzing neighborhood consensus patterns in 4D space, thus enhancing the discriminability between the tracking target and distractors. In addition, traditional Siamese-based trackers perform classification and regression directly on similarity response maps, which discards appearance and semantic information. Therefore, an appearance affinity decoder (AAD) is developed to take full advantage of the semantic information of the search region. To further improve performance, we design a task-specific disentanglement (TSD) module to decouple the learned representations into classification-specific and regression-specific embeddings. Extensive experiments are conducted on six challenging benchmarks, including GOT-10k, TrackingNet, LaSOT, UAV123, OTB2015, and VOT2020. The results demonstrate the effectiveness of our method. The code will be available at https://github.com/laybebe/NCSiam.
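
To make the core idea concrete, the sketch below builds a dense 4D cross-correlation volume between template and search features and refines it with a neighborhood-consensus filter. This is only an illustrative approximation under assumed shapes: the `SeparableNC4D` module, kernel sizes, and feature dimensions are hypothetical, and the separable pair of 2D convolutions stands in for a true 4D convolution; it is not the paper's MLSR implementation, which will be released at the repository above.

```python
# Illustrative sketch (not the authors' code): cross-correlation volume plus a
# separable stand-in for 4D neighborhood-consensus filtering. Shapes, layer
# sizes, and module names are assumptions for demonstration only.
import torch
import torch.nn as nn
import torch.nn.functional as F


def cross_correlation(z, x):
    """Dense similarity between template features z (B, C, Hz, Wz) and
    search features x (B, C, Hx, Wx) -> correlation volume (B, Hz, Wz, Hx, Wx)."""
    B, C, Hz, Wz = z.shape
    _, _, Hx, Wx = x.shape
    z = z.flatten(2)                                   # (B, C, Hz*Wz)
    x = x.flatten(2)                                   # (B, C, Hx*Wx)
    corr = torch.einsum('bci,bcj->bij', z, x) / C ** 0.5
    return corr.view(B, Hz, Wz, Hx, Wx)


class SeparableNC4D(nn.Module):
    """Separable approximation of a 4D convolution: filter over the search
    dimensions (Hx, Wx) and then the template dimensions (Hz, Wz), so a match
    is reinforced only when its 4D neighborhood agrees (consensus)."""
    def __init__(self, k=3):
        super().__init__()
        self.conv_search = nn.Conv2d(1, 1, k, padding=k // 2)
        self.conv_template = nn.Conv2d(1, 1, k, padding=k // 2)

    def forward(self, corr):                           # (B, Hz, Wz, Hx, Wx)
        B, Hz, Wz, Hx, Wx = corr.shape
        # Smooth over search-space neighborhoods for each template location.
        c = corr.reshape(B * Hz * Wz, 1, Hx, Wx)
        c = F.relu(self.conv_search(c)).view(B, Hz, Wz, Hx, Wx)
        # Smooth over template-space neighborhoods for each search location.
        c = c.permute(0, 3, 4, 1, 2).reshape(B * Hx * Wx, 1, Hz, Wz)
        c = F.relu(self.conv_template(c)).view(B, Hx, Wx, Hz, Wz)
        return c.permute(0, 3, 4, 1, 2)                # (B, Hz, Wz, Hx, Wx)


if __name__ == "__main__":
    z = torch.randn(1, 256, 8, 8)                      # template features (assumed size)
    x = torch.randn(1, 256, 16, 16)                    # search-region features (assumed size)
    refined = SeparableNC4D()(cross_correlation(z, x))
    print(refined.shape)                               # torch.Size([1, 8, 8, 16, 16])
```

The refined volume can then be collapsed over the template dimensions to obtain a cleaner 2D response map for classification and regression; the paper's AAD and TSD modules, which reintroduce appearance information and decouple the two task heads, are not reproduced here.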