Texture-Independent Long-Term Tracking Using Virtual Corners

IEEE Trans Image Process. 2016 Jan;25(1):359-71. doi: 10.1109/TIP.2015.2497141. Epub 2015 Nov 2.

Abstract

Long-term tracking of an object, given only a single instance in an initial frame, remains an open problem. We propose a visual tracking algorithm that is robust to many of the difficulties that often occur in real-world scenes. Correspondences of edge-based features are used to overcome the reliance on the texture of the tracked object and to improve invariance to lighting. Furthermore, we address long-term stability, enabling the tracker to recover from drift and to provide redetection following object disappearance or occlusion. The two-module principle is similar to that of the successful state-of-the-art long-term TLD tracker; however, our approach offers better performance on benchmarks and extends to low-textured objects. This is most evident for plain objects with no texture at all, where the edge-based approach proves most beneficial. We perform several experiments to validate the proposed method. First, results on short-term sequences show the performance of tracking challenging (low-textured and/or transparent) objects that represent failure cases for competing state-of-the-art approaches. Second, long sequences are tracked, including one of almost 30 000 frames, which, to the best of our knowledge, is the longest tracking sequence reported to date; this tests the redetection and drift-resistance properties of the tracker. Finally, we report the results of the proposed tracker on the VOT Challenge 2013 and 2014 data sets as well as on the VTB1.0 benchmark, showing its performance relative to its competitors. All the results are comparable with the state of the art on sequences with textured objects and superior on non-textured objects. The new annotated sequences are made publicly available.
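The two-module principle mentioned above (a frame-to-frame tracker paired with a redetection module, in the spirit of TLD) can be pictured with a short control-loop sketch. The Python code below is an illustrative outline only, not the authors' implementation: it uses Canny edge maps with template matching as a crude stand-in for the paper's edge-based feature correspondences, and the search margin and confidence threshold are assumed values.

# Illustrative sketch only (not the authors' method): a generic
# two-module long-term tracking loop, i.e. short-term tracking plus
# whole-frame redetection, operating on edge maps rather than texture.
import cv2

def edge_map(gray):
    # Edge representation: reduces dependence on texture and lighting.
    return cv2.Canny(gray, 50, 150)

def match_in_region(template, region):
    # Normalized cross-correlation between edge maps; returns the best
    # top-left location (x, y) inside `region` and its matching score.
    res = cv2.matchTemplate(region, template, cv2.TM_CCORR_NORMED)
    _, score, _, loc = cv2.minMaxLoc(res)
    return loc, score

def long_term_track(video_path, init_bbox, conf_thresh=0.5, margin=40):
    x, y, w, h = init_bbox                                # (x, y, width, height)
    cap = cv2.VideoCapture(video_path)
    ok, frame = cap.read()
    if not ok:
        return []

    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    template = edge_map(gray[y:y + h, x:x + w])           # initial object model
    trajectory = [init_bbox]

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        edges = edge_map(cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY))

        # Module 1: short-term tracking in a window around the last position.
        x0, y0 = max(0, x - margin), max(0, y - margin)
        x1 = min(edges.shape[1], x + w + margin)
        y1 = min(edges.shape[0], y + h + margin)
        loc, score = match_in_region(template, edges[y0:y1, x0:x1])
        x, y = x0 + loc[0], y0 + loc[1]

        # Module 2: redetection over the whole frame after drift or occlusion.
        if score < conf_thresh:
            loc, score = match_in_region(template, edges)
            x, y = loc

        trajectory.append((x, y, w, h) if score >= conf_thresh else None)

    cap.release()
    return trajectory

Running the detector only when the tracker's confidence drops keeps the loop cheap while still allowing recovery after full occlusion or disappearance, which is the property stressed by the redetection and drift experiments, including the almost 30 000-frame sequence.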

Publication types

  • Research Support, Non-U.S. Gov't