Efficient and Robust Learning for Sustainable and Reacquisition-Enabled Hand Tracking

IEEE Trans Cybern. 2016 Apr;46(4):945-58. doi: 10.1109/TCYB.2015.2418275. Epub 2015 Apr 17.

Abstract

The use of machine learning approaches for long-term hand tracking poses major challenges, such as attaining robustness to inconsistencies in lighting, scale, and object appearance; background clutter; and total object occlusion/disappearance. To address these issues, in this paper we present a robust machine learning approach based on enhanced particle filter trackers. The inherent drawbacks of the particle filter approach, i.e., sample degeneration and sample impoverishment, are minimized by infusing the particle filter with the mean shift approach. Moreover, to instill our tracker with reacquisition ability, we propose a rotation-invariant and efficient detection framework named beta histograms of oriented gradients. Our robust appearance model operates on the red, green, blue color histogram and our newly proposed rotation-invariant noise-compensated local binary patterns descriptor, a noise-compensated, rotation-invariant version of the local binary patterns descriptor. Through our experiments, we demonstrate that our proposed hand tracker performs favorably against state-of-the-art algorithms on numerous challenging video sequences of hand postures, and overcomes the largely unsolved problem of redetecting hands after they vanish from and reappear in the frame.
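The rotation invariance of the proposed descriptor builds on a standard idea from the local binary patterns literature: map each 8-bit LBP code to the minimum value over all circular bit rotations, so that rotating the image patch leaves the code unchanged. The sketch below illustrates that idea only; the function names are illustrative, and the paper's descriptor additionally incorporates noise compensation, which is not shown here.

```python
def lbp_code(patch):
    """Basic 8-neighbour LBP code for a 3x3 patch (list of 3 rows).

    Each neighbour contributes one bit: 1 if its intensity is >= the
    centre pixel's intensity, 0 otherwise.
    """
    center = patch[1][1]
    # Clockwise neighbour order starting at the top-left corner.
    neighbours = [patch[0][0], patch[0][1], patch[0][2],
                  patch[1][2], patch[2][2], patch[2][1],
                  patch[2][0], patch[1][0]]
    code = 0
    for i, n in enumerate(neighbours):
        if n >= center:
            code |= 1 << i
    return code


def rotation_invariant(code, bits=8):
    """Map an LBP code to its minimum over all circular bit rotations.

    All rotated versions of the same neighbour pattern collapse to one
    canonical code, giving rotation invariance of the descriptor.
    """
    best = code
    for _ in range(bits - 1):
        # Rotate right by one bit within a `bits`-wide word.
        code = ((code >> 1) | ((code & 1) << (bits - 1))) & ((1 << bits) - 1)
        best = min(best, code)
    return best
```

For example, the codes `0b00000011` and `0b00000110` describe the same two-neighbour pattern rotated by one position, and both map to the same canonical value under `rotation_invariant`.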

Publication types

  • Research Support, Non-U.S. Gov't