Free-view gait recognition

PLoS One. 2019 Apr 16;14(4):e0214389. doi: 10.1371/journal.pone.0214389. eCollection 2019.

Abstract

Human gait has been shown to be an effective biometric measure for person identification at a distance. However, changes in the view angle pose a major challenge for gait recognition, because human gait silhouettes typically look different from different view angles. Traditionally, such a multi-view gait recognition problem is tackled by a View Transformation Model (VTM), which transforms gait features from multiple gallery views to the probe view so that gait similarity can be evaluated. In a real-world environment, however, gait sequences may be captured in an uncontrolled scene where the view angle is unknown, changes dynamically, or does not belong to any predefined view, rendering the VTM inapplicable. To address this free-view gait recognition problem, we propose a view-adaptive mapping (VAM) approach. The VAM employs a novel walking trajectory fitting (WTF) scheme to estimate the view angles of a gait sequence, and a joint gait manifold (JGM) to find the optimal manifold between the probe data and the relevant gallery data for gait similarity evaluation. Additionally, a RankSVM-based algorithm is developed to supplement the gallery data for subjects whose gallery features are available only in predefined views. Extensive experiments on both indoor and outdoor datasets demonstrate that the VAM substantially outperforms several reference methods in free-view gait recognition.
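The abstract's core idea of estimating a view angle from a fitted walking trajectory can be illustrated with a minimal sketch. The snippet below is not the paper's actual WTF algorithm (whose details are not given here); it simply assumes per-frame ground-plane positions of the subject are available (e.g., from tracked silhouette centroids), fits a linear trajectory over time, and reports the angle between the walking direction and the camera's optical axis.

```python
import numpy as np

def estimate_view_angle(positions):
    """Estimate a walking-direction view angle in degrees.

    Illustrative sketch only: `positions` is an (N, 2) sequence of
    per-frame ground-plane coordinates (x, z), where z points along
    the camera's optical axis. A straight-line trajectory is fitted
    over time, and the angle between the walking direction and the
    optical axis is returned in [0, 360).
    """
    positions = np.asarray(positions, dtype=float)
    t = np.arange(len(positions))
    # Slope of each coordinate over time gives the per-frame displacement.
    dx = np.polyfit(t, positions[:, 0], 1)[0]
    dz = np.polyfit(t, positions[:, 1], 1)[0]
    # arctan2 handles all quadrants, including walking parallel to the image plane.
    return np.degrees(np.arctan2(dx, dz)) % 360.0
```

For example, a subject walking straight toward the camera yields an angle near 0°, while one crossing the scene laterally yields an angle near 90°; a dynamically changing view would be handled by applying such a fit over a sliding window of frames.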

Publication types

  • Research Support, Non-U.S. Gov't

MeSH terms

  • Algorithms
  • Biometry
  • Facial Recognition / physiology
  • Gait / physiology*
  • Gait Analysis / methods*
  • Humans
  • Pattern Recognition, Automated*
  • Walking / physiology*

Grants and funding

This study was partially supported by grants from the National Basic Research Program of China under Grant 2015CB351806, the National Natural Science Foundation of China under Contracts No. U1611461 and No. 61825101, and the Shenzhen Municipal Science and Technology Program under Grant JCYJ20170818141146428. No additional external funding was received for this study.