Learning Sparse and Identity-Preserved Hidden Attributes for Person Re-Identification

IEEE Trans Image Process. 2020;29(1):2013-2025. doi: 10.1109/TIP.2019.2946975. Epub 2019 Oct 17.

Abstract

Person re-identification (Re-ID) aims at matching person images captured in non-overlapping camera views. To represent person appearance, low-level visual features are sensitive to environmental changes, while high-level semantic attributes, such as "short-hair" or "long-hair", are relatively stable. Hence, researchers have started to design semantic attributes to reduce visual ambiguity. However, training a prediction model for semantic attributes requires a large amount of annotation, which is hard to obtain in practical large-scale applications. To alleviate the reliance on annotation efforts, we propose to incrementally generate Deep Hidden Attributes (DHAs) on top of a baseline deep network, serving as newly discovered attributes that require no manual annotation. In particular, we propose an auto-encoder model that can be plugged into any deep network to mine latent information in an unsupervised manner. To optimize the effectiveness of DHAs, we extend the auto-encoder model with an orthogonal generation module, along with identity-preserving and sparsity constraints. 1) Orthogonal generation: to make the DHAs differ from each other, Singular Value Decomposition (SVD) is introduced to generate DHAs orthogonally. 2) Identity-preserving constraint: the generated DHAs should be discriminative enough to tell different persons apart, so we associate DHAs with person identities. 3) Sparsity constraint: to further enhance the discriminability of DHAs, we introduce a sparsity constraint that restricts the number of effective DHAs for each person. Experiments conducted on public datasets validate the effectiveness of the proposed network. On two large-scale datasets, i.e., Market-1501 and DukeMTMC-reID, the proposed method outperforms state-of-the-art methods.
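For illustration, the minimal PyTorch sketch below shows one way such a plug-in head could combine the pieces named in the abstract: an auto-encoder on backbone features, an SVD-based orthogonalization of the attribute projection, an identity-preserving classification loss, and an L1 sparsity term on the DHA responses. All concrete details here (feature dimension, number of DHAs, 751 identities as in Market-1501 training, loss forms, and weights) are assumptions for the sketch, not the paper's exact formulation.

import torch
import torch.nn as nn
import torch.nn.functional as F


class DHAAutoEncoder(nn.Module):
    """Auto-encoder plug-in mapping backbone features to Deep Hidden Attributes (DHAs)."""

    def __init__(self, feat_dim=2048, num_dha=128, num_ids=751):
        super().__init__()
        self.encoder = nn.Linear(feat_dim, num_dha)   # features -> hidden attribute responses
        self.decoder = nn.Linear(num_dha, feat_dim)   # reconstruction branch (unsupervised)
        self.id_head = nn.Linear(num_dha, num_ids)    # identity-preserving classifier

    def forward(self, feats):
        dha = torch.sigmoid(self.encoder(feats))      # attribute-like responses in [0, 1]
        recon = self.decoder(dha)
        id_logits = self.id_head(dha)
        return dha, recon, id_logits


def orthogonalize(weight):
    """Project the encoder weight onto an orthogonal basis via SVD (U Vh),
    so that the projection directions generating the DHAs stay mutually orthogonal."""
    u, _, vh = torch.linalg.svd(weight, full_matrices=False)
    return u @ vh


def dha_loss(feats, labels, model, lambda_sparse=0.01):
    """Reconstruction + identity-preserving + sparsity objectives (illustrative weights)."""
    dha, recon, id_logits = model(feats)
    loss_recon = F.mse_loss(recon, feats)             # unsupervised auto-encoder term
    loss_id = F.cross_entropy(id_logits, labels)      # identity-preserving constraint
    loss_sparse = dha.abs().mean()                    # L1 sparsity on DHA responses
    return loss_recon + loss_id + lambda_sparse * loss_sparse


if __name__ == "__main__":
    model = DHAAutoEncoder()
    feats = torch.randn(8, 2048)                      # e.g., pooled ResNet-50 features
    labels = torch.randint(0, 751, (8,))
    loss = dha_loss(feats, labels, model)
    loss.backward()
    with torch.no_grad():                             # re-orthogonalize (in practice, after each optimizer step)
        model.encoder.weight.copy_(orthogonalize(model.encoder.weight))

Re-projecting the encoder weight onto U Vh after each update is one simple way to keep the generated DHAs orthogonal to one another, in the spirit of the SVD-based orthogonal generation described above.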

MeSH terms

  • Biometric Identification / methods*
  • Deep Learning*
  • Female
  • Humans
  • Image Processing, Computer-Assisted / methods*
  • Male