Large-Scale Metric Learning: A Voyage From Shallow to Deep

IEEE Trans Neural Netw Learn Syst. 2018 Sep;29(9):4339-4346. doi: 10.1109/TNNLS.2017.2761773. Epub 2017 Nov 7.

Abstract

Despite its attractive properties, the performance of the recently introduced Keep It Simple and Straightforward MEtric learning (KISSME) method depends heavily on principal component analysis (PCA) as a preprocessing step. This dependence can lead to difficulties, e.g., when the target dimensionality of the PCA projection is not chosen carefully. To address this issue, we devise a unified formulation for joint dimensionality reduction and metric learning based on the KISSME algorithm. Our joint formulation is expressed as an optimization problem on the Grassmann manifold, and hence enjoys the properties of Riemannian optimization techniques. Following the success of deep learning in recent years, we also devise end-to-end learning of a generic deep network for metric learning using our derivation.
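For context, the sketch below illustrates the shallow baseline that the abstract refers to: standard KISSME applied after a PCA projection, where the Mahalanobis matrix is obtained from the pairwise-difference covariances of similar and dissimilar pairs as M = Sigma_sim^{-1} - Sigma_dis^{-1}. The function names, the `n_components` parameter, and the synthetic interface are assumptions made for illustration only; the paper's joint Grassmannian formulation and its deep end-to-end variant are not reproduced here.

```python
# Minimal sketch (assumed interface): KISSME with a PCA preprocessing step.
# The choice of `n_components` is exactly the hyperparameter whose sensitivity
# motivates the paper's joint dimensionality-reduction/metric-learning approach.
import numpy as np

def kissme_with_pca(X, pairs, labels, n_components=40, eps=1e-6):
    """Learn a KISSME Mahalanobis matrix on PCA-reduced features.

    X       : (n_samples, d) feature matrix
    pairs   : (n_pairs, 2) indices of sample pairs
    labels  : (n_pairs,) 1 for similar pairs, 0 for dissimilar pairs
    """
    # --- PCA preprocessing (the step KISSME depends on) ---
    mu = X.mean(axis=0)
    Xc = X - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    P = Vt[:n_components]            # (n_components, d) projection matrix
    Z = Xc @ P.T                     # reduced features

    # --- KISSME: covariances of pairwise differences ---
    diffs = Z[pairs[:, 0]] - Z[pairs[:, 1]]
    sim, dis = diffs[labels == 1], diffs[labels == 0]
    cov_sim = sim.T @ sim / max(len(sim), 1) + eps * np.eye(n_components)
    cov_dis = dis.T @ dis / max(len(dis), 1) + eps * np.eye(n_components)

    # M = Sigma_sim^{-1} - Sigma_dis^{-1}, projected back onto the PSD cone
    M = np.linalg.inv(cov_sim) - np.linalg.inv(cov_dis)
    w, V = np.linalg.eigh(M)
    M = V @ np.diag(np.clip(w, 0.0, None)) @ V.T
    return mu, P, M

def kissme_distance(x, y, mu, P, M):
    """Squared KISSME distance between two raw feature vectors."""
    d = ((x - mu) @ P.T) - ((y - mu) @ P.T)
    return float(d @ M @ d)
```

In this two-stage pipeline the projection P is fixed before the metric is learned; the paper's contribution is to optimize the projection and the KISSME-style metric jointly, treating the subspace as a point on the Grassmann manifold.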