Learning Multiple Parameters for Kernel Collaborative Representation Classification

IEEE Trans Neural Netw Learn Syst. 2020 Dec;31(12):5068-5078. doi: 10.1109/TNNLS.2019.2962878. Epub 2020 Nov 30.

Abstract

In this article, the problem of automatically learning multiple parameters for kernel collaborative representation classification (KCRC) is considered. We investigate KCRC and measure its generalization error via leave-one-out cross-validation (LOO-CV). By exploiting the specific structure of KCRC, a closed-form expression is derived for the LOO-CV outputs. A simple classification rule that provides probabilistic outputs is then adopted, yielding an effective loss function, explicit in the parameters, that serves as the generalization error. The gradients of this loss function are calculated, and the parameters are learned by minimizing it with a gradient-based optimization algorithm. Furthermore, the proposed approach makes it possible to solve multiple kernel/feature learning problems for KCRC effectively. Experimental results on six data sets from different scenarios demonstrate the effectiveness of the proposed approach.
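
To make the ingredients named in the abstract concrete, the following is a minimal NumPy sketch, not the authors' implementation. It assumes a Gaussian (RBF) kernel with parameter gamma, an l2-regularized collaborative representation objective with regularizer lam, and a generic ridge-regression-style closed-form LOO shortcut applied to one-hot class indicators as a stand-in for the paper's exact closed-form LOO-CV expression for KCRC; all function and variable names (rbf_kernel, kcrc_classify, loo_cv_error) are illustrative placeholders.

```python
import numpy as np

def rbf_kernel(A, B, gamma):
    """Gaussian (RBF) kernel matrix between the rows of A and the rows of B."""
    sq = np.sum(A**2, 1)[:, None] + np.sum(B**2, 1)[None, :] - 2.0 * A @ B.T
    return np.exp(-gamma * sq)

def kcrc_classify(X_train, y_train, X_test, gamma=1.0, lam=1e-2):
    """Generic kernel collaborative representation classifier (sketch).

    Each test sample is coded over all training samples in the kernel-induced
    feature space via l2-regularized least squares; the label of the class with
    the smallest class-wise reconstruction residual is returned.
    """
    K = rbf_kernel(X_train, X_train, gamma)           # training Gram matrix
    k_test = rbf_kernel(X_train, X_test, gamma)       # kernel vectors of test samples
    n = K.shape[0]
    A = np.linalg.solve(K + lam * np.eye(n), k_test)  # coding coefficients (one column per test sample)

    classes = np.unique(y_train)
    residuals = np.zeros((len(classes), X_test.shape[0]))
    for i, c in enumerate(classes):
        idx = (y_train == c)
        Ac = A[idx]                                   # coefficients restricted to class c
        Kc = K[np.ix_(idx, idx)]
        # squared residual in feature space: k(y, y) - 2 k_c^T a_c + a_c^T K_cc a_c
        kyy = 1.0                                     # for the RBF kernel, k(y, y) = 1
        residuals[i] = kyy - 2.0 * np.sum(k_test[idx] * Ac, 0) + np.sum(Ac * (Kc @ Ac), 0)
    return classes[np.argmin(residuals, axis=0)]

def loo_cv_error(X, y, gamma, lam):
    """LOO error estimated without refitting, via the ridge-style identity
    f_loo_i = (f_i - H_ii y_i) / (1 - H_ii) with H = K (K + lam I)^{-1},
    applied column-wise to one-hot class indicators (an illustrative proxy
    for the paper's closed-form LOO-CV outputs)."""
    K = rbf_kernel(X, X, gamma)
    n = K.shape[0]
    H = K @ np.linalg.inv(K + lam * np.eye(n))
    classes, y_idx = np.unique(y, return_inverse=True)
    Y = np.eye(len(classes))[y_idx]                   # one-hot targets
    F = H @ Y                                         # in-sample class scores
    h = np.diag(H)[:, None]
    F_loo = (F - h * Y) / (1.0 - h)                   # leave-one-out class scores
    return np.mean(classes[np.argmax(F_loo, 1)] != y)
```

The sketch scores parameters by the (non-smooth) 0-1 LOO error; as described in the abstract, the article instead adopts a probabilistic classification rule so that the loss is an explicit, differentiable function of the parameters, and learns them with a gradient-based optimizer, which is also what allows multiple kernel or feature weights to be tuned jointly rather than by grid search.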