Localized Multiple Kernel Learning With Dynamical Clustering and Matrix Regularization

IEEE Trans Neural Netw Learn Syst. 2018 Feb;29(2):486-499. doi: 10.1109/TNNLS.2016.2635151. Epub 2016 Dec 20.

Abstract

Localized multiple kernel learning (LMKL) is an attractive strategy for combining multiple heterogeneous features with regard to their discriminative power for each individual sample. However, learning numerous local solutions may not scale well even for a moderately sized training set, and the independently learned local models may suffer from overfitting. Hence, in existing local methods, groups of samples are typically assumed to share the same kernel weights, and various unsupervised clustering methods are applied as a preprocessing step. In this paper, to enable the learner to discover and benefit from the underlying local coherence and diversity of the samples, we incorporate the clustering procedure into the canonical support vector machine-based LMKL framework. Then, to explore the relatedness among different samples, which has been ignored in a vector ℓp-norm analysis, we organize the cluster-specific kernel weights into a matrix and introduce a matrix-based extension of the ℓp-norm for constraint enforcement. By casting the joint learning task as an alternating optimization problem, we show how the cluster structure is gradually revealed and how the matrix-regularized kernel weights are obtained. A theoretical analysis of such a regularizer is performed using a Rademacher complexity bound, and complementary experiments on real-world data sets demonstrate the effectiveness of our technique.
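To make the idea of cluster-specific kernel weights under a matrix norm concrete, the following is a minimal Python sketch, not the authors' algorithm: k-means stands in for the paper's joint (dynamical) clustering, a fixed ℓ(p,q) mixed norm with p = q = 2 stands in for the matrix regularizer, and the helper names (mixed_norm, combine_kernels) and base-kernel choices are illustrative assumptions. It only shows how a cluster-by-kernel weight matrix can be rescaled onto a mixed-norm ball and used to build a locally combined kernel for an SVM with a precomputed Gram matrix.

import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import make_moons
from sklearn.metrics.pairwise import rbf_kernel, polynomial_kernel
from sklearn.svm import SVC

def mixed_norm(theta, p=2.0, q=2.0):
    # l_{p,q} norm of the cluster-by-kernel weight matrix:
    # p-norm over kernels within each cluster row, then q-norm across rows.
    row_norms = np.sum(np.abs(theta) ** p, axis=1) ** (1.0 / p)
    return np.sum(row_norms ** q) ** (1.0 / q)

def combine_kernels(kernels, theta, labels):
    # Locally combined kernel in the usual LMKL form:
    # K(i, j) = sum_m theta[c(i), m] * K_m(i, j) * theta[c(j), m],
    # where c(i) is the cluster of sample i.
    W = theta[labels]                      # per-sample weights, shape (n_samples, n_kernels)
    K = np.zeros_like(kernels[0])
    for m, Km in enumerate(kernels):
        K += np.outer(W[:, m], W[:, m]) * Km
    return K

# Toy data and heterogeneous base kernels (illustrative choices).
X, y = make_moons(n_samples=300, noise=0.2, random_state=0)
kernels = [rbf_kernel(X, gamma=g) for g in (0.5, 2.0, 8.0)]
kernels.append(polynomial_kernel(X, degree=2))

# Unsupervised clustering as a stand-in for the paper's clustering step.
n_clusters = 4
labels = KMeans(n_clusters=n_clusters, n_init=10, random_state=0).fit_predict(X)

# Cluster-specific kernel weights, rescaled onto the mixed-norm ball ||Theta||_{p,q} <= 1.
theta = np.random.default_rng(0).uniform(size=(n_clusters, len(kernels)))
theta /= mixed_norm(theta, p=2.0, q=2.0)

# Train an SVM on the precomputed, locally combined kernel.
K = combine_kernels(kernels, theta, labels)
clf = SVC(kernel="precomputed", C=1.0).fit(K, y)
print("training accuracy:", clf.score(K, y))

In the paper, the weight matrix and the cluster assignments are themselves optimized jointly with the SVM via alternating optimization rather than fixed in advance as they are in this sketch.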

Publication types

  • Research Support, Non-U.S. Gov't