Discrete and Parameter-Free Multiple Kernel k-Means

IEEE Trans Image Process. 2022;31:2796-2808. doi: 10.1109/TIP.2022.3141612. Epub 2022 Apr 5.

Abstract

The multiple kernel k-means (MKKM) algorithm and its variants exploit complementary information from different sources, achieving better performance than kernel k-means (KKM). However, the optimization procedures of most previous works comprise two stages: learning a continuous relaxation matrix and then obtaining the discrete assignment matrix through an extra discretization procedure. Such a two-stage strategy gives rise to a mismatch between the two stages and severe information loss. Even worse, most existing MKKM methods overlook the correlation among the prespecified kernels, which leads to the fusion of mutually redundant kernels, harms the diversity of the information sources, and ultimately produces unsatisfactory results. To address these issues, we develop a novel Discrete and Parameter-free Multiple Kernel k-means (DPMKKM) model solved by an alternating optimization method, which directly obtains the cluster assignments without a subsequent discretization procedure. Moreover, DPMKKM measures the correlation among kernels by implicitly introducing a regularization term, which enhances kernel fusion by reducing redundancy and improving diversity. Notably, the time complexity of the optimization algorithm is reduced by exploiting a coordinate descent technique, which yields higher efficiency and broader applicability. Furthermore, the proposed model is parameter-free, avoiding intractable hyperparameter tuning and making it feasible in practical applications. Finally, extensive experiments on a number of real-world datasets illustrate the effectiveness and superiority of the proposed DPMKKM model.
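
For readers unfamiliar with the setting, the following minimal Python sketch illustrates the two ingredients the abstract contrasts: fusing several base kernels into one combined kernel, and clustering with discrete kernel k-means label updates. It is only an illustration under simplifying assumptions (fixed uniform kernel weights and a plain batch reassignment rule); it is not the DPMKKM model, its kernel-correlation regularization, or its coordinate descent solver.

import numpy as np

def combined_kernel(kernels, weights=None):
    # Fuse base kernels with a convex combination (uniform weights by default).
    kernels = np.asarray(kernels, dtype=float)        # shape (m, n, n)
    if weights is None:
        weights = np.full(len(kernels), 1.0 / len(kernels))
    return np.tensordot(weights, kernels, axes=1)     # weighted sum -> (n, n)

def kernel_kmeans(K, n_clusters, n_iter=100, seed=0):
    # Discrete kernel k-means: reassign labels using feature-space distances.
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    labels = rng.integers(n_clusters, size=n)
    diag = np.diag(K)
    for _ in range(n_iter):
        dist = np.empty((n, n_clusters))
        for c in range(n_clusters):
            idx = np.where(labels == c)[0]
            if idx.size == 0:                         # re-seed an empty cluster
                idx = rng.integers(n, size=1)
            Kc = K[:, idx]
            # ||phi(x_i) - mu_c||^2 = K_ii - 2*mean_j K_ij + mean_{j,l} K_jl
            dist[:, c] = diag - 2 * Kc.mean(axis=1) + K[np.ix_(idx, idx)].mean()
        new_labels = dist.argmin(axis=1)
        if np.array_equal(new_labels, labels):        # stop when assignments are stable
            break
        labels = new_labels
    return labels

# Toy usage: two Gaussian blobs, two RBF base kernels with different bandwidths.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.5, (30, 2)), rng.normal(3, 0.5, (30, 2))])
sq = ((X[:, None] - X[None, :]) ** 2).sum(-1)
kernels = [np.exp(-sq / (2 * s ** 2)) for s in (0.5, 2.0)]
labels = kernel_kmeans(combined_kernel(kernels), n_clusters=2)
print(labels)

The sketch makes the limitation discussed in the abstract concrete: here the kernel weights are fixed and chosen by hand, whereas DPMKKM learns the fusion while accounting for inter-kernel correlation and keeps the assignment variables discrete throughout the optimization.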