Multikernel Correntropy for Robust Learning

IEEE Trans Cybern. 2022 Dec;52(12):13500-13511. doi: 10.1109/TCYB.2021.3110732. Epub 2022 Nov 18.

Abstract

As a similarity measure defined as the expectation of a kernel function between two random variables, correntropy has been applied successfully in robust machine learning and signal processing to combat large outliers. The kernel function in correntropy is usually a zero-mean Gaussian kernel. In a recent work, the concept of mixture correntropy (MC) was proposed to improve learning performance; there, the kernel function is a mixture Gaussian kernel, that is, a linear combination of several zero-mean Gaussian kernels with different widths. In both correntropy and MC, however, the kernel function is always centered at zero. In the present work, to further improve learning performance, we propose the concept of multikernel correntropy (MKC), in which each component of the mixture Gaussian kernel can be centered at a different location. The properties of MKC are investigated, and an efficient approach is proposed to determine its free parameters. Experimental results show that learning algorithms under the maximum MKC criterion (MMKCC) can outperform those under the original maximum correntropy criterion (MCC) and the maximum MC criterion (MMCC).
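To make the quantities described above concrete, the following sketch (Python with NumPy) implements plug-in sample estimators of correntropy, mixture correntropy, and a multikernel correntropy in which each Gaussian component has its own width and center. The weights, widths, and centers shown are placeholder values chosen for illustration only; they are not the parameters selected by the paper's tuning approach, and the Gaussian kernels omit the usual normalization constant, as is common in correntropy-based criteria.

    import numpy as np

    def gaussian_kernel(e, sigma):
        # Zero-mean Gaussian kernel of the error e with width sigma
        # (normalization constant omitted, as is common in correntropy).
        return np.exp(-e**2 / (2.0 * sigma**2))

    def correntropy(x, y, sigma=1.0):
        # Sample estimate of correntropy: mean Gaussian kernel of the error x - y.
        e = np.asarray(x) - np.asarray(y)
        return np.mean(gaussian_kernel(e, sigma))

    def mixture_correntropy(x, y, weights, sigmas):
        # Mixture correntropy: linear combination of zero-mean Gaussian
        # kernels with different widths.
        e = np.asarray(x) - np.asarray(y)
        return sum(w * np.mean(gaussian_kernel(e, s))
                   for w, s in zip(weights, sigmas))

    def multikernel_correntropy(x, y, weights, sigmas, centers):
        # Multikernel correntropy: each Gaussian component has its own
        # width and its own (possibly nonzero) center.
        e = np.asarray(x) - np.asarray(y)
        return sum(w * np.mean(gaussian_kernel(e - c, s))
                   for w, s, c in zip(weights, sigmas, centers))

    # Illustrative use with placeholder parameters and heavy-tailed errors.
    rng = np.random.default_rng(0)
    x = rng.normal(size=1000)
    y = x + rng.standard_t(df=2, size=1000)  # occasional large outliers
    print(correntropy(x, y, sigma=1.0))
    print(mixture_correntropy(x, y, weights=[0.5, 0.5], sigmas=[0.5, 2.0]))
    print(multikernel_correntropy(x, y, weights=[0.5, 0.5],
                                  sigmas=[0.5, 2.0], centers=[-1.0, 1.0]))

In a learning algorithm, the corresponding maximum-correntropy-style criterion would maximize one of these estimators over the model parameters that generate the predictions y.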

MeSH terms

  • Algorithms*
  • Machine Learning
  • Signal Processing, Computer-Assisted*