Online Gradient Descent for Kernel-Based Maximum Correntropy Criterion

Entropy (Basel). 2019 Jun 29;21(7):644. doi: 10.3390/e21070644.

Abstract

In the framework of statistical learning, we study the online gradient descent algorithm generated by correntropy-induced losses in reproducing kernel Hilbert spaces (RKHS). As a generalized correlation measure, correntropy has been widely applied in practice owing to its robustness. Although online gradient descent is an efficient way to implement the maximum correntropy criterion (MCC) in nonparametric estimation, no consistency analysis or rigorous error bounds have been available. We provide a theoretical understanding of the online algorithm for MCC and show that, with a suitably chosen scaling parameter, its convergence rate can be minimax optimal (up to a logarithmic factor) in regression analysis. Our results show that the scaling parameter plays an essential role in both robustness and consistency.

Keywords: correntropy; maximum correntropy criterion; online algorithm; reproducing kernel Hilbert spaces; robustness.
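
The sketch below is a minimal illustration of the kind of online kernel gradient descent for MCC described in the abstract, not the paper's exact specification: the correntropy-induced loss is taken as l_sigma(u) = sigma^2 (1 - exp(-u^2 / (2 sigma^2))), the Gaussian kernel, the decaying step size eta_t, and all function and parameter names (online_mcc_kernel_gd, gamma, eta0) are illustrative assumptions.

```python
import numpy as np

def gaussian_kernel(x, z, gamma=1.0):
    # RBF kernel K(x, z) = exp(-gamma * ||x - z||^2); kernel choice is an assumption here
    return np.exp(-gamma * np.sum((x - z) ** 2))

def online_mcc_kernel_gd(X, y, sigma=1.0, gamma=1.0, eta0=0.5):
    """Online gradient descent for the maximum correntropy criterion (MCC) in an RKHS,
    using the correntropy-induced loss l_sigma(u) = sigma^2 * (1 - exp(-u^2 / (2 sigma^2))).

    The iterate is kept as a kernel expansion f_t = sum_i alpha_i K(x_i, .),
    so one coefficient is added per observed sample.
    """
    n = len(y)
    alphas = np.zeros(n)  # expansion coefficients of the current iterate f_t
    for t in range(n):
        # evaluate the current iterate f_t at the newly arrived sample x_t
        f_xt = sum(alphas[i] * gaussian_kernel(X[i], X[t], gamma) for i in range(t))
        u = f_xt - y[t]  # residual
        # derivative of the correntropy-induced loss: l'_sigma(u) = u * exp(-u^2 / (2 sigma^2));
        # it is bounded, which is what tempers the influence of outliers
        grad = u * np.exp(-u ** 2 / (2.0 * sigma ** 2))
        eta_t = eta0 / np.sqrt(t + 1)  # a common decaying step-size choice (an assumption)
        # functional gradient step in the RKHS: f_{t+1} = f_t - eta_t * grad * K(x_t, .)
        alphas[t] = -eta_t * grad
    return alphas

# Toy usage: noisy regression data with a few heavy outliers.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(200, 1))
y = np.sin(np.pi * X[:, 0]) + 0.1 * rng.normal(size=200)
y[::25] += 5.0  # inject outliers; the bounded MCC loss limits their influence
alphas = online_mcc_kernel_gd(X, y, sigma=1.0, gamma=5.0)
```

In this sketch the scaling parameter sigma controls the trade-off highlighted in the abstract: a small sigma strongly down-weights large residuals (robustness), while letting sigma grow makes the loss behave like least squares (consistency).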