Kernel Correntropy Conjugate Gradient Algorithms Based on Half-Quadratic Optimization

IEEE Trans Cybern. 2021 Nov;51(11):5497-5510. doi: 10.1109/TCYB.2019.2959834. Epub 2021 Nov 9.

Abstract

As a nonlinear similarity measure defined in the kernel space, the correntropic loss (C-Loss) can address the stability issues of second-order similarity measures thanks to its ability to extract higher-order statistics of the data. However, the kernel adaptive filter (KAF) based on the C-Loss updates its weights with the stochastic gradient descent (SGD) method and therefore suffers from poor performance and a slow convergence rate. To address these issues, a conjugate gradient (CG)-based correntropy algorithm is developed by combining half-quadratic (HQ) optimization with a weighted least-squares (LS) problem, yielding a novel robust kernel correntropy CG (KCCG) algorithm. The proposed KCCG achieves performance comparable to that of the kernel recursive maximum correntropy (KRMC) algorithm at lower computational complexity. To further curb the growth of the network in KCCG, the random Fourier features KCCG (RFFKCCG) algorithm is proposed, which transforms the original input data into a fixed-dimensional random Fourier features space (RFFS). Since its loss function uses only the current error information, RFFKCCG provides a more efficient filter structure than other KAFs with sparsification. Monte Carlo simulations on the prediction of synthetic and real-world chaotic time series and on regression with large-scale datasets validate the superiority of the proposed algorithms in terms of robustness, filtering accuracy, and complexity.
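For context, the C-Loss and the random Fourier feature map referenced above are commonly written as follows; this is the standard Gaussian-kernel formulation with bandwidth \sigma and feature dimension D, given here for illustration rather than taken verbatim from the paper:

\[ J_{\sigma}(e) \;=\; \frac{1}{N}\sum_{i=1}^{N}\left(1 - \exp\!\left(-\frac{e_i^{2}}{2\sigma^{2}}\right)\right), \qquad e_i = d_i - f(\mathbf{x}_i), \]

\[ \mathbf{z}(\mathbf{x}) \;=\; \sqrt{\tfrac{2}{D}}\,\bigl[\cos(\boldsymbol{\omega}_1^{\top}\mathbf{x} + b_1),\,\ldots,\,\cos(\boldsymbol{\omega}_D^{\top}\mathbf{x} + b_D)\bigr]^{\top}, \qquad \boldsymbol{\omega}_j \sim \mathcal{N}(\mathbf{0},\,\sigma^{-2}\mathbf{I}), \quad b_j \sim \mathcal{U}[0, 2\pi]. \]

Minimizing J_{\sigma} emphasizes small errors and suppresses the influence of outliers (large e_i contribute at most 1), which is the source of the robustness claimed above, while the map \mathbf{z}(\cdot) gives the fixed-dimensional approximation \kappa_{\sigma}(\mathbf{x},\mathbf{x}') \approx \mathbf{z}(\mathbf{x})^{\top}\mathbf{z}(\mathbf{x}') used to keep the RFFKCCG network size bounded.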