Kernel Error Path Algorithm

IEEE Trans Neural Netw Learn Syst. 2023 Nov;34(11):8866-8878. doi: 10.1109/TNNLS.2022.3153953. Epub 2023 Oct 27.

Abstract

Tuning the values of kernel parameters plays a vital role in the performance of kernel methods. Kernel path algorithms, which can fit the piecewise nonlinear solutions of kernel methods with respect to the kernel parameter over a continuous range, have been proposed for several important learning algorithms, including the support vector machine (SVM) and kernelized Lasso. Error path algorithms have also been proposed to ensure that the model with the minimum cross-validation (CV) error, which is usually the ultimate goal of model selection, can be found, but they are limited to piecewise linear solution paths. To address this problem, in this article we extend the classic error path algorithm to nonlinear kernel solution paths and propose a new kernel error path algorithm (KEP) that can find the globally optimal kernel parameter with the minimum CV error. Specifically, we first prove that the error functions of binary classification and regression problems are piecewise constant or smooth with respect to the kernel parameter. We then propose KEP for the SVM and kernelized Lasso and prove that it is guaranteed to find the model with the minimum CV error over the whole range of kernel parameter values. Experimental results on various datasets show that KEP finds the model with the minimum CV error with less time consumption and achieves better generalization error on the test set than grid search and random search.
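
To make the model-selection setting concrete, the sketch below illustrates the grid-search baseline that the abstract compares against: it evaluates the CV error of an RBF-kernel SVM only at a discrete set of kernel parameter (gamma) values and picks the minimizer among them. This is not the paper's KEP algorithm; the dataset, parameter grid, and scikit-learn estimator are illustrative assumptions. KEP, by contrast, is stated to certify the minimum CV error over the whole continuous range of the kernel parameter.

```python
# Minimal sketch of the grid-search baseline (NOT the KEP algorithm):
# evaluate CV error at discrete gamma values and keep the best one.
# Dataset and gamma grid are illustrative choices, not from the paper.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=300, n_features=20, random_state=0)

# Discrete candidate values of the kernel parameter; grid search only sees these points.
gammas = np.logspace(-4, 1, 30)

# CV error = 1 - mean CV accuracy for each candidate gamma.
cv_errors = [
    1.0 - cross_val_score(SVC(kernel="rbf", gamma=g, C=1.0), X, y, cv=5).mean()
    for g in gammas
]

best = int(np.argmin(cv_errors))
print(f"best gamma on the grid: {gammas[best]:.4g}, CV error: {cv_errors[best]:.4f}")

# A kernel error path algorithm instead tracks the solution and its CV error
# continuously in gamma, so the reported minimum holds over the whole interval
# rather than only at the sampled grid points.
```

The cost of the baseline grows with the grid resolution, and the true minimizer may fall between grid points; this gap between sampled and continuous search is the motivation for the error path approach described in the abstract.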