Sparse SVM for Sufficient Data Reduction

IEEE Trans Pattern Anal Mach Intell. 2022 Sep;44(9):5560-5571. doi: 10.1109/TPAMI.2021.3075339. Epub 2022 Aug 4.

Abstract

Kernel-based methods for support vector machines (SVM) have shown highly advantageous performance in various applications. However, they may incur prohibitive computational costs on large-scale datasets. Data reduction (reducing the number of support vectors) therefore becomes necessary, which gives rise to the topic of the sparse SVM. Motivated by this problem, this paper considers sparsity-constrained kernel SVM optimization in order to control the number of support vectors. Based on the established optimality conditions associated with the stationary equations, a Newton-type method is developed to handle the sparsity-constrained optimization. This method enjoys a one-step convergence property when the starting point is sufficiently close to a stationary point, leading to very high computational speed. Numerical comparisons with several powerful solvers demonstrate that the proposed method performs exceptionally well, particularly on large-scale datasets, in terms of a much smaller number of support vectors and shorter computation time.
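To make the sparsity-constrained formulation concrete, the sketch below solves a kernel SVM with a smooth squared-hinge loss subject to an explicit cardinality constraint on the dual-like coefficient vector, using simple iterative hard thresholding. This is only an illustrative heuristic under assumed names (`sparse_ksvm`, `s`, `C`, `eta`), not the authors' Newton-type method; it merely demonstrates how bounding the number of nonzero coefficients directly bounds the number of support vectors.

```python
import numpy as np

# Sparsity-constrained kernel SVM (illustrative sketch, NOT the paper's solver):
#   min_a  0.5 * a^T K a + C * sum_i max(0, 1 - y_i (K a)_i)^2
#   s.t.   ||a||_0 <= s        (at most s support vectors)

def hard_threshold(a, s):
    """Keep the s largest-magnitude entries of a; zero out the rest."""
    out = np.zeros_like(a)
    idx = np.argsort(np.abs(a))[-s:]
    out[idx] = a[idx]
    return out

def sparse_ksvm(K, y, s, C=1.0, iters=300):
    """Iterative hard thresholding on the smooth squared-hinge objective."""
    n = len(y)
    a = np.zeros(n)
    # Step size from a crude Lipschitz bound on the gradient.
    L = np.linalg.norm(K, 2)
    eta = 1.0 / (L + 2.0 * C * L * L)
    for _ in range(iters):
        m = K @ a                                # margins (K a)_i
        r = np.maximum(0.0, 1.0 - y * m)         # active hinge residuals
        grad = K @ a - 2.0 * C * (K @ (y * r))   # gradient of the objective
        a = hard_threshold(a - eta * grad, s)    # project onto ||a||_0 <= s
    return a

# Toy usage: two well-separated Gaussian classes, linear kernel K = X X^T.
rng = np.random.default_rng(0)
n = 40
X = np.vstack([rng.normal(2.0, 0.5, (n // 2, 2)),
               rng.normal(-2.0, 0.5, (n // 2, 2))])
y = np.hstack([np.ones(n // 2), -np.ones(n // 2)])
K = X @ X.T
a = sparse_ksvm(K, y, s=5)
print("support vectors:", np.count_nonzero(a))
print("train accuracy:", np.mean(np.sign(K @ a) == y))
```

The key point the abstract makes is visible here: the constraint `||a||_0 <= s` caps the support-vector count at `s` regardless of the sample size, whereas a standard kernel SVM may return many support vectors; the paper's contribution is a far faster Newton-type method with one-step local convergence for this same constrained problem.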