Feature Selection With $\ell_{2,1-2}$ Regularization

IEEE Trans Neural Netw Learn Syst. 2018 Oct;29(10):4967-4982. doi: 10.1109/TNNLS.2017.2785403. Epub 2018 Jan 15.

Abstract

Feature selection aims to select a subset of features from high-dimensional data according to a predefined selection criterion. Sparse learning has been proven to be a powerful technique in feature selection. The sparse regularizer, a key component of sparse learning, has been studied for several years. Although convex regularizers have been used in many works, there are cases where nonconvex regularizers outperform their convex counterparts. To make the selection of relevant features more effective, in this paper we propose a novel nonconvex sparse metric on matrices as the sparsity regularization. The new nonconvex regularizer can be written as the difference of the $\ell_{2,1}$ norm and the Frobenius ($\ell_{2,2}$) norm, and is hence named $\ell_{2,1-2}$. To solve the resulting nonconvex formulation, we design an iterative algorithm in the framework of the concave-convex procedure (CCCP) and prove its strong global convergence. An adapted alternating direction method of multipliers (ADMM) is embedded to solve the sequence of convex subproblems in CCCP efficiently. Using the scaled cluster indicators of data points as pseudolabels, we also apply $\ell_{2,1-2}$ to the unsupervised case. To the best of our knowledge, this is the first work to consider nonconvex regularization for matrices in the unsupervised learning scenario. Numerical experiments on real-world data sets demonstrate the effectiveness of the proposed method.
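To make the regularizer concrete: for a weight matrix $W$ with rows $w^i$, the penalty is $\|W\|_{2,1-2} = \|W\|_{2,1} - \|W\|_F = \sum_i \|w^i\|_2 - \sqrt{\sum_i \|w^i\|_2^2}$, and CCCP handles the concave term $-\|W\|_F$ by linearizing it at the current iterate. Below is a minimal NumPy sketch of this scheme under an assumed least-squares loss; the function names, the weight `lam`, and the iteration counts are illustrative assumptions, and the convex subproblem is solved here with proximal gradient descent rather than the paper's adapted ADMM.

```python
import numpy as np

def l21_norm(W):
    """l_{2,1} norm: sum of the l2 norms of the rows of W."""
    return np.sum(np.linalg.norm(W, axis=1))

def l21_minus_l22(W):
    """The nonconvex l_{2,1-2} metric: ||W||_{2,1} - ||W||_F."""
    return l21_norm(W) - np.linalg.norm(W)

def prox_l21(W, t):
    """Row-wise group soft-thresholding: the prox operator of t * ||.||_{2,1}."""
    norms = np.linalg.norm(W, axis=1, keepdims=True)
    return np.maximum(0.0, 1.0 - t / np.maximum(norms, 1e-12)) * W

def cccp_l212(X, Y, lam=0.1, outer=30, inner=200):
    """CCCP loop for  min_W 0.5 ||XW - Y||_F^2 + lam (||W||_{2,1} - ||W||_F).

    Each outer step linearizes the concave term -lam * ||W||_F at the current
    iterate and solves the resulting convex subproblem by proximal gradient
    (an illustrative stand-in for the adapted ADMM used in the paper).
    """
    W = np.zeros((X.shape[1], Y.shape[1]))
    step = 1.0 / np.linalg.norm(X, 2) ** 2   # 1/L for the quadratic loss
    for _ in range(outer):
        fro = np.linalg.norm(W)
        # Subgradient of lam * ||W||_F (the zero matrix is a valid choice at W = 0).
        G = lam * W / fro if fro > 0 else np.zeros_like(W)
        for _ in range(inner):
            grad = X.T @ (X @ W - Y) - G     # smooth loss + linearized concave part
            W = prox_l21(W - step * grad, step * lam)
    return W
```

In this sketch, rows of the returned `W` with (near-)zero $\ell_2$ norm correspond to features screened out by the penalty; ranking rows by their norms yields the selected feature subset.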

Publication types

  • Research Support, Non-U.S. Gov't