Global Search and Analysis for the Nonconvex Two-Level ℓ₁ Penalty

IEEE Trans Neural Netw Learn Syst. 2024 Mar;35(3):3886-3899. doi: 10.1109/TNNLS.2022.3201052. Epub 2024 Feb 29.

Abstract

Imposing suitably designed nonconvex regularization is effective for enhancing sparsity, but a corresponding global search algorithm has not been well established. In this article, we propose a global search algorithm for the nonconvex two-level ℓ₁ penalty, based on its piecewise-linear property, and apply it to machine learning tasks. With this search capability, the proposed algorithm achieves better optimization performance, yielding better sparsity and accuracy than most state-of-the-art global and local algorithms. In addition, we provide an approximation analysis that demonstrates the effectiveness of the proposed global search algorithm in sparse quantile regression.
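To make the piecewise-linear structure concrete, the sketch below shows one plausible form of a "two-level" ℓ₁ penalty: slope 1 near the origin and a smaller slope beyond a threshold, which makes the penalty concave and nonconvex. The exact parameterization used in the paper may differ; the function name, threshold `tau`, and slope `rho` here are illustrative assumptions only.

```python
import numpy as np

def two_level_l1(w, tau=1.0, rho=0.3):
    """Illustrative piecewise-linear sparsity penalty with two slope levels.

    Assumed form (not necessarily the paper's parameterization):
    slope 1 for |w_i| <= tau and a smaller slope rho < 1 beyond tau,
    so large coefficients are shrunk less than under the plain l1 norm.
    """
    a = np.abs(w)
    return np.sum(np.where(a <= tau, a, tau + rho * (a - tau)))

# Example: the two-level penalty is smaller than the l1 norm once |w_i| > tau,
# which is the mechanism by which such penalties reduce shrinkage bias.
w = np.array([0.2, -0.8, 3.0])
print(two_level_l1(w))      # 2.6 under the assumed parameters
print(np.sum(np.abs(w)))    # 4.0 (plain l1 norm)
```

Because the penalty is piecewise linear, the resulting objective is piecewise linear (or piecewise smooth) in each coefficient, which is the structural property a global search over the breakpoints can exploit.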