Efficient Proximal Gradient Algorithms for Joint Graphical Lasso

Entropy (Basel). 2021 Dec 2;23(12):1623. doi: 10.3390/e23121623.

Abstract

We consider learning an undirected graphical model from sparse data. While several efficient algorithms have been proposed for the graphical lasso (GL), the main approach for the joint graphical lasso (JGL) has been the alternating direction method of multipliers (ADMM). We propose proximal gradient procedures, with and without a backtracking option, for the JGL. These procedures are first-order methods, are relatively simple, and have subproblems that can be solved efficiently in closed form. We further show boundedness of the solution of the JGL problem and of the iterates produced by the algorithms. The numerical results indicate that the proposed algorithms achieve high accuracy and precision, and that their efficiency is competitive with state-of-the-art algorithms.

Keywords: Gaussian graphical model; joint graphical lasso; proximal gradient descent method.
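To illustrate the kind of first-order method the abstract refers to, here is a minimal sketch of proximal gradient descent with backtracking on a simpler l1-penalized least-squares (lasso) problem. The function names and the toy objective are illustrative assumptions, not the paper's JGL formulation; the key shared ingredients are the gradient step, a closed-form proximal operator (soft-thresholding for the l1 penalty), and a backtracking line search on the smooth part of the objective:

```python
import numpy as np

def soft_threshold(x, tau):
    # Closed-form proximal operator of tau * ||x||_1.
    return np.sign(x) * np.maximum(np.abs(x) - tau, 0.0)

def proximal_gradient_lasso(A, b, lam, t0=1.0, beta=0.5, n_iter=200):
    """Minimize 0.5*||Ax - b||^2 + lam*||x||_1 by proximal gradient
    descent with backtracking (a toy stand-in for the JGL objective)."""
    x = np.zeros(A.shape[1])
    smooth = lambda z: 0.5 * np.sum((A @ z - b) ** 2)
    for _ in range(n_iter):
        grad = A.T @ (A @ x - b)          # gradient of the smooth part
        t = t0
        while True:
            x_new = soft_threshold(x - t * grad, t * lam)
            diff = x_new - x
            # Backtracking: shrink the step until the quadratic
            # upper bound on the smooth part holds at x_new.
            if smooth(x_new) <= smooth(x) + grad @ diff + diff @ diff / (2 * t):
                break
            t *= beta
        x = x_new
    return x
```

Each iteration takes a gradient step on the smooth loss and then applies the proximal operator of the nonsmooth penalty; because the prox is available in closed form, the per-iteration cost stays low, which is the efficiency argument the abstract makes for the JGL setting.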