Graph Neural Network Meets Sparse Representation: Graph Sparse Neural Networks via Exclusive Group Lasso

IEEE Trans Pattern Anal Mach Intell. 2023 Oct;45(10):12692-12698. doi: 10.1109/TPAMI.2023.3285215. Epub 2023 Sep 5.

Abstract

Existing GNNs usually conduct layer-wise message propagation via 'full' aggregation of all neighborhood information, which is sensitive to structural noise in graphs, such as incorrect or undesired redundant edge connections. To overcome this issue, we propose to incorporate Sparse Representation (SR) theory into GNNs and propose Graph Sparse Neural Networks (GSNNs), which conduct sparse aggregation to select reliable neighbors for message aggregation. The GSNNs problem involves a discrete/sparse constraint that is difficult to optimize directly. We therefore develop a tight continuous relaxation model, Exclusive Group Lasso GNNs (EGLassoGNNs), for GSNNs, and derive an effective algorithm to optimize it. Experimental results on several benchmark datasets demonstrate the better performance and robustness of the proposed EGLassoGNNs model.