Embedding graphs on Grassmann manifold

Neural Netw. 2022 Aug:152:322-331. doi: 10.1016/j.neunet.2022.05.001. Epub 2022 May 10.

Abstract

Learning efficient graph representations is key to favorably addressing downstream tasks on graphs, such as node or graph property prediction. Given the non-Euclidean structure of graphs, preserving the similarity relationships of the original graph data in the embedded space requires specific tools and an appropriate similarity metric. This paper develops a new graph representation learning scheme, named Egg, which embeds approximated second-order graph characteristics into a Grassmann manifold. The proposed strategy leverages graph convolutions to learn hidden representations of the subspace associated with the graph, which is then mapped to a Grassmann point on a low-dimensional manifold through truncated singular value decomposition (SVD). The resulting graph embedding approximates the denoised correlation structure of node attributes and is realized as a symmetric matrix, on which Euclidean computations can be carried out. The effectiveness of Egg is demonstrated on both clustering and classification tasks at the node level and graph level, where it outperforms baseline models on various benchmarks.
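The projection-embedding step described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes hidden node representations have already been produced (here, a random matrix stands in for graph-convolution outputs), takes the top-k left singular vectors as a Grassmann point, and forms the symmetric projection matrix on which Euclidean computation is possible.

```python
import numpy as np

def grassmann_projection_embedding(h, k):
    """Map an n x d hidden representation matrix to a symmetric n x n
    projection matrix U_k @ U_k.T, where U_k holds the top-k left
    singular vectors, i.e., an orthonormal basis of a k-dimensional
    subspace (a point on the Grassmann manifold Gr(k, n))."""
    u, _, _ = np.linalg.svd(h, full_matrices=False)  # truncated SVD
    u_k = u[:, :k]            # orthonormal basis of the k-dim subspace
    return u_k @ u_k.T        # projection embedding (symmetric matrix)

# Toy usage: 6 nodes with 4-dimensional hidden features (stand-in for
# graph-convolution outputs).
rng = np.random.default_rng(0)
H = rng.standard_normal((6, 4))
P = grassmann_projection_embedding(H, k=2)
print(np.allclose(P, P.T))    # symmetric
print(np.allclose(P @ P, P))  # idempotent, i.e., a true projection
```

Because each graph (or node neighborhood) maps to such a symmetric projection matrix, distances between subspaces can be computed with the ordinary Frobenius norm on these matrices, which is what makes downstream Euclidean clustering and classification straightforward.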

Keywords: Graph neural network; Grassmann manifold; Projection embedding; Subspace clustering.

MeSH terms

  • Algorithms*
  • Cluster Analysis
  • Learning*