Scalable and Effective Temporal Graph Representation Learning With Hyperbolic Geometry

IEEE Trans Neural Netw Learn Syst. 2024 May 10:PP. doi: 10.1109/TNNLS.2024.3394161. Online ahead of print.

Abstract

Real-life graphs often exhibit intricate dynamics that evolve continuously over time. To effectively represent continuous-time dynamic graphs (CTDGs), various temporal graph neural networks (TGNNs) have been developed to model their dynamics and topological structures in Euclidean space. Despite their notable achievements, the performance of Euclidean-based TGNNs is limited and bounded by the representation capabilities of Euclidean geometry, particularly for complex graphs with hierarchical and power-law structures. This is because Euclidean space does not have enough room (its volume grows only polynomially with radius) to embed hierarchical structures that expand exponentially, which results in high-distortion embeddings and suboptimal temporal graph representations. To overcome these limitations and enhance the representation capabilities of TGNNs, in this article, we propose a scalable and effective TGNN with hyperbolic geometry for CTDG representation (called STGNh). It captures evolving behaviors and stores hierarchical structures simultaneously by integrating a memory-based module and a structure-based module into a unified framework, which can scale to billion-scale graphs. Concretely, a simple hyperbolic update gate (HuG) is designed as the memory-based module to store temporal dynamics efficiently; for the structure-based module, we propose an effective hyperbolic temporal Transformer (HyT) model to capture complex graph structures and generate up-to-date node embeddings. Extensive experimental results on a variety of medium-scale and billion-scale graphs demonstrate the superiority of the proposed STGNh for CTDG representation, as it significantly outperforms baselines in various downstream tasks.
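The abstract's core premise is that hyperbolic space, whose volume grows exponentially with radius, can embed tree-like hierarchies with low distortion. Hyperbolic graph models typically work in the Poincaré ball, using Möbius addition, the exponential map at the origin, and the induced geodesic distance. The following NumPy sketch of these standard operations is illustrative only; it is not the authors' STGNh implementation, and the function names are our own.

```python
import numpy as np

def mobius_add(x, y, c=1.0):
    """Mobius addition on the Poincare ball of curvature -c."""
    xy = np.dot(x, y)
    x2 = np.dot(x, x)
    y2 = np.dot(y, y)
    num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
    den = 1 + 2 * c * xy + (c ** 2) * x2 * y2
    return num / den

def expmap0(v, c=1.0):
    """Exponential map at the origin: lifts a Euclidean tangent
    vector v onto the Poincare ball (norm stays below 1/sqrt(c))."""
    sqrt_c = np.sqrt(c)
    norm = np.linalg.norm(v)
    if norm < 1e-12:
        return np.zeros_like(v)
    return np.tanh(sqrt_c * norm) * v / (sqrt_c * norm)

def hyp_dist(x, y, c=1.0):
    """Geodesic distance between two points on the ball,
    d(x, y) = (2 / sqrt(c)) * artanh(sqrt(c) * ||(-x) (+) y||)."""
    sqrt_c = np.sqrt(c)
    diff = mobius_add(-x, y, c)
    arg = np.clip(sqrt_c * np.linalg.norm(diff), 0.0, 1.0 - 1e-12)
    return (2.0 / sqrt_c) * np.arctanh(arg)
```

In a hyperbolic TGNN, node states would live on this ball: updates (as in a memory module like HuG) and attention scores (as in a Transformer like HyT) are computed via such operations, so that distances between embeddings respect the graph's hierarchy.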