Temporal Network Embedding Enhanced With Long-Range Dynamics and Self-Supervised Learning

IEEE Trans Neural Netw Learn Syst. 2024 Apr 15:PP. doi: 10.1109/TNNLS.2024.3384348. Online ahead of print.

Abstract

Temporal network embedding (TNE) has advanced knowledge discovery and reasoning on networks. It aims to embed the vertices of temporal networks into a low-dimensional vector space while preserving network structure and temporal properties. However, most existing methods struggle to capture dynamics over long distances, which makes it difficult to explore multihop topological associations among vertices. To tackle this challenge, we propose LongTNE, which learns the long-range dynamics of vertices to endow TNE with the ability to capture the high-order proximity (HP) of networks. In LongTNE, we employ graph self-supervised learning (Graph SSL) to optimize the establishment probability of deep links in each network snapshot. We also present an accumulated forward update (AFU) module to capture the global temporal evolution across multiple network snapshots. Empirical results on six temporal networks demonstrate that, in addition to achieving state-of-the-art performance on network-mining tasks, LongTNE can be readily extended to existing TNE methods.
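
To make the snapshot-wise setting concrete, the following Python sketch illustrates one possible instantiation under stated assumptions: per-snapshot embeddings are trained with a self-supervised objective whose targets are multi-hop ("deep") reachability within the snapshot, and information is carried across snapshots by exponentially accumulating the previous embedding into the next one (a stand-in for the forward-update idea). The multi-hop target construction, the binary cross-entropy loss, the blending factor alpha, and the function names (multi_hop_targets, snapshot_ssl_loss, embed_temporal_network) are assumptions introduced here for illustration; they are not LongTNE's actual formulation.

    # Hypothetical sketch only: targets, loss, and the forward-accumulation
    # rule are illustrative assumptions, not LongTNE's actual method.
    import torch

    def multi_hop_targets(adj: torch.Tensor, k: int = 3) -> torch.Tensor:
        """Targets for 'deep' links: 1 if j is reachable from i within k hops,
        built from powers of the snapshot adjacency matrix."""
        reach = adj.clone()
        power = adj.clone()
        for _ in range(k - 1):
            power = power @ adj
            reach = reach + power
        return (reach > 0).float()

    def snapshot_ssl_loss(z: torch.Tensor, adj: torch.Tensor, k: int = 3) -> torch.Tensor:
        """Self-supervised objective: predicted link probability sigmoid(z_i . z_j)
        should match multi-hop reachability within the snapshot."""
        targets = multi_hop_targets(adj, k)
        logits = z @ z.t()
        return torch.nn.functional.binary_cross_entropy_with_logits(logits, targets)

    def embed_temporal_network(snapshots, dim=16, epochs=200, alpha=0.5, lr=1e-2):
        """Learn one embedding per snapshot; carry state forward by blending the
        accumulated history into each new snapshot's embedding."""
        n = snapshots[0].shape[0]
        z_prev = torch.zeros(n, dim)
        embeddings = []
        for adj in snapshots:
            # initialise from the accumulated embedding plus a small perturbation
            z = torch.nn.Parameter(z_prev + 0.01 * torch.randn(n, dim))
            opt = torch.optim.Adam([z], lr=lr)
            for _ in range(epochs):
                opt.zero_grad()
                loss = snapshot_ssl_loss(z, adj)
                loss.backward()
                opt.step()
            # forward accumulation: blend the new embedding with the history
            z_prev = alpha * z_prev + (1 - alpha) * z.detach()
            embeddings.append(z_prev.clone())
        return embeddings

    if __name__ == "__main__":
        torch.manual_seed(0)
        # two toy snapshots of a 6-vertex network; an edge appears in snapshot 2
        a0 = torch.zeros(6, 6)
        a0[0, 1] = a0[1, 0] = a0[1, 2] = a0[2, 1] = 1.0
        a1 = a0.clone()
        a1[2, 3] = a1[3, 2] = 1.0
        embs = embed_temporal_network([a0, a1])
        print(embs[-1].shape)  # torch.Size([6, 16])

In this toy version, the multi-hop reachability targets play the role of supervisory signal for long-range structure, and the exponential blend of successive snapshot embeddings is one simple way to propagate temporal context forward; the paper's AFU module and Graph SSL objective may differ substantially.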