DropAGG: Robust Graph Neural Networks via Drop Aggregation

Neural Netw. 2023 Jun:163:65-74. doi: 10.1016/j.neunet.2023.03.022. Epub 2023 Mar 29.

Abstract

Robust learning on graph data is an active research problem in the data mining field. Graph Neural Networks (GNNs) have gained great attention in graph data representation and learning tasks. The core of GNNs is the message propagation mechanism over each node's neighbors in their layer-wise propagation. Existing GNNs generally adopt a deterministic message propagation mechanism, which may (1) perform non-robustly with respect to structural noise and adversarial attacks and (2) lead to the over-smoothing issue. To alleviate these issues, this work rethinks dropout techniques in GNNs and proposes a novel random message propagation mechanism, named Drop Aggregation (DropAGG), for GNN learning. The core of DropAGG is to randomly select a certain rate of nodes to participate in information aggregation. The proposed DropAGG is a general scheme which can be incorporated into any specific GNN model to enhance its robustness and mitigate the over-smoothing issue. Using DropAGG, we then design a novel Graph Random Aggregation Network (GRANet) for robust graph data learning. Extensive experiments on several benchmark datasets demonstrate the robustness of GRANet and the effectiveness of DropAGG in mitigating over-smoothing.
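The abstract describes the mechanism only at a high level: at each layer, a random subset of nodes is allowed to send messages during aggregation. The following is a minimal PyTorch sketch of that idea, not the authors' published implementation; the class name DropAggLayer, the keep_rate parameter, the dense adjacency representation, and the mean-style renormalization over surviving neighbors are all illustrative assumptions.

```python
# A minimal sketch of drop aggregation as described in the abstract.
# Assumptions (not from the paper): dense adjacency, mean-style
# renormalization, and full aggregation at inference time.
import torch
import torch.nn as nn


class DropAggLayer(nn.Module):
    """GNN layer that aggregates over a random subset of neighbor nodes."""

    def __init__(self, in_dim, out_dim, keep_rate=0.8):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        self.keep_rate = keep_rate  # fraction of nodes allowed to send messages

    def forward(self, x, adj):
        # x:   (num_nodes, in_dim) node features
        # adj: (num_nodes, num_nodes) unnormalized adjacency matrix
        if self.training:
            # Randomly select the nodes that participate in aggregation.
            mask = (torch.rand(x.size(0), device=x.device) < self.keep_rate).float()
        else:
            # Use all neighbors at inference time.
            mask = torch.ones(x.size(0), device=x.device)
        # Zero out the columns of dropped sender nodes, then row-normalize so
        # aggregation remains a weighted mean over the surviving neighbors.
        adj_dropped = adj * mask.unsqueeze(0)
        deg = adj_dropped.sum(dim=1, keepdim=True).clamp(min=1.0)
        agg = (adj_dropped / deg) @ x
        return torch.relu(self.linear(agg))


# Example: one training-mode forward pass on a toy 4-node graph.
x = torch.randn(4, 8)
adj = torch.tensor([[0, 1, 1, 0],
                    [1, 0, 0, 1],
                    [1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=torch.float)
layer = DropAggLayer(8, 16, keep_rate=0.75)
layer.train()
out = layer(x, adj)  # (4, 16); aggregation used a random neighbor subset
```

Because a fresh subset is drawn at every layer and every forward pass, a node's representation is never a deterministic function of its full neighborhood, which is the intuition behind the claimed robustness to structural perturbations and the reduced over-smoothing.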

Keywords: Drop aggregation; Graph neural networks; Graph random aggregation network; Robust data learning.

MeSH terms

  • Benchmarking*
  • Data Mining
  • Learning*
  • Neural Networks, Computer