Graph Neural Networks With High-Order Polynomial Spectral Filters

IEEE Trans Neural Netw Learn Syst. 2023 Apr 11:PP. doi: 10.1109/TNNLS.2023.3263676. Online ahead of print.

Abstract

General graph neural networks (GNNs) implement convolution operations on graphs based on polynomial spectral filters. Existing filters with high-order polynomial approximations can capture more structural information from high-order neighborhoods, but they tend to produce indistinguishable node representations, indicating that they process information in high-order neighborhoods inefficiently, which results in performance degradation. In this article, we theoretically show that this problem is avoidable and attribute it to overfitting of the polynomial coefficients. To address it, we restrict the coefficients in two steps: reducing the dimensionality of the coefficients' domain and sequentially assigning a forgetting factor. This transforms the optimization of the coefficients into the tuning of a hyperparameter, and we propose a flexible spectral-domain graph filter that significantly reduces the memory demand and the adverse impact on message transmission under large receptive fields. With our filter, the performance of GNNs improves significantly at large receptive fields, and their receptive fields are multiplied as well. Moreover, the benefit of applying a high-order approximation is verified across various datasets, most notably on strongly hyperbolic datasets. Code is publicly available at: https://github.com/cengzeyuan/TNNLS-FFKSF.
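To make the idea concrete, below is a minimal sketch of a high-order polynomial spectral filter whose K+1 coefficients are not learned freely but generated from a single forgetting factor, so coefficient optimization reduces to tuning one hyperparameter. The specific decay scheme theta_k = alpha * (1 - alpha)^k, the class name `ForgettingPolyFilter`, and all parameter names are illustrative assumptions, not the paper's exact FFKSF formulation (see the linked repository for that).

```python
# Hedged sketch: a K-order polynomial filter of the normalized adjacency whose
# coefficients follow a geometric decay driven by one forgetting factor alpha.
# The decay theta_k = alpha * (1 - alpha)**k is an assumption for illustration.
import torch
import torch.nn as nn


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalized adjacency with self-loops: D^-1/2 (A + I) D^-1/2."""
    a_hat = adj + torch.eye(adj.size(0))
    deg = a_hat.sum(dim=1)
    d_inv_sqrt = torch.diag(deg.clamp(min=1e-12).pow(-0.5))
    return d_inv_sqrt @ a_hat @ d_inv_sqrt


class ForgettingPolyFilter(nn.Module):
    """K-order polynomial spectral filter with coefficients tied to a forgetting factor."""

    def __init__(self, in_dim: int, out_dim: int, order: int = 16, alpha: float = 0.1):
        super().__init__()
        self.order = order
        self.alpha = alpha                      # the single tunable hyperparameter
        self.lin = nn.Linear(in_dim, out_dim)   # feature transformation

    def forward(self, x: torch.Tensor, a_norm: torch.Tensor) -> torch.Tensor:
        h = self.lin(x)
        out = self.alpha * h                    # k = 0 term
        prop = h
        for k in range(1, self.order + 1):
            prop = a_norm @ prop                # reach the k-hop neighborhood
            out = out + self.alpha * (1.0 - self.alpha) ** k * prop
        return out


if __name__ == "__main__":
    # Tiny random graph, only to show the filter runs with a large receptive field.
    n, f = 8, 5
    adj = (torch.rand(n, n) > 0.7).float()
    adj = ((adj + adj.t()) > 0).float().fill_diagonal_(0)
    x = torch.randn(n, f)
    model = ForgettingPolyFilter(in_dim=f, out_dim=3, order=16, alpha=0.1)
    y = model(x, normalized_adjacency(adj))
    print(y.shape)  # torch.Size([8, 3])
```

Because no per-order coefficients are stored or backpropagated through, the memory cost stays flat as the polynomial order grows, which is the kind of benefit at large receptive fields the abstract describes; the actual coefficient restriction used in the paper should be taken from the authors' code.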