Clenshaw Graph Neural Networks

10/29/2022
by Yuhe Guo, et al.

Graph Convolutional Networks (GCNs), which use a message-passing paradigm with stacked convolution layers, are foundational methods for learning graph representations. Recent GCN models use various residual connection techniques to alleviate model degradation problems such as over-smoothing and gradient vanishing. Existing residual connection techniques, however, fail to make extensive use of the underlying graph structure in the graph spectral domain, which is critical for obtaining satisfactory results on heterophilic graphs. In this paper, we introduce ClenshawGCN, a GNN model that employs the Clenshaw summation algorithm to enhance the expressiveness of the GCN model. ClenshawGCN equips the standard GCN model with two straightforward residual modules: an adaptive initial residual connection and a negative second-order residual connection. We show that by adding these two residual modules, ClenshawGCN implicitly simulates a polynomial filter under the Chebyshev basis, giving it at least as much expressive power as polynomial spectral GNNs. In addition, we conduct comprehensive experiments to demonstrate the superiority of our model over both spatial and spectral GNN models.
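To make the mechanism in the abstract concrete, below is a minimal NumPy sketch (not code from the paper) of how Clenshaw's summation algorithm evaluates a Chebyshev polynomial filter on a graph. The function name, the linear (activation-free) formulation, and the assumption that the propagation matrix has its spectrum in [-1, 1] are illustrative choices; the actual ClenshawGCN layer also learns feature transformations and applies nonlinearities. The correspondence to the two residual modules is visible in the recurrence: the `coeffs[k] * X` term plays the role of the adaptive initial residual, and the `- b_next2` term is the negative second-order residual.

```python
import numpy as np

def clenshaw_chebyshev_filter(A_hat, X, coeffs):
    """Evaluate the polynomial filter sum_k coeffs[k] * T_k(A_hat) @ X
    using Clenshaw's recurrence, without materializing any T_k(A_hat).

    A_hat  : (n, n) symmetric propagation matrix, spectrum assumed in [-1, 1]
    X      : (n, d) node-feature matrix
    coeffs : length-(K+1) Chebyshev coefficients (the adaptive alphas)
    """
    K = len(coeffs) - 1
    b_next = np.zeros_like(X)    # b_{k+1}
    b_next2 = np.zeros_like(X)   # b_{k+2}
    for k in range(K, 0, -1):
        # One "layer": propagate the running state, add the initial residual
        # coeffs[k] * X, and subtract the state from two steps back
        # (the negative second-order residual).
        b_k = 2.0 * (A_hat @ b_next) - b_next2 + coeffs[k] * X
        b_next, b_next2 = b_k, b_next
    # Clenshaw's closing step for the Chebyshev basis.
    return A_hat @ b_next - b_next2 + coeffs[0] * X
```

Because each step propagates the running state through A_hat only once, a stack of K such layers evaluates a degree-K Chebyshev expansion while touching the graph K times in total, which is what lets the two residual connections implicitly simulate a polynomial spectral filter.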
