GResNet: Graph Residual Network for Reviving Deep GNNs from Suspended Animation

09/12/2019
by   Jiawei Zhang, et al.

Existing graph neural networks (GNNs) based on the spectral graph convolutional operator have been criticized for their performance degradation, which is especially common in models with deep architectures. In this paper, we further identify the suspended animation problem with existing GNNs: once the model depth reaches the suspended animation limit, the model no longer responds to the training data and becomes unlearnable. We analyze the causes of the suspended animation problem in existing GNNs and also report several peripheral factors that influence it. To resolve the problem, we introduce the GResNet (Graph Residual Network) framework, which creates extensively connected highways that carry the nodes' raw features or intermediate representations across all model layers. Unlike in other learning settings, the extensive connections within graph data cause existing simple residual learning methods to fail. We prove the effectiveness of the introduced new graph residual terms from a norm-preservation perspective: they help avoid dramatic changes to node representations between consecutive layers. Detailed studies of the GResNet framework applied to several existing GNNs, including GCN, GAT, and LoopyNet, are reported in the paper, together with extensive empirical experiments on real-world benchmark datasets.
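To make the idea concrete, the sketch below shows a stack of GCN-style layers with a residual term added at every layer. The variant names (`naive`, `graph_naive`, `raw`, `graph_raw`) follow the taxonomy suggested by the abstract (residuals built from the current hidden state H or from the raw features X, optionally propagated through the normalized adjacency); the function names and the simplifying assumption that all hidden sizes equal the input feature size are ours, not the paper's.

```python
import numpy as np

def normalize_adj(A):
    # Renormalized adjacency used by GCN: A_hat = D^{-1/2} (A + I) D^{-1/2}
    A_self = A + np.eye(A.shape[0])
    d = A_self.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_self @ D_inv_sqrt

def gresnet_forward(A, X, weights, residual="graph_raw"):
    """Stack of GCN layers, each with an added graph residual term.

    residual variants (our naming, based on the abstract's description):
      "naive"       -> H^(l)            (plain identity shortcut)
      "graph_naive" -> A_hat @ H^(l)    (shortcut propagated over the graph)
      "raw"         -> X                (raw node features at every layer)
      "graph_raw"   -> A_hat @ X        (raw features propagated over the graph)
    Assumes every weight matrix is square so shapes stay compatible.
    """
    A_hat = normalize_adj(A)
    H = X
    for W in weights:
        Z = np.maximum(A_hat @ H @ W, 0.0)  # ReLU(A_hat H W), one GCN layer
        if residual == "naive":
            R = H
        elif residual == "graph_naive":
            R = A_hat @ H
        elif residual == "raw":
            R = X
        else:  # "graph_raw"
            R = A_hat @ X
        H = Z + R
    return H
```

The residual term keeps each node's representation anchored to its earlier (or raw) state, so stacking many layers cannot wash the features out entirely; this is the informal intuition behind the norm-preservation argument mentioned in the abstract.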


