Revisiting "Over-smoothing" in Deep GCNs

03/30/2020
by Chaoqi Yang, et al.

Oversmoothing has been assumed to be the major cause of the performance drop in deep graph convolutional networks (GCNs). The evidence is usually derived from Simple Graph Convolution (SGC), a linear variant of GCNs. In this paper, we revisit graph node classification from an optimization perspective and argue that GCNs can actually learn anti-oversmoothing, whereas overfitting is the real obstacle in deep GCNs. This work interprets GCN and SGC as two-step optimization problems and explains why deep SGC suffers from oversmoothing but deep GCNs do not. Our conclusion is compatible with the previous understanding of SGC, but we clarify why the same reasoning does not apply to GCNs. Based on our formulation, we provide more insights into the convolution operator and further propose a mean-subtraction trick to accelerate the training of deep GCNs. We verify our theory and propositions on three graph benchmarks. The experiments show that (i) in GCNs, overfitting causes the performance drop, and oversmoothing does not appear even when the model becomes very deep (100 layers); (ii) mean-subtraction speeds up model convergence while retaining the same expressive power; (iii) the weight of neighbor averaging (1 is the common setting) does not significantly affect model performance once it is above a threshold (> 0.5).
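The abstract mentions a mean-subtraction trick for accelerating the training of deep GCNs. Below is a minimal PyTorch sketch of one way such a trick could be wired into a single GCN layer; the class name, variable names, and the exact placement of the subtraction (after propagation, before the nonlinearity) are illustrative assumptions rather than the paper's reference implementation.

```python
import torch
import torch.nn as nn

class MeanSubGCNLayer(nn.Module):
    """A single GCN layer followed by mean-subtraction over the node dimension."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, adj_norm, x):
        # Standard GCN propagation: normalized adjacency times transformed features.
        h = adj_norm @ self.linear(x)
        # Mean-subtraction trick (as sketched here): center the node representations
        # by removing the mean over all nodes before applying the nonlinearity.
        h = h - h.mean(dim=0, keepdim=True)
        return torch.relu(h)

# Toy usage with a dense normalized adjacency and random node features.
n_nodes, in_dim = 5, 8
adj_norm = torch.eye(n_nodes)   # placeholder for D^{-1/2}(A + I)D^{-1/2}
x = torch.randn(n_nodes, in_dim)
layer = MeanSubGCNLayer(in_dim, 16)
out = layer(adj_norm, x)
print(out.shape)                # torch.Size([5, 16])
```

Centering the representations at each layer is one straightforward reading of "mean-subtraction"; see the full paper for the authors' exact formulation and its connection to anti-oversmoothing.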

research
01/22/2018

Deeper Insights into Graph Convolutional Networks for Semi-Supervised Learning

Many interesting problems in machine learning are being revisited with n...
research
08/22/2020

Tackling Over-Smoothing for General Graph Convolutional Networks

Increasing the depth of Graph Convolutional Networks (GCN), which in pri...
research
08/20/2020

Training Matters: Unlocking Potentials of Deeper Graph Convolutional Neural Networks

The performance limit of Graph Convolutional Networks (GCNs) and the fac...
research
05/04/2021

Multipath Graph Convolutional Neural Networks

Graph convolution networks have recently garnered a lot of attention for...
research
10/28/2021

On Provable Benefits of Depth in Training Graph Convolutional Networks

Graph Convolutional Networks (GCNs) are known to suffer from performance...
research
06/21/2023

Structure-Aware DropEdge Towards Deep Graph Convolutional Networks

It has been discovered that Graph Convolutional Networks (GCNs) encounte...
