Training Matters: Unlocking Potentials of Deeper Graph Convolutional Neural Networks

08/20/2020
by Sitao Luan, et al.

The performance ceiling of Graph Convolutional Networks (GCNs), and the fact that stacking more layers does not improve performance as it does in other deep learning paradigms, are widely attributed to limitations of the GCN layers themselves, such as insufficient expressive power. If that were the case, however, it should be difficult, for a fixed architecture, to ease training and improve performance by changing only the training procedure; in this paper we show that this is not only possible, but possible in several ways. We first identify the training difficulty of GCNs from the perspective of graph signal energy loss: the loss of energy in the backward pass nullifies the learning of the layers closer to the input. We then propose several methodologies to mitigate this training problem by slightly modifying the GCN operator from the energy perspective. Empirical validation confirms that these operator changes substantially reduce training difficulty and yield a notable performance boost without changing the composition of parameters. We therefore conclude that the root cause of the problem is more likely the training difficulty than the limitations of the layers themselves.
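The abstract describes mitigating backward-pass energy loss by slightly modifying the GCN operator. The PyTorch sketch below is only an illustration of where such a modification could sit: a standard normalized-adjacency GCN layer with a hypothetical energy_scale factor applied to the propagation step. The names normalize_adj, GCNLayer, and energy_scale are assumptions for this sketch; the operator modifications actually proposed in the paper may differ.

    import torch
    import torch.nn as nn

    def normalize_adj(adj: torch.Tensor) -> torch.Tensor:
        # Symmetrically normalized adjacency with self-loops:
        # A_hat = D^{-1/2} (A + I) D^{-1/2}
        a = adj + torch.eye(adj.size(0), device=adj.device)
        d_inv_sqrt = a.sum(dim=1).pow(-0.5)
        return d_inv_sqrt.unsqueeze(1) * a * d_inv_sqrt.unsqueeze(0)

    class GCNLayer(nn.Module):
        # Vanilla GCN layer H' = relu(A_hat H W), plus a hypothetical
        # energy_scale factor meant to counteract the backward-pass energy
        # loss described in the abstract (illustrative only, not the paper's
        # actual operator).
        def __init__(self, in_dim: int, out_dim: int, energy_scale: float = 1.0):
            super().__init__()
            self.linear = nn.Linear(in_dim, out_dim, bias=False)
            self.energy_scale = energy_scale  # assumed knob, not from the paper

        def forward(self, a_hat: torch.Tensor, h: torch.Tensor) -> torch.Tensor:
            # a_hat: (N, N) normalized adjacency; h: (N, in_dim) node features
            return torch.relu(self.energy_scale * (a_hat @ self.linear(h)))

    # Tiny usage example on a 3-node path graph.
    adj = torch.tensor([[0., 1., 0.],
                        [1., 0., 1.],
                        [0., 1., 0.]])
    a_hat = normalize_adj(adj)
    layer = GCNLayer(in_dim=4, out_dim=8, energy_scale=1.2)
    out = layer(a_hat, torch.randn(3, 4))  # -> shape (3, 8)

Note that the parameter count of the layer is unchanged by the rescaling, which is consistent with the abstract's claim that the fixes do not alter the composition of parameters.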

