Decouple Graph Neural Networks: Train Multiple Simple GNNs Simultaneously Instead of One

04/20/2023
by   Hongyuan Zhang, et al.

Graph neural networks (GNNs) suffer from severe inefficiency, caused mainly by the exponential growth of node dependencies as the number of layers increases. This growth severely limits the use of stochastic optimization algorithms, so training a GNN is usually time-consuming. To address this problem, we propose to decouple a multi-layer GNN into multiple simple modules for more efficient training; the framework comprises classical forward training (FT) and a designed backward training (BT). Under the proposed framework, each module is simple enough to be trained efficiently in FT by stochastic algorithms without distorting the graph information. To avoid the purely unidirectional information delivery of FT, and to sufficiently train shallow modules together with the deeper ones, we develop a backward training mechanism that makes the earlier modules perceive the later ones. Backward training thus introduces reversed information delivery into the decoupled modules in addition to the forward delivery. To investigate how decoupling and greedy training affect representational capacity, we theoretically prove that, on unsupervised tasks, the error produced by linear modules does not accumulate in most cases. The theoretical and experimental results show that the proposed framework is highly efficient while achieving reasonable performance.
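The decoupled FT/BT scheme described above can be sketched in a few lines of numpy. This is an illustrative toy, not the paper's algorithm: each one-layer module is linear, its "stochastic training" is replaced by a closed-form ridge fit, and the backward-training step is simplified to refitting an earlier module against a target propagated back through the later modules' weights via a pseudo-inverse. All function names here are hypothetical.

```python
import numpy as np

def normalize_adj(A):
    """Symmetrically normalize an adjacency matrix with self-loops:
    S = D^{-1/2} (A + I) D^{-1/2}."""
    A = A + np.eye(A.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(A.sum(axis=1))
    return A * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def ridge_fit(X, Y, lam=1e-2):
    """Closed-form ridge regression, standing in for one cheaply
    (stochastically) trained linear module."""
    return np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ Y)

def decoupled_train(A, X, Y, n_modules=2):
    """Greedy forward training (FT) of one-layer linear GNN modules,
    followed by one simplified backward-training (BT) sweep in which
    earlier modules are refit against targets mapped back through the
    later modules. Illustrative sketch only."""
    S = normalize_adj(A)
    weights, H = [], X
    # --- FT: train each decoupled module in sequence, locally ---
    for _ in range(n_modules):
        H_in = S @ H                # one graph-propagation step
        W = ridge_fit(H_in, Y)      # local fit (stand-in for SGD)
        weights.append(W)
        H = H_in @ W                # output feeds the next module
    # --- BT: let earlier modules perceive the later ones ---
    for i in range(n_modules - 1):
        rest = weights[i + 1:]
        M = rest[0] if len(rest) == 1 else np.linalg.multi_dot(rest)
        # backward target: labels mapped back through downstream maps
        target = Y @ np.linalg.pinv(M)
        # recompute module i's input with the current earlier weights
        H = X
        for j in range(i):
            H = (S @ H) @ weights[j]
        weights[i] = ridge_fit(S @ H, target)
    # final forward pass through the refined modules
    H = X
    for W in weights:
        H = (S @ H) @ W
    return weights, H
```

Because each module is shallow, its receptive field stays small, so mini-batch (stochastic) updates would not suffer the exponential neighborhood blow-up that a deep end-to-end GNN incurs; the BT sweep is what keeps the greedy modules from drifting apart.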


Related research

- MLPInit: Embarrassingly Simple GNN Training Acceleration with MLP Initialization (09/30/2022). Training graph neural networks (GNNs) on large graphs is complex and ext...
- BiFeat: Supercharge GNN Training via Graph Feature Quantization (07/29/2022). Graph Neural Networks (GNNs) is a promising approach for applications wi...
- Network In Graph Neural Network (11/23/2021). Graph Neural Networks (GNNs) have shown success in learning from graph s...
- Eigen-GNN: A Graph Structure Preserving Plug-in for GNNs (06/08/2020). Graph Neural Networks (GNNs) are emerging machine learning models on gra...
- Boosting Distributed Full-graph GNN Training with Asynchronous One-bit Communication (03/02/2023). Training Graph Neural Networks (GNNs) on large graphs is challenging due...
- Analysis of Graph Neural Networks with Theory of Markov Chains (11/12/2022). In this paper, we provide a theoretical tool for the interpretation and ...
- Module-wise Training of Residual Networks via the Minimizing Movement Scheme (10/03/2022). Greedy layer-wise or module-wise training of neural networks is compelli...
