Redundancy-Free Computation Graphs for Graph Neural Networks

06/09/2019
by Zhihao Jia et al.

Graph Neural Networks (GNNs) are based on repeated aggregations of information from nodes' neighbors in a graph. However, because different nodes share common neighbors, this leads to repeated and wasteful computation. We propose Hierarchically Aggregated computation Graphs (HAGs), a new GNN graph representation that explicitly avoids this redundancy by hierarchically managing intermediate aggregation results, eliminating repeated computations and unnecessary data transfers in GNN training and inference. We introduce an accurate cost function to quantitatively evaluate the runtime performance of different HAGs and use a novel HAG search algorithm to find optimized HAGs. Experiments show that the HAG representation significantly outperforms the standard GNN graph representation: it increases end-to-end training throughput by up to 2.8x and reduces the number of aggregations and the amount of data transferred in GNN training by up to 6.3x and 5.6x, respectively, while maintaining the original model accuracy.
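
To make the idea concrete, here is a minimal Python sketch of the redundancy elimination that HAGs perform, assuming simple sum-aggregation over scalar node features; the toy graph, the feature values, and the agg_shared intermediate are illustrative assumptions, not the paper's actual implementation:

```python
# Toy graph: nodes 0 and 1 share the neighbor set {2, 3} and each have one
# private neighbor. Features are scalars for readability.
neighbors = {
    0: [2, 3, 4],
    1: [2, 3, 5],
}
h = {2: 1.0, 3: 2.0, 4: 4.0, 5: 8.0}  # current-layer node features

# Standard GNN aggregation: nodes 0 and 1 each re-sum the shared pair {2, 3}.
naive = {v: sum(h[u] for u in nbrs) for v, nbrs in neighbors.items()}

# HAG-style aggregation: compute the shared partial aggregate once and reuse
# it, trading one stored intermediate for fewer aggregation operations.
agg_shared = h[2] + h[3]  # hierarchical intermediate (hypothetical name)
hag = {
    0: agg_shared + h[4],
    1: agg_shared + h[5],
}

assert naive == hag  # identical results with fewer additions
```

A full HAG generalizes this pattern: many such intermediates are organized into a hierarchy (intermediates can themselves feed into larger aggregates), and the cost-guided search algorithm mentioned above decides which shared neighbor sets are worth materializing.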
