Influence-Based Mini-Batching for Graph Neural Networks

12/18/2022
by Johannes Gasteiger, et al.

Using graph neural networks for large graphs is challenging since there is no clear way of constructing mini-batches. To solve this, previous methods have relied on sampling or graph clustering. While these approaches often lead to good training convergence, they introduce significant overhead due to expensive random data accesses and perform poorly during inference. In this work we instead focus on model behavior during inference. We theoretically model batch construction via maximizing the influence score of nodes on the outputs. This formulation leads to optimal approximation of the output when we do not have knowledge of the trained model. We call the resulting method influence-based mini-batching (IBMB). IBMB accelerates inference by up to 130x compared to previous methods that reach similar accuracy. Remarkably, with adaptive optimization and the right training schedule, IBMB can also substantially accelerate training, thanks to precomputed batches and consecutive memory accesses. This results in up to 18x faster training per epoch and up to 17x faster convergence per runtime compared to previous methods.
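The abstract does not spell out the batch-construction algorithm, but a minimal sketch of the core idea is shown below: group output nodes into batches and attach, per batch, the auxiliary nodes with the highest aggregate influence on those outputs. Here personalized PageRank serves as the influence proxy (a common choice for GNN influence); the function names (`ppr_scores`, `build_batches`) and all parameters are illustrative assumptions, not the authors' reference implementation.

```python
# Hypothetical sketch of influence-based mini-batch construction, using
# personalized PageRank (PPR) as a proxy for node influence. All names
# and parameter values are illustrative, not the paper's implementation.
import numpy as np

def ppr_scores(adj, source, alpha=0.15, iters=50):
    """Approximate personalized PageRank from `source` via power iteration.

    adj: dense (n, n) adjacency matrix; source: int node index.
    Returns a length-n vector of influence scores w.r.t. `source`.
    """
    deg = np.maximum(adj.sum(axis=1), 1.0)
    trans = adj / deg[:, None]              # row-stochastic transition matrix
    teleport = np.zeros(adj.shape[0])
    teleport[source] = 1.0
    p = np.zeros_like(teleport)
    for _ in range(iters):
        p = alpha * teleport + (1 - alpha) * (trans.T @ p)
    return p

def build_batches(adj, output_nodes, batch_size=2, topk=4):
    """Group output nodes into fixed batches; for each batch, keep the
    auxiliary nodes with the highest summed influence on its outputs."""
    batches = []
    for i in range(0, len(output_nodes), batch_size):
        outs = output_nodes[i:i + batch_size]
        influence = sum(ppr_scores(adj, s) for s in outs)
        influence[outs] = np.inf            # always keep the output nodes
        aux = np.argsort(-influence)[:len(outs) + topk]
        batches.append((outs, np.sort(aux)))
    return batches

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    adj = (rng.random((12, 12)) < 0.25).astype(float)
    np.fill_diagonal(adj, 0)
    adj = np.maximum(adj, adj.T)            # undirected toy graph
    for outs, aux in build_batches(adj, list(range(6))):
        print("outputs", outs, "-> batch nodes", aux)
```

Because the batches are fixed before training starts, the node features of each batch can be stored contiguously and read sequentially, which is the source of the precomputed batches and consecutive memory accesses the abstract credits for the training speedup.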
