LMC: Fast Training of GNNs via Subgraph Sampling with Provable Convergence

02/02/2023 · by Zhihao Shi, et al.

Message passing-based graph neural networks (GNNs) have achieved great success in many real-world applications. However, training GNNs on large-scale graphs suffers from the well-known neighbor explosion problem: the number of node dependencies grows exponentially with the number of message passing layers. Subgraph-wise sampling methods, a promising class of mini-batch training techniques, discard messages outside the mini-batch in backward passes to avoid the neighbor explosion problem, at the expense of gradient estimation accuracy. This poses significant challenges to their convergence analysis and convergence speeds, which seriously limits their reliable deployment in real-world applications. To address this challenge, we propose a novel subgraph-wise sampling method with a convergence guarantee, namely Local Message Compensation (LMC). To the best of our knowledge, LMC is the first subgraph-wise sampling method with provable convergence. The key idea of LMC is to retrieve the discarded messages in backward passes based on a message passing formulation of backward passes. Through efficient and effective compensation for the discarded messages in both forward and backward passes, LMC computes accurate mini-batch gradients and thus accelerates convergence. We further show that LMC converges to first-order stationary points of GNNs. Experiments on large-scale benchmark tasks demonstrate that LMC significantly outperforms state-of-the-art subgraph-wise sampling methods in terms of efficiency.
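To make the compensation idea concrete, below is a minimal, hypothetical Python sketch (not the authors' implementation; all names, including forward_with_compensation, historical_h, and the toy graph, are illustrative assumptions). It shows one message passing layer restricted to a mini-batch, where messages from out-of-batch neighbors are approximated by cached historical embeddings rather than discarded; per the abstract, LMC applies an analogous compensation to the discarded messages in the backward pass as well.

```python
import numpy as np

def forward_with_compensation(batch, neighbors, x, W, historical_h):
    """One message passing layer restricted to a mini-batch.

    In-batch neighbors contribute exact messages; out-of-batch neighbors
    contribute cached (stale) embeddings as a compensation term, instead
    of being discarded.
    """
    h_new = {}
    for v in batch:
        agg = np.zeros_like(x[v])
        for u in neighbors[v]:
            if u in batch:
                agg += x[u]              # exact message from an in-batch neighbor
            else:
                agg += historical_h[u]   # compensation: stale out-of-batch embedding
        h_new[v] = np.tanh(W @ (x[v] + agg))
    for v in batch:                      # refresh the cache for in-batch nodes
        historical_h[v] = h_new[v]
    return h_new

# Toy usage on a 6-node cycle graph, with mini-batch {0, 1, 2}.
rng = np.random.default_rng(0)
d = 4
x = {v: rng.normal(size=d) for v in range(6)}
neighbors = {0: [1, 5], 1: [0, 2], 2: [1, 3], 3: [2, 4], 4: [3, 5], 5: [4, 0]}
historical_h = {v: np.zeros(d) for v in range(6)}
W = 0.1 * rng.normal(size=(d, d))
h = forward_with_compensation({0, 1, 2}, neighbors, x, W, historical_h)
```

Because the compensation terms are reads from a cache rather than recursive neighbor expansions, the per-batch cost stays bounded regardless of the number of layers, which is what lets this family of methods sidestep the neighbor explosion problem.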


Related Research

10/27/2021 · VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization
Most state-of-the-art Graph Neural Networks (GNNs) can be defined as a f...

11/30/2022 · Towards Training GNNs using Explanation Directed Message Passing
With the increasing use of Graph Neural Networks (GNNs) in critical real...

06/24/2020 · Minimal Variance Sampling with Provable Guarantees for Fast Training of Graph Neural Networks
Sampling methods (e.g., node-wise, layer-wise, or subgraph) have become a...

06/11/2021 · Global Neighbor Sampling for Mixed CPU-GPU Training on Giant Graphs
Graph neural networks (GNNs) are powerful tools for learning from graph ...

05/25/2023 · Union Subgraph Neural Networks
Graph Neural Networks (GNNs) are widely used for graph representation le...

05/25/2023 · Neural incomplete factorization: learning preconditioners for the conjugate gradient method
In this paper, we develop a novel data-driven approach to accelerate sol...

12/18/2022 · Influence-Based Mini-Batching for Graph Neural Networks
Using graph neural networks for large graphs is challenging since there ...
