Reducing Communication in Graph Neural Network Training

05/07/2020
by Alok Tripathy, et al.

Graph Neural Networks (GNNs) are powerful and flexible neural networks that use the naturally sparse connectivity information of the data. GNNs represent this connectivity as sparse matrices, which have lower arithmetic intensity and thus higher communication costs than dense matrices, making GNNs harder to scale to high concurrencies than convolutional or fully-connected neural networks. We present a family of parallel algorithms for training GNNs. These algorithms are based on their counterparts in dense and sparse linear algebra, but have not previously been applied to GNN training. We show that they can asymptotically reduce communication compared to existing parallel GNN training methods. We implement a promising and practical version based on 2D sparse-dense matrix multiplication using torch.distributed. Our implementation parallelizes over GPU-equipped clusters. We train GNNs on up to a hundred GPUs on datasets that include a protein network with over a billion edges.
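The kernel at the heart of this approach is the product of a sparse adjacency matrix A with a dense feature matrix H (one GNN layer propagates H_out = A H W). The sketch below illustrates a SUMMA-style 2D sparse-dense matmul over a q x q process grid with torch.distributed, in the spirit of the 2D algorithm the abstract describes. It is a minimal illustration under stated assumptions (a perfect-square process count, the CPU gloo backend, toy block sizes), not the authors' implementation; the helper names bcast_sparse and summa_spmm are hypothetical.

```python
# Sketch of a SUMMA-style 2D sparse-dense matmul (SpMM) for one GNN
# propagation step, H_out = A @ H, on a q x q process grid (p = q*q ranks).
# Illustrative only -- not the paper's code; bcast_sparse/summa_spmm are
# hypothetical names. Launch with: torchrun --nproc_per_node=4 this_file.py
import torch
import torch.distributed as dist

def bcast_sparse(block, src, group, shape):
    """Broadcast a sparse COO block from global rank `src` within `group`:
    first its nonzero count, then its index and value arrays."""
    me = dist.get_rank()
    if me == src:
        block = block.coalesce()
        idx = block.indices().contiguous()
        val = block.values().contiguous()
        meta = torch.tensor([val.size(0)])
    else:
        meta = torch.zeros(1, dtype=torch.long)
    dist.broadcast(meta, src=src, group=group)
    if me != src:
        nnz = int(meta.item())
        idx = torch.empty(2, nnz, dtype=torch.long)
        val = torch.empty(nnz)
    dist.broadcast(idx, src=src, group=group)
    dist.broadcast(val, src=src, group=group)
    return torch.sparse_coo_tensor(idx, val, shape)

def summa_spmm(A_ij, H_ij, q, i, j, row_group, col_group):
    """Rank (i, j) owns sparse block A[i][j] ((n/q) x (n/q)) and dense block
    H[i][j] ((n/q) x (f/q)); returns block (i, j) of A @ H in q stages."""
    bn, bf = H_ij.shape
    C = torch.zeros(bn, bf)
    for k in range(q):
        # Grid rank (i, k) broadcasts its A block along process row i ...
        A_ik = bcast_sparse(A_ij if j == k else None, src=i * q + k,
                            group=row_group, shape=(bn, bn))
        # ... and grid rank (k, j) broadcasts its H block along column j.
        H_kj = H_ij.clone() if i == k else torch.empty(bn, bf)
        dist.broadcast(H_kj, src=k * q + j, group=col_group)
        C += torch.sparse.mm(A_ik, H_kj)  # local SpMM on the received blocks
    return C

def main():
    dist.init_process_group("gloo")  # CPU backend so the sketch runs anywhere
    rank, p = dist.get_rank(), dist.get_world_size()
    q = int(p ** 0.5)
    assert q * q == p, "process count must be a perfect square"
    i, j = divmod(rank, q)
    # Every rank must participate in every new_group call.
    row_groups = [dist.new_group([r * q + c for c in range(q)]) for r in range(q)]
    col_groups = [dist.new_group([r * q + c for r in range(q)]) for c in range(q)]
    bn, bf = 8, 4  # toy block sizes: an (8*q)-node graph, (4*q)-dim features
    A_ij = (torch.rand(bn, bn) < 0.2).float().to_sparse()  # random sparse block
    H_ij = torch.randn(bn, bf)
    C_ij = summa_spmm(A_ij, H_ij, q, i, j, row_groups[i], col_groups[j])
    print(f"rank {rank} = grid ({i},{j}): output block {tuple(C_ij.shape)}")
    dist.destroy_process_group()

if __name__ == "__main__":
    main()
```

Each of the q stages moves one A block (roughly nnz(A)/p nonzeros) within a process row and one H block (roughly nf/p entries) within a process column, so per-rank communication scales as about O(nnz(A)/sqrt(p) + nf/sqrt(p)) words. This is the flavor of asymptotic reduction, relative to 1D schemes that gather entire feature rows, that the abstract refers to.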

Related research

07/07/2020
GE-SpMM: General-purpose Sparse Matrix-Matrix Multiplication on GPUs for Graph Neural Networks
Graph Neural Networks (GNNs) have achieved significant improvements in v...

07/24/2020
Hierarchical Protein Function Prediction with Tails-GNNs
Protein function prediction may be framed as predicting subgraphs (with ...

02/25/2023
RETEXO: Scalable Neural Network Training over Distributed Graphs
Graph neural networks offer a promising approach to supervised learning ...

04/25/2022
ClusterGNN: Cluster-based Coarse-to-Fine Graph Neural Network for Efficient Feature Matching
Graph Neural Networks (GNNs) with attention have been successfully appli...

03/15/2022
Distributed-Memory Sparse Kernels for Machine Learning
Sampled Dense Times Dense Matrix Multiplication (SDDMM) and Sparse Times...

10/19/2022
RSC: Accelerating Graph Neural Networks Training via Randomized Sparse Computations
The training of graph neural networks (GNNs) is extremely time consuming...

04/21/2021
Accelerating SpMM Kernel with Cache-First Edge Sampling for Graph Neural Networks
Graph neural networks (GNNs), an emerging deep learning model class, can...
