
Scalable Graph Neural Network Training: The Case for Sampling

05/05/2021
by Marco Serafini, et al.

Graph Neural Networks (GNNs) are a new and increasingly popular family of deep neural network architectures for learning on graphs. Training them efficiently is challenging due to the irregular nature of graph data. The problem becomes even harder when scaling to large graphs that exceed the memory capacity of a single device. Standard approaches to distributed DNN training, such as data and model parallelism, do not directly apply to GNNs. Instead, two different approaches have emerged in the literature: whole-graph and sample-based training. In this paper, we review and compare the two approaches. Scalability is challenging with both, but we make the case that research should focus on sample-based training, which we argue is the more promising approach. Finally, we review recent systems that support sample-based training.
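
To make the distinction concrete, below is a minimal sketch of sample-based mini-batch training using neighborhood sampling, written against PyTorch Geometric's NeighborLoader. The dataset (Cora), the per-layer fan-outs (10 then 5 sampled neighbors), and the two-layer GraphSAGE model are illustrative assumptions for this sketch, not the setup from the paper. Whole-graph training would instead compute model(data.x, data.edge_index) over the full graph at every step, which requires the entire graph and all layer activations to fit on one device.

    # Sketch: sample-based GNN training with neighborhood sampling.
    # Assumes PyTorch and PyTorch Geometric are installed; all
    # hyperparameters below are illustrative, not from the paper.
    import torch
    import torch.nn.functional as F
    from torch_geometric.datasets import Planetoid
    from torch_geometric.loader import NeighborLoader
    from torch_geometric.nn import SAGEConv

    class SAGE(torch.nn.Module):
        def __init__(self, in_dim, hid_dim, out_dim):
            super().__init__()
            self.conv1 = SAGEConv(in_dim, hid_dim)
            self.conv2 = SAGEConv(hid_dim, out_dim)

        def forward(self, x, edge_index):
            x = F.relu(self.conv1(x, edge_index))
            return self.conv2(x, edge_index)

    data = Planetoid(root="data", name="Cora")[0]

    # Sample a bounded number of neighbors per layer (10, then 5) so that
    # each mini-batch fits in device memory regardless of full-graph size.
    loader = NeighborLoader(
        data,
        num_neighbors=[10, 5],   # one fan-out per GNN layer
        batch_size=128,
        input_nodes=data.train_mask,
        shuffle=True,
    )

    model = SAGE(data.num_features, 64, int(data.y.max()) + 1)
    opt = torch.optim.Adam(model.parameters(), lr=0.01)

    model.train()
    for batch in loader:
        opt.zero_grad()
        out = model(batch.x, batch.edge_index)
        # NeighborLoader places the seed (target) nodes first in each
        # sampled subgraph, so the loss is taken on the first batch_size rows.
        loss = F.cross_entropy(out[:batch.batch_size],
                               batch.y[:batch.batch_size])
        loss.backward()
        opt.step()

Because each mini-batch touches only the sampled subgraph around its seed nodes, memory use is bounded by the batch size and fan-outs rather than by the size of the full graph, which is what makes sampling attractive for scaling beyond a single device.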
