(LA)yer-neigh(BOR) Sampling: Defusing Neighborhood Explosion in GNNs

10/24/2022
by   Muhammed Fatih Balin, et al.

Graph Neural Networks (GNNs) have recently received significant attention; however, training them at large scale remains a challenge. Minibatch training coupled with sampling is used to alleviate this challenge. Even so, existing approaches either suffer from the neighborhood explosion phenomenon or deliver poor performance. To address these issues, we propose a new sampling algorithm called LAyer-neighBOR sampling (LABOR). It is designed as a direct replacement for Neighbor Sampling with the same fanout hyperparameter, while sampling far fewer vertices without sacrificing quality. By design, the variance of the estimator for each vertex matches that of Neighbor Sampling from the perspective of a single vertex. In our experiments, we demonstrate the superior model convergence behaviour of our approach compared to Neighbor Sampling and other layer sampling approaches under the same limited vertex sampling budget constraints.
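To make the core idea concrete, below is a minimal Python sketch of the shared-random-variate trick behind layer-neighbor-style sampling. This is an illustrative simplification, not the paper's exact LABOR algorithm (which additionally uses importance weights to match Neighbor Sampling's variance); the names `adj`, `seeds`, and `fanout` are our own. Each candidate neighbor draws a single uniform variate that is shared across all seed vertices in the layer, so neighbors appearing in several overlapping neighborhoods tend to be kept or dropped together, shrinking the set of unique sampled vertices relative to independent per-seed sampling.

```python
import random
from collections import defaultdict

def labor_like_sample(adj, seeds, fanout, seed=0):
    """Illustrative layer-neighbor sampling sketch (hypothetical helper,
    not the paper's exact algorithm).

    Each candidate neighbor t draws ONE shared uniform variate r_t; a seed
    vertex s with degree d keeps neighbor t if r_t <= fanout / d.  Sharing
    r_t across seeds lets overlapping neighborhoods reuse the same sampled
    vertices, while each seed still keeps about `fanout` neighbors in
    expectation.
    """
    rng = random.Random(seed)
    # One shared variate per candidate neighbor, drawn lazily on first access.
    r = defaultdict(rng.random)
    sampled = {}
    for s in seeds:
        nbrs = adj[s]
        d = len(nbrs)
        thresh = min(1.0, fanout / d) if d else 0.0
        sampled[s] = [t for t in nbrs if r[t] <= thresh]
    return sampled

# Two seeds with heavily overlapping neighborhoods: vertices 2 and 3 share
# their variates, so whenever both seeds use the same threshold they keep
# or drop these shared neighbors together.
adj = {0: [1, 2, 3], 1: [2, 3, 4]}
print(labor_like_sample(adj, [0, 1], fanout=1, seed=42))
```

Under plain Neighbor Sampling, each seed would instead draw independent variates for its own edges, so the sampled neighborhoods of nearby seeds would not overlap by design; the sharing above is what keeps the number of unique vertices per layer from exploding.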


Related research

11/17/2019 · Layer-Dependent Importance Sampling for Training Deep and Large Graph Convolutional Networks
Graph convolutional networks (GCNs) have recently received wide attentio...

02/03/2023 · LazyGNN: Large-Scale Graph Neural Networks via Lazy Propagation
Recent works have demonstrated the benefits of capturing long-distance d...

05/04/2023 · Communication-Efficient Graph Neural Networks with Probabilistic Neighborhood Expansion Analysis and Caching
Training and inference with graph neural networks (GNNs) on massive grap...

10/27/2021 · VQ-GNN: A Universal Framework to Scale up Graph Neural Networks using Vector Quantization
Most state-of-the-art Graph Neural Networks (GNNs) can be defined as a f...

09/02/2022 · Rethinking Efficiency and Redundancy in Training Large-scale Graphs
Large-scale graphs are ubiquitous in real-world scenarios and can be tra...

04/29/2019 · Advancing GraphSAGE with A Data-Driven Node Sampling
As an efficient and scalable graph neural network, GraphSAGE has enabled...

06/10/2020 · Bandit Samplers for Training Graph Neural Networks
Several sampling algorithms with variance reduction have been proposed f...
