On the Importance of Sampling in Learning Graph Convolutional Networks

03/03/2021 ∙ by Weilin Cong, et al.
Graph Convolutional Networks (GCNs) have achieved impressive empirical advances across a wide variety of graph-related applications. Despite this success, training GCNs on large graphs suffers from computational and memory issues. A potential path around these obstacles is sampling-based methods, where at each layer a subset of nodes is sampled. Although recent studies have empirically demonstrated the effectiveness of sampling-based methods, these works lack theoretical convergence guarantees under realistic settings and cannot fully leverage the information of evolving parameters during optimization. In this paper, we describe and analyze a general doubly variance reduction schema that can accelerate any sampling method under a given memory budget. The proposed schema is motivated by a careful analysis of the variance of sampling methods, which shows that the induced variance can be decomposed into node embedding approximation variance (zeroth-order variance) during forward propagation and layerwise gradient variance (first-order variance) during backward propagation. We theoretically analyze the convergence of the proposed schema and show that it enjoys an π’ͺ(1/T) convergence rate. We complement our theoretical results by integrating the proposed schema into different sampling methods and applying them to large real-world graphs. Code is publicly available at <https://github.com/CongWeilin/SGCN.git>.
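To make the zeroth-order half of the decomposition concrete, below is a minimal sketch in PyTorch of variance reduction via historical embeddings used as control variates: the sampled aggregation is applied only to the residual between fresh and cached (stale) embeddings, while the cached part is aggregated exactly, so sampling noise shrinks as training converges. This is an illustrative construction based on the abstract, not the authors' implementation; the class name `VRGCNLayer`, the argument layout, and the cache-refresh policy are all assumptions.

```python
import torch
import torch.nn as nn


class VRGCNLayer(nn.Module):
    """One graph-convolution layer with zeroth-order variance reduction:
    the sampled aggregation of fresh embeddings is corrected by a control
    variate built from cached historical embeddings (hypothetical sketch)."""

    def __init__(self, in_dim: int, out_dim: int, num_nodes: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        # Historical embedding for every node; refreshed whenever a node
        # appears in a sampled mini-batch.
        self.register_buffer("hist", torch.zeros(num_nodes, in_dim))

    def forward(self, h, adj_sampled, agg_hist_exact, in_idx):
        # h:              [n_in, in_dim]  fresh embeddings of sampled input nodes
        # adj_sampled:    [n_out, n_in]   sampled rows/cols of the normalized adjacency
        # agg_hist_exact: [n_out, in_dim] exact aggregation of the historical
        #                                 embeddings for the output nodes,
        #                                 precomputed from the cache
        # in_idx:         [n_in]          global ids of the sampled input nodes
        stale = self.hist[in_idx]
        # Sampling noise now only affects the residual (h - stale), which
        # vanishes as embeddings stabilize, reducing the zeroth-order variance.
        agg = adj_sampled @ (h - stale) + agg_hist_exact
        self.hist[in_idx] = h.detach()  # refresh the cache
        return torch.relu(self.linear(agg))
```

Applying the same control-variate construction to the stochastic layerwise gradients during backward propagation would give the first-order half of the doubly variance reduction; the repository linked above contains the authors' actual implementation.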


Related research

∙ Minimal Variance Sampling with Provable Guarantees for Fast Training of Graph Neural Networks (06/24/2020)
∙ Layer-Dependent Importance Sampling for Training Deep and Large Graph Convolutional Networks (11/17/2019)
∙ GraphSAINT: Graph Sampling Based Inductive Learning Method (07/10/2019)
∙ Generalization Guarantee of Training Graph Convolutional Networks with Graph Topology Sampling (07/07/2022)
∙ Adaptive Sampling Towards Fast Graph Representation Learning (09/14/2018)
∙ PA&DA: Jointly Sampling PAth and DAta for Consistent NAS (02/28/2023)
∙ Convergence and Stability of Graph Convolutional Networks on Large Random Graphs (06/02/2020)
