
On the Importance of Sampling in Learning Graph Convolutional Networks

by Weilin Cong, et al.

Graph Convolutional Networks (GCNs) have achieved impressive empirical advances across a wide variety of graph-related applications. Despite their great success, training GCNs on large graphs suffers from computational and memory issues. A potential path to circumvent these obstacles is sampling-based methods, where at each layer a subset of nodes is sampled. Although recent studies have empirically demonstrated the effectiveness of sampling-based methods, these works lack theoretical convergence guarantees under realistic settings and cannot fully leverage the information of evolving parameters during optimization. In this paper, we describe and analyze a general doubly variance reduction schema that can accelerate any sampling method under a given memory budget. The proposed schema is motivated by a careful analysis of the variance of sampling methods, which shows that the induced variance can be decomposed into node embedding approximation variance (zeroth-order variance) during forward propagation and layerwise gradient variance (first-order variance) during backward propagation. We theoretically analyze the convergence of the proposed schema and show that it enjoys an 𝒪(1/T) convergence rate. We complement our theoretical results by integrating the proposed schema into different sampling methods and applying them to various large real-world graphs. Code is publicly available at <>.
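The zeroth-order (embedding approximation) variance mentioned above can be illustrated with a small control-variate sketch in the spirit of historical-embedding methods. This is a toy NumPy illustration, not the paper's released code: the function names (`sampled_propagate`, `vr_layer`) and the stale-embedding setup are our own assumptions for exposition.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: random adjacency with self-loops, row-normalized into the
# propagation matrix P used by one GCN layer (H_out = P @ H @ W).
n, d = 8, 4
A = (rng.random((n, n)) < 0.4).astype(float)
np.fill_diagonal(A, 1.0)
P = A / A.sum(axis=1, keepdims=True)

X = rng.standard_normal((n, d))   # input node features
W = rng.standard_normal((d, d))   # layer weights

def sampled_propagate(P, H, k, rng):
    """Unbiased but noisy estimate of P @ H: for each node, average the
    embeddings of k neighbors drawn with probability proportional to P's row."""
    n = P.shape[0]
    out = np.zeros_like(H)
    for v in range(n):
        row_sum = P[v].sum()                       # 1.0 here (rows normalized)
        idx = rng.choice(n, size=k, p=P[v] / row_sum)
        out[v] = row_sum * H[idx].mean(axis=0)
    return out

def plain_layer(P, H, W, k, rng):
    """Sampling-based layer: estimation noise scales with the size of H."""
    return sampled_propagate(P, H, k, rng) @ W

def vr_layer(P, H, H_hist, W, k, rng):
    """Control-variate layer: sample only the residual H - H_hist and add
    back the exact term P @ H_hist, so noise scales with ||H - H_hist||."""
    return (sampled_propagate(P, H - H_hist, k, rng) + P @ H_hist) @ W

# Compare mean squared error against the exact full-batch layer output.
exact = P @ X @ W
H_hist = X + 0.05 * rng.standard_normal(X.shape)   # stale-but-close embeddings
mse_plain = np.mean([np.mean((plain_layer(P, X, W, 2, rng) - exact) ** 2)
                     for _ in range(300)])
mse_vr = np.mean([np.mean((vr_layer(P, X, H_hist, W, 2, rng) - exact) ** 2)
                  for _ in range(300)])
```

Because the sampled term in `vr_layer` involves only the residual H − H_hist, its variance shrinks as the historical embeddings track the true ones, which is the intuition behind reducing the zeroth-order variance during forward propagation; a symmetric argument applies to the first-order (gradient) variance during backward propagation.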



Related research:

- Minimal Variance Sampling with Provable Guarantees for Fast Training of Graph Neural Networks (06/24/2020)
- Layer-Dependent Importance Sampling for Training Deep and Large Graph Convolutional Networks (11/17/2019)
- GraphSAINT: Graph Sampling Based Inductive Learning Method (07/10/2019)
- Generalization Guarantee of Training Graph Convolutional Networks with Graph Topology Sampling (07/07/2022)
- Adaptive Sampling Towards Fast Graph Representation Learning (09/14/2018)
- FastGCN: Fast Learning with Graph Convolutional Networks via Importance Sampling (01/30/2018)
- PA&DA: Jointly Sampling PAth and DAta for Consistent NAS (02/28/2023)