Joint Edge-Model Sparse Learning is Provably Efficient for Graph Neural Networks

02/06/2023
by Shuai Zhang, et al.

Due to the significant computational challenge of training large-scale graph neural networks (GNNs), various sparse learning techniques have been explored to reduce memory and storage costs. Examples include graph sparsification, which samples a subgraph to reduce the amount of data aggregation, and model sparsification, which prunes the neural network to reduce the number of trainable weights. Despite empirical success in reducing the training cost while maintaining test accuracy, a theoretical generalization analysis of sparse learning for GNNs remains elusive. To the best of our knowledge, this paper provides the first theoretical characterization of joint edge-model sparse learning from the perspective of sample complexity and convergence rate in achieving zero generalization error. It proves analytically that both sampling important nodes and pruning the lowest-magnitude neurons can reduce the sample complexity and improve convergence without compromising test accuracy. Although the analysis is centered on two-layer GNNs with structural constraints on the data, the insights apply to more general setups and are justified on both synthetic and real-world citation datasets.
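The two sparsification steps the abstract describes can be illustrated with a minimal sketch. Note the details below are assumptions for illustration only: the edge-importance score (feature similarity of endpoints), the mean aggregation, the network sizes, and the pruning ratio are not taken from the paper, which analyzes a specific two-layer GNN under structural data constraints.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (sizes are illustrative assumptions, not from the paper):
# N nodes with d-dimensional features, a symmetric adjacency matrix,
# and a two-layer GNN: aggregate -> ReLU hidden layer -> linear readout.
N, d, hidden = 12, 8, 16
X = rng.normal(size=(N, d))
A = (rng.random((N, N)) < 0.3).astype(float)
A = np.triu(A, 1); A = A + A.T              # undirected, no self-loops
W1 = rng.normal(size=(d, hidden))
w2 = rng.normal(size=hidden)

def edge_sparsify(A, X, keep_ratio=0.5):
    """Graph sparsification: keep only the highest-scoring edges per node.

    As a stand-in importance score we use the feature similarity of the
    endpoints; the paper's notion of 'important nodes' may differ.
    """
    scores = A * (X @ X.T)                  # score only existing edges
    A_sparse = np.zeros_like(A)
    for i in range(A.shape[0]):
        nbrs = np.flatnonzero(A[i])
        if nbrs.size == 0:
            continue
        k = max(1, int(keep_ratio * nbrs.size))
        keep = nbrs[np.argsort(scores[i, nbrs])[-k:]]
        A_sparse[i, keep] = 1.0
    return np.maximum(A_sparse, A_sparse.T)  # restore symmetry

def prune_neurons(W1, w2, keep_ratio=0.5):
    """Model sparsification: drop the hidden neurons with the smallest
    weight magnitude (column norm of W1), i.e. magnitude pruning."""
    norms = np.linalg.norm(W1, axis=0)
    k = max(1, int(keep_ratio * W1.shape[1]))
    keep = np.argsort(norms)[-k:]
    return W1[:, keep], w2[keep]

def gnn_forward(A, X, W1, w2):
    """Two-layer GNN: mean neighbor aggregation, ReLU, linear output."""
    deg = np.maximum(A.sum(1, keepdims=True), 1.0)
    H = (A @ X) / deg                        # aggregate neighbor features
    return np.maximum(H @ W1, 0.0) @ w2      # per-node scalar prediction

A_s = edge_sparsify(A, X)                    # fewer edges to aggregate
W1_s, w2_s = prune_neurons(W1, w2)           # fewer trainable weights
y = gnn_forward(A_s, X, W1_s, w2_s)
```

In this sketch both steps shrink the cost of a forward pass: edge sparsification reduces the nonzeros in the aggregation `A @ X`, while neuron pruning halves the hidden width. The paper's claim is that, under its assumptions, doing both jointly also helps sample complexity and convergence rather than hurting test accuracy.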


Related research

- 12/07/2020: Learning Graph Neural Networks with Approximate Gradient Descent
  The first provably efficient algorithm for learning graph neural network...
- 10/12/2021: Why Lottery Ticket Wins? A Theoretical Perspective of Sample Complexity on Pruned Neural Networks
  The lottery ticket hypothesis (LTH) states that learning on a properly p...
- 06/09/2021: Scaling Up Graph Neural Networks Via Graph Coarsening
  Scalability of graph neural networks remains one of the major challenges...
- 06/29/2021: Subgroup Generalization and Fairness of Graph Neural Networks
  Despite enormous successful applications of graph neural networks (GNNs)...
- 05/14/2023: Towards Understanding the Generalization of Graph Neural Networks
  Graph neural networks (GNNs) are the most widely adopted model in graph-...
- 11/16/2016: Net-Trim: Convex Pruning of Deep Neural Networks with Performance Guarantee
  We introduce and analyze a new technique for model reduction for deep ne...
- 06/25/2020: Fast Learning of Graph Neural Networks with Guaranteed Generalizability: One-hidden-layer Case
  Although graph neural networks (GNNs) have made great progress recently ...
