Stability and Generalization of Graph Convolutional Neural Networks

05/03/2019
by   Saurabh Verma, et al.

Inspired by convolutional neural networks on 1D and 2D data, graph convolutional neural networks (GCNNs) have been developed for various learning tasks on graph data, and have shown superior performance on real-world datasets. Despite their success, theoretical explorations of GCNN models, such as their generalization properties, remain scarce. In this paper, we take a first step towards developing a deeper theoretical understanding of GCNN models by analyzing the stability of single-layer GCNN models and deriving their generalization guarantees in a semi-supervised graph learning setting. In particular, we show that the algorithmic stability of a GCNN model depends upon the largest absolute eigenvalue of its graph convolution filter. Moreover, to ensure the uniform stability needed to provide strong generalization guarantees, the largest absolute eigenvalue must be independent of the graph size. Our results offer new insights into the design of new and improved graph convolution filters with guaranteed algorithmic stability. We evaluate the generalization gap and stability on various real-world graph datasets and show that the empirical results indeed support our theoretical findings. To the best of our knowledge, we are the first to study stability bounds on graph learning in a semi-supervised setting and derive generalization bounds for GCNN models.
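The abstract's stability criterion can be checked numerically. The sketch below (an illustration, not the paper's code) uses the symmetrically normalized filter with self-loops from Kipf and Welling's GCN, g = D^{-1/2}(A + I)D^{-1/2}, whose eigenvalues always lie in [-1, 1]; its largest absolute eigenvalue therefore stays bounded by 1 regardless of graph size, which is exactly the size-independence property the paper requires for uniform stability. The function names are my own.

```python
import numpy as np

def normalized_filter(adj):
    """Symmetrically normalized graph convolution filter with self-loops,
    g = D^{-1/2} (A + I) D^{-1/2} (the Kipf-Welling GCN filter; one
    illustrative choice among the filters the paper analyzes)."""
    a_hat = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_hat.sum(axis=1))
    return a_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def max_abs_eigenvalue(filt):
    """Largest absolute eigenvalue of the filter, the quantity that the
    paper shows controls the GCNN's algorithmic stability."""
    return np.max(np.abs(np.linalg.eigvalsh(filt)))

# Random undirected graphs of increasing size: the largest absolute
# eigenvalue of this filter remains bounded by 1, independent of n.
rng = np.random.default_rng(0)
for n in (10, 100, 1000):
    upper = np.triu(rng.integers(0, 2, size=(n, n)), 1)
    adj = (upper + upper.T).astype(float)  # symmetric, zero diagonal
    lam = max_abs_eigenvalue(normalized_filter(adj))
    print(n, round(lam, 4))
```

By contrast, an unnormalized filter such as the raw adjacency matrix has a largest absolute eigenvalue that grows with node degree (and hence typically with graph size), so by the paper's result it does not yield size-independent uniform stability.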


Related research

- 02/20/2021: Generalization bounds for graph convolutional neural networks via Rademacher complexity
  This paper aims at studying the sample complexity of graph convolutional...
- 08/04/2023: Stability and Generalization of Hypergraph Collaborative Networks
  Graph neural networks have been shown to be very effective in utilizing ...
- 01/31/2023: Semi-Supervised Classification with Graph Convolutional Kernel Machines
  We present a deep Graph Convolutional Kernel Machine (GCKM) for semi-sup...
- 10/27/2020: Toward Better Generalization Bounds with Locally Elastic Stability
  Classical approaches in learning theory are often seen to yield very loo...
- 10/28/2021: On Provable Benefits of Depth in Training Graph Convolutional Networks
  Graph Convolutional Networks (GCNs) are known to suffer from performance...
- 09/06/2018: Wasserstein Soft Label Propagation on Hypergraphs: Algorithm and Generalization Error Bounds
  Inspired by recent interests of developing machine learning and data min...
- 02/18/2021: Interpretable Stability Bounds for Spectral Graph Filters
  Graph-structured data arise in a variety of real-world context ranging f...
