Generalization bounds for graph convolutional neural networks via Rademacher complexity

02/20/2021
by   Shaogao Lv, et al.

This paper studies the sample complexity of graph convolutional networks (GCNs) by providing tight upper bounds on the Rademacher complexity of GCN models with a single hidden layer. Under regularity conditions, these derived complexity bounds explicitly depend on the largest eigenvalue of the graph convolution filter and the degree distribution of the graph. Moreover, we provide a lower bound on the Rademacher complexity of GCNs to show the optimality of our derived upper bounds. Taking two commonly used examples as representatives, we discuss the implications of our results for designing graph convolution filters and graph distributions.
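To make the two quantities in the bounds concrete, here is a minimal sketch (not code from the paper) that computes them for the standard symmetric normalized GCN filter \hat{A} = D^{-1/2}(A+I)D^{-1/2} of Kipf and Welling; the toy path graph and the helper name `gcn_filter` are illustrative assumptions.

```python
import numpy as np

def gcn_filter(adj):
    """Symmetric normalized GCN filter \\hat{A} = D^{-1/2}(A+I)D^{-1/2}.

    `adj` is a dense symmetric adjacency matrix; self-loops are added
    before normalization, as in the standard GCN construction.
    """
    a_tilde = adj + np.eye(adj.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a_tilde.sum(axis=1))
    return a_tilde * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

# A toy 4-node path graph (illustrative, not from the paper).
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)

filt = gcn_filter(adj)
lam_max = np.max(np.linalg.eigvalsh(filt))  # largest eigenvalue of the filter
degrees = adj.sum(axis=1)                   # degree distribution of the graph

# For this filter the largest eigenvalue is always exactly 1, since
# \hat{A} is similar to a row-stochastic matrix.
print(lam_max)   # → 1.0 (up to floating-point error)
print(degrees)   # → [1. 2. 2. 1.]
```

For other filter choices (e.g. unnormalized A + I, or higher-order polynomial filters), the largest eigenvalue can grow with the maximum degree, which is what makes the eigenvalue and the degree distribution the natural graph-dependent quantities in such bounds.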



05/03/2019

Stability and Generalization of Graph Convolutional Neural Networks

Inspired by convolutional neural networks on 1D and 2D data, graph convo...
06/10/2021

Simple Graph Convolutional Networks

Many neural networks for graphs are based on the graph convolution opera...
11/23/2018

On Filter Size in Graph Convolutional Networks

Recently, many researchers have been focusing on the definition of neura...
04/05/2012

Distribution-Dependent Sample Complexity of Large Margin Learning

We obtain a tight distribution-specific characterization of the sample c...
03/15/2022

Graph Neural Network Sensitivity Under Probabilistic Error Model

Graph convolutional networks (GCNs) can successfully learn the graph sig...
10/20/2021

The Performance of the MLE in the Bradley-Terry-Luce Model in ℓ_∞-Loss and under General Graph Topologies

The Bradley-Terry-Luce (BTL) model is a popular statistical approach for...
08/24/2021

Adaptive and Interpretable Graph Convolution Networks Using Generalized Pagerank

We investigate adaptive layer-wise graph convolution in deep GCN models....