Generalization bounds for graph convolutional neural networks via Rademacher complexity

02/20/2021
by   Shaogao Lv, et al.

This paper studies the sample complexity of graph convolutional networks (GCNs) by providing tight upper bounds on the Rademacher complexity of GCN models with a single hidden layer. Under regularity conditions, these derived complexity bounds depend explicitly on the largest eigenvalue of the graph convolution filter and on the degree distribution of the graph. In addition, we provide a lower bound on the Rademacher complexity of GCNs to show the optimality of our derived upper bounds. Taking two commonly used examples as representatives, we discuss the implications of our results for designing graph convolution filters and graph distributions.
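As a minimal sketch (not taken from the paper), the snippet below computes the two graph quantities the stated bounds depend on: the largest eigenvalue of a standard symmetric normalized GCN filter, here assumed to be A_hat = D^{-1/2}(A + I)D^{-1/2} as in Kipf and Welling's GCN, and the node degree sequence. The example graph and function name are illustrative assumptions.

```python
import numpy as np

def gcn_filter_spectrum_and_degrees(A: np.ndarray):
    """Return the largest eigenvalue of the normalized GCN filter and the node degrees."""
    n = A.shape[0]
    A_tilde = A + np.eye(n)                          # add self-loops (standard GCN convention)
    d_tilde = A_tilde.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d_tilde))
    A_hat = D_inv_sqrt @ A_tilde @ D_inv_sqrt        # symmetric normalized convolution filter
    lambda_max = np.max(np.linalg.eigvalsh(A_hat))   # largest eigenvalue of the filter
    return lambda_max, A.sum(axis=1)                 # degrees of the original graph

# Toy usage on a 4-node path graph (illustrative only).
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
lam, deg = gcn_filter_spectrum_and_degrees(A)
print(f"largest filter eigenvalue: {lam:.4f}, degree sequence: {deg}")
```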
