From Spectral Graph Convolutions to Large Scale Graph Convolutional Networks

07/12/2022
by Matteo Bunino, et al.

Graph Convolutional Networks (GCNs) have proven to be a powerful concept that has been successfully applied to a large variety of tasks across many domains in recent years. In this work we study the theory that paved the way to the definition of GCNs, including the relevant parts of classical graph theory. We also discuss and experimentally demonstrate key properties and limitations of GCNs, such as those caused by the statistical dependency between samples that the edges of the graph introduce, which biases the estimates of the full gradient. Another limitation we discuss is the negative impact of minibatch sampling on model performance. As a consequence, gradients are computed on the whole dataset during each parameter update, undermining scalability to large graphs. To address this, we investigate alternative methods that allow good parameters to be learned safely while sampling only a subset of the data per iteration. We reproduce the results reported by Kipf et al. and propose an implementation inspired by SIGN, a sampling-free minibatch method. Finally, we compare the two implementations on a benchmark dataset, showing that they are comparable in terms of prediction accuracy on the task of semi-supervised node classification.
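As a rough illustration of the contrast discussed above: in a full-batch GCN, the layer-wise propagation rule of Kipf and Welling, H^(l+1) = sigma(D_hat^{-1/2} A_hat D_hat^{-1/2} H^(l) W^(l)) with A_hat = A + I, couples every node to its neighbours, so gradients depend on the whole graph. A SIGN-style approach instead precomputes diffused features such as A_hat X, A_hat^2 X, ... once, after which nodes can be treated as (approximately) independent samples and trained with ordinary minibatch SGD. The sketch below is illustrative only and not the authors' implementation; it assumes PyTorch and SciPy, and names such as precompute_sign_features and SignStyleModel are hypothetical.

    # Illustrative sketch (not the authors' code) of SIGN-style preprocessing
    # that turns graph nodes into independent training samples.
    import numpy as np
    import scipy.sparse as sp
    import torch.nn as nn

    def normalized_adjacency(adj: sp.csr_matrix) -> sp.csr_matrix:
        """Symmetric normalization with self-loops: D_hat^{-1/2} (A + I) D_hat^{-1/2}."""
        adj_hat = adj + sp.eye(adj.shape[0], format="csr")
        deg = np.asarray(adj_hat.sum(axis=1)).ravel()
        d_inv_sqrt = sp.diags(np.power(deg, -0.5))
        return (d_inv_sqrt @ adj_hat @ d_inv_sqrt).tocsr()

    def precompute_sign_features(adj: sp.csr_matrix, x: np.ndarray, hops: int = 2) -> np.ndarray:
        """SIGN-style preprocessing: stack [X, A_hat X, A_hat^2 X, ...] once, before training."""
        a_hat = normalized_adjacency(adj)
        feats = [x]
        for _ in range(hops):
            feats.append(np.asarray(a_hat @ feats[-1]))
        return np.concatenate(feats, axis=1)  # shape: (n_nodes, (hops + 1) * n_features)

    class SignStyleModel(nn.Module):
        """After precomputation there is no graph structure left in the model,
        so this plain MLP can be trained with standard minibatch SGD."""
        def __init__(self, in_dim: int, hidden: int, n_classes: int):
            super().__init__()
            self.net = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU(),
                                     nn.Linear(hidden, n_classes))

        def forward(self, x):
            return self.net(x)

In such a setup only the node-independent MLP is trained, so each minibatch touches just the sampled rows of the precomputed feature matrix rather than the full graph, which is what makes the approach scale to large graphs.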


