Representation Power of Graph Convolutions: Neural Tangent Kernel Analysis

10/18/2022
by   Mahalakshmi Sabanayagam, et al.

The fundamental principle of Graph Neural Networks (GNNs) is to exploit the structural information of the data by aggregating neighboring nodes using a graph convolution. Understanding the influence of this convolution on network performance is therefore crucial. Convolutions based on the graph Laplacian have emerged as the dominant choice, with the symmetric normalization of the adjacency matrix A, defined as D^{-1/2} A D^{-1/2} where D is the degree matrix, being the most widely adopted. However, some empirical studies show that row normalization D^{-1} A outperforms it in node classification. Despite the widespread use of GNNs, there is no rigorous theoretical study of the representation power of these convolution operators that could explain this behavior. In this work, we analyze the influence of graph convolutions theoretically using the Graph Neural Tangent Kernel in a semi-supervised node classification setting. Under a Degree Corrected Stochastic Block Model, we prove that: (i) row normalization preserves the underlying class structure better than other convolutions; (ii) performance degrades with network depth due to over-smoothing, but the loss of class information is slowest under row normalization; (iii) skip connections retain the class information even at infinite depth, thereby eliminating over-smoothing. We finally validate our theoretical findings on real datasets.
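The two convolution operators compared in the abstract, and the over-smoothing effect of depth, can be illustrated in a few lines. The following is a minimal NumPy sketch on a hypothetical 4-node graph (the adjacency matrix and the depth of 100 are illustrative choices, not taken from the paper):

```python
import numpy as np

# Hypothetical 4-node undirected graph (illustrative example).
A = np.array([[0., 1., 1., 0.],
              [1., 0., 1., 0.],
              [1., 1., 0., 1.],
              [0., 0., 1., 0.]])

deg = A.sum(axis=1)                    # node degrees (diagonal of D)
D_inv = np.diag(1.0 / deg)             # D^-1
D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # D^-1/2

S_sym = D_inv_sqrt @ A @ D_inv_sqrt    # symmetric normalization D^-1/2 A D^-1/2
S_row = D_inv @ A                      # row normalization D^-1 A

# Row normalization yields a random-walk matrix: every row sums to 1,
# so each node's update is an average over its neighbors.
print(S_row.sum(axis=1))

# Over-smoothing: applying the row-normalized convolution many times
# drives every row toward the same degree-weighted stationary
# distribution, washing out any class structure in the features.
deep = np.linalg.matrix_power(S_row, 100)
stationary = deg / deg.sum()
print(deep[0])
print(stationary)
```

The last two printed vectors nearly coincide: after many aggregation steps all nodes see essentially the same mixed signal, which is the depth-induced degradation the paper quantifies (and which skip connections are shown to avoid).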


