A Non-Asymptotic Analysis of Oversmoothing in Graph Neural Networks

12/21/2022
by Xinyi Wu, et al.

A central challenge of building more powerful Graph Neural Networks (GNNs) is the oversmoothing phenomenon, where increasing the network depth leads to homogeneous node representations and thus worse classification performance. While previous works have only demonstrated that oversmoothing is inevitable when the number of graph convolutions tends to infinity, in this paper, we precisely characterize the mechanism behind the phenomenon via a non-asymptotic analysis. Specifically, we distinguish between two different effects when applying graph convolutions – an undesirable mixing effect that homogenizes node representations in different classes, and a desirable denoising effect that homogenizes node representations in the same class. By quantifying these two effects on random graphs sampled from the Contextual Stochastic Block Model (CSBM), we show that oversmoothing happens once the mixing effect starts to dominate the denoising effect, and the number of layers required for this transition is O(log N / log log N) for sufficiently dense graphs with N nodes. We also extend our analysis to study the effects of Personalized PageRank (PPR) on oversmoothing. Our results suggest that while PPR mitigates oversmoothing at deeper layers, PPR-based architectures still achieve their best performance at a shallow depth and are outperformed by the graph convolution approach on certain graphs. Finally, we support our theoretical results with numerical experiments, which further suggest that the oversmoothing phenomenon observed in practice may be exacerbated by the difficulty of optimizing deep GNN models.
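The mixing-versus-denoising dichotomy described in the abstract can be sketched numerically. The snippet below is a minimal illustration, not the paper's construction: all parameter values are our own illustrative choices. It samples a two-class CSBM graph, applies repeated symmetrically normalized graph convolutions (the standard GCN operator), and tracks a within-class spread ("noise") against the between-class mean distance ("signal"); it also runs an APPNP-style power-iteration approximation of PPR propagation for comparison.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample a two-block graph from the Contextual Stochastic Block Model (CSBM).
# All parameter values below are illustrative choices, not taken from the paper.
n, p, q, d, sigma = 100, 0.10, 0.02, 8, 1.0  # nodes/class, edge probs, feat dim, noise
labels = np.repeat([0, 1], n)
same = labels[:, None] == labels[None, :]
upper = np.triu(rng.random((2 * n, 2 * n)) < np.where(same, p, q), 1)
A = (upper | upper.T).astype(float)          # symmetric adjacency, no self-loops

# Features: class mean +mu or -mu plus isotropic Gaussian noise.
mu = rng.normal(size=d)
mu /= np.linalg.norm(mu)
signs = np.where(labels == 0, 1.0, -1.0)[:, None]
X = signs * mu + sigma * rng.normal(size=(2 * n, d))

# Symmetrically normalized adjacency with self-loops (the standard GCN operator).
deg = A.sum(1) + 1.0
dinv = 1.0 / np.sqrt(deg)
S = dinv[:, None] * (A + np.eye(2 * n)) * dinv[None, :]

def gap_and_noise(H):
    """Between-class distance of the class means (signal) vs. within-class spread (noise)."""
    m0, m1 = H[labels == 0].mean(0), H[labels == 1].mean(0)
    centered = H - np.where(labels[:, None] == 0, m0, m1)
    return np.linalg.norm(m0 - m1), np.linalg.norm(centered) / np.sqrt(2 * n)

# Repeated graph convolution: the denoising effect shrinks within-class spread
# first; eventually the mixing effect also collapses the between-class gap.
gaps, noises = [], []
H = X.copy()
for k in range(16):
    if k:
        H = S @ H
    g, nz = gap_and_noise(H)
    gaps.append(g)
    noises.append(nz)

# Personalized PageRank (PPR) propagation, approximated by power iteration:
# Z <- (1 - alpha) * S @ Z + alpha * X keeps an alpha-weighted copy of the raw
# features at every step, so the class gap does not vanish as depth grows.
alpha = 0.1
Z = X.copy()
for _ in range(50):
    Z = (1 - alpha) * (S @ Z) + alpha * X
ppr_gap, ppr_noise = gap_and_noise(Z)
```

Inspecting `noises` and `gaps` layer by layer shows the within-class spread dropping sharply over the first few convolutions while the between-class gap decays geometrically, which is the transition the paper analyzes; the PPR fixed point retains a constant fraction of the raw features, which is why its class gap stays bounded away from zero even at large effective depth.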


Related research

- How Powerful is Implicit Denoising in Graph Neural Networks (09/29/2022)
- RAN-GNNs: breaking the capacity limits of graph neural networks (03/29/2021)
- Deep Graph Neural Networks with Shallow Subgraph Samplers (12/02/2020)
- Graph Neural Networks with Precomputed Node Features (06/01/2022)
- Agent-based Graph Neural Networks (06/22/2022)
- Effects of Graph Convolutions in Deep Networks (04/20/2022)
- Untrained Graph Neural Networks for Denoising (09/24/2021)
