Not too little, not too much: a theoretical analysis of graph (over)smoothing

05/24/2022
by Nicolas Keriven, et al.

We analyze graph smoothing with mean aggregation, where each node successively receives the average of the features of its neighbors. It was observed early on that Graph Neural Networks (GNNs), which generally follow some variant of Message-Passing (MP) with repeated aggregation, may be subject to the oversmoothing phenomenon: after too many rounds of MP, the node features converge to a non-informative limit; for mean aggregation on a connected graph, they become constant across the whole graph. At the other end of the spectrum, some MP rounds are intuitively necessary, but existing analyses do not exhibit both phenomena at once: beneficial “finite” smoothing followed by oversmoothing in the limit. In this paper, we consider simplified linear GNNs and rigorously analyze two examples in which a finite number of mean aggregation steps provably improves the learning performance before oversmoothing kicks in. We consider a latent space random graph model in which the node features are partial observations of the latent variables and the graph encodes pairwise relationships between them. We show that graph smoothing restores some of the lost information, up to a certain point, via two phenomena: it shrinks non-principal directions in the data faster than principal ones, which is useful for regression, and it shrinks nodes within communities faster than the communities collapse together, which improves classification.
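To make the mean-aggregation dynamic concrete, here is a minimal numpy sketch, not the paper's code: the two-community stochastic block model (with illustrative parameters p_in, p_out) and the one-dimensional noisy features are hypothetical stand-ins for the paper's latent space model. It tracks the within-community variance and the gap between community means as smoothing proceeds.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative two-community random graph (stochastic block model);
# parameters are assumptions for the sketch, not the paper's setup.
n = 200                                    # nodes per community
p_in, p_out = 0.10, 0.01                   # edge probabilities
labels = np.repeat([0, 1], n)
probs = np.where(labels[:, None] == labels[None, :], p_in, p_out)
A = (rng.random((2 * n, 2 * n)) < probs).astype(float)
A = np.triu(A, 1)
A = A + A.T                                # symmetric, no self-loops

# Mean aggregation: W = D^{-1} A, so each smoothing step replaces a
# node's feature by the average of its neighbors' features.
deg = A.sum(1, keepdims=True)
W = A / np.maximum(deg, 1.0)

# Node features: noisy one-dimensional observations of the community.
X = labels[:, None] + rng.normal(0.0, 1.0, (2 * n, 1))

for step in range(30):
    within = np.mean([X[labels == c].var() for c in (0, 1)])
    between = (X[labels == 0].mean() - X[labels == 1].mean()) ** 2
    if step in (0, 2, 5, 10, 29):
        print(f"step {step:2d}: within-community var {within:.4f}, "
              f"between-community gap {between:.4f}")
    X = W @ X                              # one round of mean aggregation
```

Running the sketch, the within-community variance typically shrinks faster than the between-community gap over the first few steps, so a moderate amount of smoothing makes the classes easier to separate; after many steps both quantities vanish and all features collapse toward a constant, which is the oversmoothing regime the paper analyzes.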

Related research

03/20/2022 · LEReg: Empower Graph Neural Networks with Local Energy Regularization
Researches on analyzing graphs with Graph Neural Networks (GNNs) have be...

08/31/2023 · Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks
Our study reveals new theoretical insights into over-smoothing and featu...

05/27/2021 · On the Universality of Graph Neural Networks on Large Random Graphs
We study the approximation power of Graph Neural Networks (GNNs) on late...

05/27/2022 · Personalized PageRank Graph Attention Networks
There has been a rising interest in graph neural networks (GNNs) for rep...

12/07/2020 · NCGNN: Node-level Capsule Graph Neural Network
Message passing has evolved as an effective tool for designing Graph Neu...

02/17/2023 · G-Signatures: Global Graph Propagation With Randomized Signatures
Graph neural networks (GNNs) have evolved into one of the most popular d...

08/06/2022 · Oversquashing in GNNs through the lens of information contraction and graph expansion
The quality of signal propagation in message-passing graph neural networ...
