Rank Collapse Causes Over-Smoothing and Over-Correlation in Graph Neural Networks

08/31/2023
by Andreas Roth, et al.

Our study reveals new theoretical insights into over-smoothing and feature over-correlation in deep graph neural networks. We show the prevalence of invariant subspaces, which exhibit a fixed relative behavior that is unaffected by feature transformations. Our work clarifies recent observations on convergence to a constant state and a potential over-separation of node states, since the amplification of subspaces depends only on the spectrum of the aggregation function. In the linear case, this causes node representations to be dominated by a low-dimensional subspace at an asymptotic convergence rate that is independent of the feature transformations. The result is a rank collapse of the node representations, which manifests as over-smoothing when smooth vectors span this subspace, and as over-correlation even when over-smoothing is avoided. Guided by our theory, we propose a sum of Kronecker products as a beneficial property that provably prevents over-smoothing, over-correlation, and rank collapse. We empirically extend our insights to the non-linear case, demonstrating the inability of existing models to capture linearly independent features.
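
The linear behavior described in the abstract can be illustrated in a few lines: repeatedly applying a normalized aggregation followed by a feature transformation drives the node features toward a low-dimensional subspace, visible as a shrinking numerical rank and a rising correlation between feature channels. The sketch below is illustrative only; the random graph, the GCN-style symmetric normalization, and the per-layer random weight matrices are assumptions made for this demo and are not the authors' code or experimental setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed toy setup: random undirected graph with self-loops.
n, d, layers = 100, 16, 30
adj = (rng.random((n, n)) < 0.05).astype(float)
adj = np.maximum(adj, adj.T)        # symmetrize
np.fill_diagonal(adj, 1.0)          # add self-loops

# GCN-style symmetric normalization: A_hat = D^{-1/2} A D^{-1/2}.
deg_inv_sqrt = 1.0 / np.sqrt(adj.sum(axis=1))
a_hat = deg_inv_sqrt[:, None] * adj * deg_inv_sqrt[None, :]

x = rng.standard_normal((n, d))
for k in range(1, layers + 1):
    w = rng.standard_normal((d, d)) / np.sqrt(d)   # fresh feature transformation each layer
    x = a_hat @ x @ w                              # linear message-passing update
    x = x / np.linalg.norm(x)                      # rescale; only the relative behavior matters
    if k % 10 == 0:
        # Track rank collapse and over-correlation of the feature columns.
        rank = np.linalg.matrix_rank(x, tol=1e-6)
        corr = np.corrcoef(x, rowvar=False)
        mean_abs_corr = np.abs(corr[np.triu_indices(d, k=1)]).mean()
        print(f"layer {k:2d}: numerical rank = {rank:2d}, mean |column corr| = {mean_abs_corr:.3f}")
```

As depth grows, the feature columns become increasingly aligned regardless of the choice of weight matrices, which is the rank collapse the paper analyzes; the proposed remedy replaces the single per-layer transformation with an update whose vectorized form is a sum of Kronecker products, which can preserve linearly independent feature directions.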

