On the Bottleneck of Graph Neural Networks and its Practical Implications

06/09/2020
by   Uri Alon, et al.

Graph neural networks (GNNs) have been shown to learn effectively from highly structured data containing elements (nodes) with relationships (edges) between them. GNN variants differ in how each node in the graph absorbs the information flowing from its neighbor nodes. In this paper, we highlight an inherent problem in GNNs: the mechanism of propagating information between neighbors creates a bottleneck when every node aggregates messages from its neighbors. This bottleneck causes the over-squashing of exponentially growing information into fixed-size vectors. As a result, the GNN fails to propagate messages flowing from distant nodes and performs poorly when the prediction task depends on long-range information. We demonstrate that the bottleneck hinders popular GNNs from fitting the training data. We show that GNNs that absorb incoming edges equally, such as GCN and GIN, are more susceptible to over-squashing than other GNN types. We further show that existing, extensively tuned, GNN-based models suffer from over-squashing, and that breaking the bottleneck improves state-of-the-art results without any hyperparameter tuning or additional weights.
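The over-squashing effect the abstract describes can be illustrated with a minimal sketch (not taken from the paper; it assumes a GIN-style sum aggregation with weights and nonlinearities omitted, since they do not change the shapes involved). On a binary tree, the root's receptive field roughly doubles with every message-passing layer, while its representation remains a single fixed-size vector:

```python
import numpy as np

def gin_layer(h, adj):
    """One GIN-style message-passing step: every node sums its own and
    its neighbors' vectors into a single fixed-size representation.
    (Learned weights/nonlinearity omitted; they do not change shapes.)"""
    return (adj + np.eye(adj.shape[0])) @ h

# Complete binary tree with 15 nodes (depth 3): a stand-in for the
# exponentially growing neighborhoods that cause over-squashing.
n, d = 15, 8
adj = np.zeros((n, n))
for child in range(1, n):
    parent = (child - 1) // 2
    adj[parent, child] = adj[child, parent] = 1.0

h = np.random.randn(n, d)  # fixed-size node states
reach = np.eye(n)          # reach[i, j] != 0  <=>  node j influences node i
for k in range(1, 4):
    h = gin_layer(h, adj)
    reach = gin_layer(reach, adj)
    print(f"after layer {k}: root receives from "
          f"{np.count_nonzero(reach[0])} nodes; state is still {h[0].shape}")
```

The root absorbs information from 3, then 7, then all 15 nodes, yet each step compresses that growing set of messages into the same `d`-dimensional vector, which is the bottleneck the paper analyzes.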


