Addressing Over-Smoothing in Graph Neural Networks via Deep Supervision

02/25/2022
by Pantelis Elinas, et al.

Learning useful node and graph representations with graph neural networks (GNNs) is a challenging task. It is known that deep GNNs suffer from over-smoothing where, as the number of layers increases, node representations become nearly indistinguishable and model performance on the downstream task degrades significantly. To address this problem, we propose deeply-supervised GNNs (DSGNNs), i.e., GNNs enhanced with deep supervision where representations learned at all layers are used for training. We show empirically that DSGNNs are resilient to over-smoothing and can outperform competitive benchmarks on node and graph property prediction problems.
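The core idea behind deep supervision is to attach an auxiliary prediction head to every GNN layer and train on the combined per-layer losses, so that early layers receive a direct learning signal even when the deepest representations start to over-smooth. The abstract does not fix a specific architecture, so the sketch below is only illustrative: the dense normalized adjacency, the linear per-layer heads, and the loss that simply sums cross-entropy over layers are assumptions, not the authors' exact DSGNN implementation.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DeeplySupervisedGCN(nn.Module):
    """Sketch of a GCN-style network with an auxiliary classifier per layer.

    Hypothetical illustration of deep supervision; not the authors' code.
    """

    def __init__(self, in_dim, hidden_dim, num_classes, num_layers=8):
        super().__init__()
        dims = [in_dim] + [hidden_dim] * num_layers
        # One graph-convolution weight matrix per layer.
        self.layers = nn.ModuleList(
            [nn.Linear(dims[i], dims[i + 1]) for i in range(num_layers)]
        )
        # Deep supervision: one prediction head per layer, not just at the end.
        self.heads = nn.ModuleList(
            [nn.Linear(dims[i + 1], num_classes) for i in range(num_layers)]
        )

    def forward(self, x, adj_norm):
        """x: (N, in_dim) node features; adj_norm: (N, N) normalized adjacency."""
        per_layer_logits = []
        h = x
        for conv, head in zip(self.layers, self.heads):
            h = F.relu(conv(adj_norm @ h))    # aggregate neighbours, then transform
            per_layer_logits.append(head(h))  # auxiliary prediction at this depth
        return per_layer_logits


def deep_supervision_loss(per_layer_logits, labels, train_mask):
    """Sum the node-classification loss over the predictions of every layer."""
    return sum(
        F.cross_entropy(logits[train_mask], labels[train_mask])
        for logits in per_layer_logits
    )
```

At inference time one would typically read predictions from the final head (or average over heads); the abstract does not specify which variant the authors use, so treat that as an open design choice in this sketch.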

Related research

- Reducing Over-smoothing in Graph Neural Networks Using Relational Embeddings (01/07/2023)
- Dirichlet Energy Constrained Learning for Deep Graph Neural Networks (07/06/2021)
- A Survey on Oversmoothing in Graph Neural Networks (03/20/2023)
- Self-supervised Smoothing Graph Neural Networks (09/02/2020)
- ωGNNs: Deep Graph Neural Networks Enhanced by Multiple Propagation Operators (10/31/2022)
- PointSpectrum: Equivariance Meets Laplacian Filtering for Graph Representation Learning (09/06/2021)
- Scaling Up, Scaling Deep: Blockwise Graph Contrastive Learning (06/03/2023)
