ωGNNs: Deep Graph Neural Networks Enhanced by Multiple Propagation Operators

10/31/2022
by Moshe Eliasof, et al.

Graph Neural Networks (GNNs) are limited in their propagation operators. These operators often contain only non-negative elements and are shared across channels and layers, restricting the expressiveness of GNNs. Moreover, some GNNs suffer from over-smoothing, which limits their depth. Convolutional Neural Networks (CNNs), by contrast, can learn diverse propagation filters, and phenomena like over-smoothing are typically not apparent in CNNs. In this paper, we bridge this gap by incorporating trainable channel-wise weighting factors ω to learn and mix multiple smoothing and sharpening propagation operators at each layer. Our generic method is called ωGNN, and we study two variants: ωGCN and ωGAT. For ωGCN, we theoretically analyse its behaviour and the impact of ω on the obtained node features. Our experiments confirm these findings, demonstrating that both variants do not over-smooth and explaining why. Additionally, we experiment with 15 real-world datasets on node- and graph-classification tasks, where our ωGCN and ωGAT perform better than or on par with state-of-the-art methods.
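The core idea of channel-wise mixing can be illustrated with a minimal sketch. The abstract does not give the exact formulation, so the operator below is an assumption: per channel j, it blends the identity (no propagation) with a normalized-adjacency smoothing step via a learnable scalar omega_j, where omega_j in (0, 1] smooths features and omega_j < 0 sharpens them. All names (`omega_gcn_layer`, `A_hat`) are illustrative, not the authors' API.

```python
def omega_gcn_layer(H, A_hat, omega):
    """Hypothetical sketch of one omega-GCN-style layer (not the paper's exact method).

    H:     list of n rows, each with c channel values (node features)
    A_hat: n x n normalized adjacency, acting as a smoothing operator
    omega: c per-channel mixing weights

    Per channel j, applies S_j = (1 - omega_j) * I + omega_j * A_hat,
    i.e. H'[:, j] = (1 - omega_j) * H[:, j] + omega_j * (A_hat @ H)[:, j].
    """
    n, c = len(H), len(H[0])
    # One shared smoothing pass: smooth = A_hat @ H
    smooth = [[sum(A_hat[i][k] * H[k][j] for k in range(n)) for j in range(c)]
              for i in range(n)]
    # Channel-wise blend between the original and smoothed features
    return [[(1.0 - omega[j]) * H[i][j] + omega[j] * smooth[i][j]
             for j in range(c)]
            for i in range(n)]
```

With omega_j = 1 a channel is fully smoothed, and with omega_j = 0 it is left untouched, so a deep stack of such layers need not drive all channels toward the same value, which is the intuition behind avoiding over-smoothing.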


Related research

02/25/2022 · Addressing Over-Smoothing in Graph Neural Networks via Deep Supervision
Learning useful node and graph representations with graph neural network...

03/20/2023 · A Survey on Oversmoothing in Graph Neural Networks
Node features of graph neural networks (GNNs) tend to become more simila...

02/17/2023 · G-Signatures: Global Graph Propagation With Randomized Signatures
Graph neural networks (GNNs) have evolved into one of the most popular d...

12/27/2021 · Learn Layer-wise Connections in Graph Neural Networks
In recent years, Graph Neural Networks (GNNs) have shown superior perfor...

12/05/2022 · Understanding the Relationship between Over-smoothing and Over-squashing in Graph Neural Networks
Graph Neural Networks (GNNs) have been successfully applied in many appl...

03/06/2023 · Graph Positional Encoding via Random Feature Propagation
Two main families of node feature augmentation schemes have been explore...

10/18/2021 · Channel redundancy and overlap in convolutional neural networks with channel-wise NNK graphs
Feature spaces in the deep layers of convolutional neural networks (CNNs...
