RAN-GNNs: breaking the capacity limits of graph neural networks

03/29/2021
by Diego Valsesia, et al.

Graph neural networks have become a staple in problems involving learning and analysis of data defined over graphs. However, several results suggest an inherent difficulty in extracting better performance simply by increasing the number of layers. Recent works attribute this to a phenomenon peculiar to the extraction of node features in graph-based tasks, i.e., the need to consider multiple neighborhood sizes at the same time and to tune them adaptively. In this paper, we investigate the recently proposed randomly wired architectures in the context of graph neural networks. Instead of building deeper networks by stacking many layers, we prove that employing a randomly wired architecture can be a more effective way to increase the capacity of the network and obtain richer representations. We show that such architectures behave like an ensemble of paths, which are able to merge contributions from receptive fields of varied size. Moreover, these receptive fields can also be modulated to be wider or narrower through the trainable weights over the paths. We also provide extensive experimental evidence of the superior performance of randomly wired architectures over multiple tasks and four graph convolution definitions, using recent benchmarking frameworks that address the reliability of previous testing methodologies.
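The abstract describes a randomly wired network as an ensemble of paths through a random directed acyclic architecture graph, where a path of length k yields a k-hop receptive field and merge nodes blend paths of different lengths. A minimal sketch of that idea, in pure Python — the function names, the mean-aggregation "convolution", and the uniform merge weights (trainable in the paper) are all illustrative assumptions, not the paper's implementation:

```python
import random

def gnn_layer(features, adj):
    """One hop of mean aggregation over the data graph (self + neighbours)."""
    out = []
    for v, neigh in enumerate(adj):
        vals = [features[v]] + [features[u] for u in neigh]
        out.append(sum(vals) / len(vals))
    return out

def random_dag(num_nodes, p, seed=0):
    """Random architecture DAG: edge i -> j (i < j) with probability p."""
    rng = random.Random(seed)
    preds = {j: [i for i in range(j) if rng.random() < p]
             for j in range(1, num_nodes)}
    for j in range(1, num_nodes):
        if not preds[j]:
            preds[j] = [j - 1]  # fall back to a chain edge so j is reachable
    return preds

def randomly_wired_gnn(features, adj, num_nodes=6, p=0.4, seed=0):
    """Run the data-graph features through a randomly wired stack of layers.

    Each architecture node mixes its predecessors' outputs (uniform weights
    here; trainable per-path weights in the paper) and applies one GNN layer,
    so the final node aggregates paths of many different lengths, i.e.
    receptive fields of varied size.
    """
    preds = random_dag(num_nodes, p, seed)
    outputs = {0: gnn_layer(features, adj)}
    for j in range(1, num_nodes):
        w = 1.0 / len(preds[j])
        mixed = [sum(w * outputs[i][v] for i in preds[j])
                 for v in range(len(features))]
        outputs[j] = gnn_layer(mixed, adj)
    return outputs[num_nodes - 1]

# Toy data graph: a path 0-1-2-3, with a unit feature on node 0.
adj = [[1], [0, 2], [1, 3], [2]]
feats = [1.0, 0.0, 0.0, 0.0]
print(randomly_wired_gnn(feats, adj))
```

Because short and long paths through the DAG are summed at each merge node, the final representation mixes shallow (narrow receptive field) and deep (wide receptive field) views of the same data graph; making the merge weights trainable is what lets the model widen or narrow the effective receptive field.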
