On the Universality of Graph Neural Networks on Large Random Graphs

05/27/2021
by Nicolas Keriven, et al.

We study the approximation power of Graph Neural Networks (GNNs) on latent position random graphs. In the large-graph limit, GNNs are known to converge to certain "continuous" models called c-GNNs, which directly enables a study of their approximation power on random graph models. However, in the absence of input node features, just as GNNs are limited by the Weisfeiler-Lehman isomorphism test, c-GNNs are severely limited on simple random graph models; for instance, they fail to distinguish the communities of a well-separated Stochastic Block Model (SBM) with constant degree function. We therefore consider recently proposed architectures that augment GNNs with unique node identifiers, referred to here as Structural GNNs (SGNNs). We study the convergence of SGNNs to their continuous counterparts (c-SGNNs) in the large random graph limit, under new conditions on the node identifiers. We then show that c-SGNNs are strictly more powerful than c-GNNs in the continuous limit, and prove their universality on several random graph models of interest, including most SBMs and a large class of random geometric graphs. Our results cover both permutation-invariant and permutation-equivariant architectures.
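
To make the limitation concrete, below is a minimal sketch (not taken from the paper; all parameter values and the mean-aggregation rule are illustrative choices, and only NumPy is assumed). On a balanced two-block SBM with constant expected degree, message passing started from constant node features collapses every node to the same embedding, whereas random node identifiers, in the spirit of the SGNN-style architectures described above, break the symmetry and let the same aggregation recover the communities.

```python
# Illustrative sketch (not the paper's construction): mean-aggregation message
# passing on a balanced two-block SBM with constant expected degree.
import numpy as np

rng = np.random.default_rng(0)

# --- Balanced two-community SBM with constant expected degree ---------------
n = 1000                                  # number of nodes (illustrative)
labels = np.repeat([0, 1], n // 2)        # ground-truth communities
p, q = 0.10, 0.02                         # within- / between-community edge probabilities

P = np.where(labels[:, None] == labels[None, :], p, q)
A = (rng.random((n, n)) < P).astype(float)
A = np.triu(A, 1)
A = A + A.T                               # symmetric adjacency, no self-loops

deg = A.sum(1, keepdims=True)
deg[deg == 0] = 1.0
A_mean = A / deg                          # row-normalised (mean aggregation)

def propagate(x, layers=3):
    """Linear mean-aggregation message passing (no learned weights, for illustration)."""
    for _ in range(layers):
        x = A_mean @ x
    return x

# 1) Constant input features: every node ends up with the same embedding.
h_const = propagate(np.ones((n, 1)))
print("constant features, spread across nodes:", np.ptp(h_const))   # ~0

# 2) Random node identifiers: the same aggregation now reveals the communities.
h_rand = propagate(rng.standard_normal((n, 8)))
# Crude read-out: split along the top principal direction of the embeddings.
u = np.linalg.svd(h_rand - h_rand.mean(0), full_matrices=False)[2][0]
scores = h_rand @ u
pred = (scores > np.median(scores)).astype(int)
acc = max((pred == labels).mean(), (pred != labels).mean())
print("random identifiers, community recovery accuracy:", acc)      # well above 0.5
```

The constant-feature run has zero spread across nodes, while the random-identifier run recovers the two communities well above chance, mirroring (in a toy setting) the gap between c-GNNs and c-SGNNs discussed in the abstract.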
