The Surprising Power of Graph Neural Networks with Random Node Initialization

10/02/2020
by Ralph Abboud, et al.

Graph neural networks (GNNs) are effective models for representation learning on graph-structured data. However, standard GNNs are limited in their expressive power, as they cannot distinguish graphs beyond the capability of the Weisfeiler-Leman (1-WL) graph isomorphism heuristic. This limitation motivated a large body of work, including higher-order GNNs, which are provably more powerful models. To date, higher-order invariant and equivariant networks are the only models with known universality results, but these results are practically hindered by prohibitive computational complexity. Thus, despite their limitations, standard GNNs are commonly used, due to their strong practical performance. In practice, GNNs have shown promising performance when enhanced with random node initialization (RNI), where the idea is to train and run the models with randomized initial node features. In this paper, we analyze the expressive power of GNNs with RNI, and pose the following question: are GNNs with RNI more expressive than standard GNNs? We prove that this is indeed the case, by showing that GNNs with RNI are universal, a first such result for GNNs not relying on computationally demanding higher-order properties. We then empirically analyze the effect of RNI on GNNs, based on carefully constructed datasets. Our empirical findings support the superior performance of GNNs with RNI over standard GNNs. In fact, we demonstrate that the performance of GNNs with RNI is often comparable with or better than that of higher-order GNNs, while keeping the much lower memory requirements of standard GNNs. However, this improvement typically comes at the cost of slower model convergence. Somewhat surprisingly, we find that both the convergence rate and the accuracy of the models can be improved by using only a partial random initialization regime.
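The core idea of RNI, as described above, is to augment each node's initial feature vector with randomly sampled dimensions, resampled at every forward pass during training and inference. A minimal sketch of this preprocessing step follows; the function names, the Gaussian sampling, and the zero-padding choice for non-randomized nodes under partial RNI are illustrative assumptions, not the paper's exact implementation.

```python
import numpy as np

def with_rni(x, num_random_dims, rng=None):
    """Full RNI: append `num_random_dims` randomly sampled features
    to every node's feature vector.

    x: (num_nodes, num_features) node feature matrix.
    Returns a (num_nodes, num_features + num_random_dims) matrix.
    """
    rng = np.random.default_rng() if rng is None else rng
    random_feats = rng.standard_normal((x.shape[0], num_random_dims))
    return np.concatenate([x, random_feats], axis=1)

def with_partial_rni(x, fraction, num_random_dims, rng=None):
    """Partial RNI (illustrative): randomize the appended dimensions
    for only a `fraction` of the nodes; the remaining nodes receive
    zeros in those dimensions.
    """
    rng = np.random.default_rng() if rng is None else rng
    num_nodes = x.shape[0]
    random_feats = np.zeros((num_nodes, num_random_dims))
    chosen = rng.choice(num_nodes, size=int(fraction * num_nodes),
                        replace=False)
    random_feats[chosen] = rng.standard_normal(
        (chosen.size, num_random_dims))
    return np.concatenate([x, random_feats], axis=1)
```

Because the random features are resampled on every pass, the model must learn predictions that are invariant to the particular draw, which is what drives the universality argument while keeping memory costs at the level of a standard GNN.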


