Distance Encoding: Design Provably More Powerful Graph Neural Networks for Structural Representation Learning

08/31/2020
by   Pan Li, et al.

Learning structural representations of node sets from graph-structured data is crucial for applications ranging from node-role discovery to link prediction and molecule classification. Graph Neural Networks (GNNs) have achieved great success in structural representation learning. However, most GNNs are limited by the 1-Weisfeiler-Lehman (WL) test and may therefore generate identical representations for structures and graphs that are actually different. More powerful GNNs, recently proposed by mimicking higher-order WL tests, focus only on entire-graph representations and cannot exploit the sparsity of the graph structure for computational efficiency. Here we propose a general class of structure-related features, termed Distance Encoding (DE), to assist GNNs in representing node sets of arbitrary size with strictly more expressive power than the 1-WL test. DE essentially captures the distance between the node set whose representation is to be learnt and each node in the graph, and includes important graph-related measures such as shortest-path distance and generalized PageRank scores. We propose two general frameworks for GNNs to use DEs: (1) as extra node attributes, and (2) additionally as controllers of message aggregation in GNNs. Both frameworks can still exploit the sparse graph structure to remain scalable on large graphs. In theory, we prove that these two frameworks can distinguish node sets embedded in almost all regular graphs, where traditional GNNs always fail. We also rigorously analyze their limitations. Empirically, we evaluate these two frameworks on node structural-role prediction, link prediction and triangle prediction over six real networks. The results show that our models outperform GNNs without DEs by up to 15% and also outperform other SOTA baselines particularly designed for those tasks.
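To make the core idea concrete, the following stand-alone Python sketch computes one simple DE variant: the minimum shortest-path distance from a target node set S to every node in the graph, one-hot encoded so it can be appended to node features. The function names, the min-over-set aggregation, and the distance cap are illustrative assumptions for this sketch, not the paper's exact formulation (which also covers generalized PageRank scores and set-level aggregations).

```python
from collections import deque

def shortest_path_distances(adj, sources):
    """BFS distances from a node set S to every node.

    `adj` is an adjacency list {node: [neighbors]}; `sources` is the
    node set S whose representation is to be learnt.  Running one BFS
    seeded with all of S yields, for each node u, the minimum
    shortest-path distance min_{v in S} SPD(u, v).  Unreachable nodes
    keep distance -1 and are mapped to a dedicated bucket below.
    """
    dist = {v: -1 for v in adj}
    queue = deque()
    for s in sources:
        dist[s] = 0
        queue.append(s)
    while queue:
        u = queue.popleft()
        for w in adj[u]:
            if dist[w] == -1:
                dist[w] = dist[u] + 1
                queue.append(w)
    return dist

def distance_encoding(adj, sources, max_dist=3):
    """One-hot encode the min-SPD to S, capping distances at max_dist;
    a final extra bucket absorbs unreachable nodes.  The resulting
    vectors can be concatenated to the input node attributes."""
    dist = shortest_path_distances(adj, sources)
    encoding = {}
    for v, d in dist.items():
        vec = [0] * (max_dist + 2)
        idx = max_dist + 1 if d < 0 else min(d, max_dist)
        vec[idx] = 1
        encoding[v] = vec
    return encoding
```

For link prediction, S would be the candidate node pair, so every node receives a feature describing how far it sits from that pair; this is what breaks the symmetry that defeats 1-WL-bounded GNNs on regular graphs.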


Related research

11/22/2020: Revisiting graph neural networks and distance encoding in a practical view
06/20/2022: Two-Dimensional Weisfeiler-Lehman Graph Neural Networks for Link Prediction
10/02/2020: The Surprising Power of Graph Neural Networks with Random Node Initialization
06/09/2023: Path Neural Networks: Expressive and Accurate Graph Neural Networks
04/21/2023: What Do GNNs Actually Learn? Towards Understanding their Representations
10/28/2022: Generalized Laplacian Positional Encoding for Graph Representation Learning
12/22/2021: Investigating Neighborhood Modeling and Asymmetry Preservation in Digraph Representation Learning
