Distance Encoding – Design Provably More Powerful Graph Neural Networks for Structural Representation Learning

08/31/2020
by   Pan Li, et al.

Learning structural representations of node sets from graph-structured data is crucial for applications ranging from node-role discovery to link prediction and molecule classification. Graph Neural Networks (GNNs) have achieved great success in structural representation learning. However, most GNNs are limited by the 1-Weisfeiler-Lehman (WL) test and may therefore generate identical representations for structures and graphs that are actually different. More powerful GNNs, proposed recently by mimicking higher-order WL tests, focus only on entire-graph representations and cannot exploit the sparsity of the graph structure to be computationally efficient. Here we propose a general class of structure-related features, termed Distance Encoding (DE), that assists GNNs in representing node sets of arbitrary size with strictly more expressive power than the 1-WL test. DE essentially captures the distance between the node set whose representation is to be learnt and each node in the graph, and includes important graph-related measures such as shortest-path distance and generalized PageRank scores. We propose two general frameworks for GNNs to use DEs: (1) as extra node attributes, and (2) further as controllers of message aggregation in GNNs. Both frameworks can still exploit the sparse graph structure to remain scalable to large graphs. In theory, we prove that these two frameworks can distinguish node sets embedded in almost all regular graphs, where traditional GNNs always fail. We also rigorously analyze their limitations. Empirically, we evaluate these two frameworks on node structural role prediction, link prediction, and triangle prediction over six real networks. The results show that our models outperform GNNs without DEs by up to 15% and also significantly outperform other SOTA baselines designed specifically for those tasks.
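As a concrete illustration of the first framework (DE as extra node attributes), the sketch below computes one-hot-encoded, truncated shortest-path distances from every node in the graph to a target node set and returns them as a feature matrix that could be concatenated with ordinary node features before running a GNN. This is a minimal sketch under assumed design choices (the truncation threshold, the one-hot encoding, and the helper name distance_encoding are illustrative), not the authors' implementation; see the distance-encoding repository listed below for the official code.

import networkx as nx
import numpy as np

def distance_encoding(G, target_set, max_dist=3):
    """Return a |V| x (len(target_set) * (max_dist + 2)) DE feature matrix.

    For every node u and every node s in the target set S, the shortest-path
    distance d(u, s) is truncated at max_dist (unreachable or farther nodes
    fall into an extra bucket) and one-hot encoded.
    """
    nodes = list(G.nodes())
    index = {v: i for i, v in enumerate(nodes)}
    n_buckets = max_dist + 2  # distances 0..max_dist plus a "far/unreachable" bucket
    features = np.zeros((len(nodes), len(target_set) * n_buckets))

    for j, s in enumerate(target_set):
        # Shortest-path distances from s, computed only up to the cutoff.
        dist = nx.single_source_shortest_path_length(G, s, cutoff=max_dist)
        for v in nodes:
            d = dist.get(v, max_dist + 1)  # truncate / mark as unreachable
            features[index[v], j * n_buckets + d] = 1.0
    return features

# Usage: encode distances to a candidate link (u, v) = (0, 33) before feeding
# the resulting features into a GNN alongside the original node attributes.
G = nx.karate_club_graph()
de = distance_encoding(G, target_set=[0, 33], max_dist=3)
print(de.shape)  # (34, 10)

The same interface could accommodate the other distance measures named in the abstract, such as generalized PageRank scores, by swapping out the shortest-path computation.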

Related Research

11/22/2020

Revisiting graph neural networks and distance encoding in a practical view

Graph neural networks (GNNs) are widely used in the applications based o...
06/20/2022

Two-Dimensional Weisfeiler-Lehman Graph Neural Networks for Link Prediction

Link prediction is one important application of graph neural networks (G...
10/02/2020

The Surprising Power of Graph Neural Networks with Random Node Initialization

Graph neural networks (GNNs) are effective models for representation lea...
11/18/2019

GraLSP: Graph Neural Networks with Local Structural Patterns

It is not until recently that graph neural networks (GNNs) are adopted t...
10/28/2022

Generalized Laplacian Positional Encoding for Graph Representation Learning

Graph neural networks (GNNs) are the primary tool for processing graph-s...
06/10/2021

GraphiT: Encoding Graph Structure in Transformers

We show that viewing graphs as sets of node features and incorporating s...
12/22/2021

Investigating Neighborhood Modeling and Asymmetry Preservation in Digraph Representation Learning

Graph Neural Networks (GNNs) traditionally exhibit poor performance for ...

Code Repositories

distance-encoding

Distance Encoding for GNN Design

