How Powerful are K-hop Message Passing Graph Neural Networks

05/26/2022
by   Jiarui Feng, et al.

The most popular design paradigm for Graph Neural Networks (GNNs) is 1-hop message passing – repeatedly aggregating features from a node's 1-hop neighbors. However, the expressive power of 1-hop message passing is bounded by the Weisfeiler-Lehman (1-WL) test. Recently, researchers extended 1-hop message passing to K-hop message passing, in which a node aggregates information from all of its neighbors within K hops simultaneously. Yet the expressive power of K-hop message passing has not been analyzed. In this work, we theoretically characterize it. Specifically, we first formally differentiate two kinds of kernels for K-hop message passing, which are often conflated in previous works. We then show that K-hop message passing is strictly more powerful than 1-hop message passing. Despite this higher expressive power, K-hop message passing still cannot distinguish some simple regular graphs. To further enhance its expressive power, we introduce the KP-GNN framework, which improves K-hop message passing by leveraging the peripheral subgraph information in each hop. We prove that KP-GNN can distinguish almost all regular graphs, including some distance regular graphs that cannot be distinguished by previous distance encoding methods. Experimental results verify the expressive power and effectiveness of KP-GNN, which achieves competitive results across all benchmark datasets.
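To make the core idea concrete, the following is a minimal sketch of one layer of K-hop message passing under the shortest-path-distance kernel (one of the two kernels the paper distinguishes). All names are illustrative, the graph is a plain adjacency dict, and the per-hop aggregation and combine functions are toy sums rather than the learned functions a real GNN would use:

```python
from collections import deque

def khop_neighbors(adj, v, K):
    """Return {k: nodes at shortest-path distance exactly k from v}, for k = 1..K."""
    dist = {v: 0}
    hops = {k: set() for k in range(1, K + 1)}
    q = deque([v])
    while q:
        u = q.popleft()
        if dist[u] == K:          # no need to expand beyond K hops
            continue
        for w in adj[u]:
            if w not in dist:     # first visit = shortest-path distance
                dist[w] = dist[u] + 1
                hops[dist[w]].add(w)
                q.append(w)
    return hops

def khop_message_passing(adj, feats, K):
    """One toy layer: each node sums neighbor features within each hop,
    then combines the hop-wise aggregates with its own feature (here: by addition)."""
    new_feats = {}
    for v in adj:
        hops = khop_neighbors(adj, v, K)
        hop_agg = [sum(feats[w] for w in hops[k]) for k in range(1, K + 1)]
        new_feats[v] = feats[v] + sum(hop_agg)
    return new_feats
```

With K=1 this reduces to ordinary 1-hop message passing; with K>1 each node sees strictly more of the graph per layer, which is the starting point of the paper's expressiveness analysis.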


Related research:

- Improving the Expressive Power of Graph Neural Network with Tinhofer Algorithm (04/05/2021)
- Weisfeiler and Lehman Go Topological: Message Passing Simplicial Networks (03/04/2021)
- Understanding over-squashing and bottlenecks on graphs via curvature (11/29/2021)
- Provably Powerful Graph Networks (05/27/2019)
- Beyond 1-WL with Local Ego-Network Encodings (11/27/2022)
- Provably Powerful Graph Neural Networks for Directed Multigraphs (06/20/2023)
- From Node Interaction to Hop Interaction: New Effective and Scalable Graph Learning Paradigm (11/21/2022)
