Distance-Restricted Folklore Weisfeiler-Leman GNNs with Provable Cycle Counting Power

09/10/2023
by Junru Zhou, et al.

The ability of graph neural networks (GNNs) to count certain graph substructures, especially cycles, is important for the success of GNNs on a wide range of tasks, and it has recently become a popular metric for evaluating the expressive power of GNNs. Many of the proposed GNN models with provable cycle counting power are based on subgraph GNNs, i.e., extracting a bag of subgraphs from the input graph, generating representations for each subgraph, and using them to augment the representation of the input graph. However, these methods require heavy preprocessing and suffer from high time and memory costs. In this paper, we overcome these limitations of subgraph GNNs by proposing a novel class of GNNs, d-Distance-Restricted FWL(2) GNNs, or d-DRFWL(2) GNNs. d-DRFWL(2) GNNs use node pairs whose mutual distance is at most d as the units for message passing, in order to balance expressive power and complexity. By performing message passing among distance-restricted node pairs in the original graph, d-DRFWL(2) GNNs avoid the expensive subgraph extraction operations of subgraph GNNs, lowering both time and space complexity. We theoretically show that the discriminative power of d-DRFWL(2) GNNs strictly increases as d increases. More importantly, d-DRFWL(2) GNNs have provably strong cycle counting power even with d=2: they can count all 3-, 4-, 5-, and 6-cycles. Since 6-cycles (e.g., benzene rings) are ubiquitous in organic molecules, being able to detect and count them is crucial for achieving robust and generalizable performance on molecular tasks. Experiments on both synthetic and molecular datasets verify our theory. To the best of our knowledge, our model is the most efficient GNN model to date (both theoretically and empirically) that can count up to 6-cycles.
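To make the idea concrete, below is a minimal sketch (not the authors' implementation) of message passing over distance-restricted node pairs with d=2. The helper names `distance_restricted_pairs` and `drfwl_layer`, the one-hot distance initialization, and the toy multiplicative update over common intermediate nodes are illustrative assumptions, not the paper's actual update rules.

```python
# A minimal, self-contained sketch of distance-restricted pair message passing.
# Assumption: pairs (u, v) with shortest-path distance <= d are the message-passing
# units; the aggregation below is a simplified stand-in for the FWL(2)-style update.

import networkx as nx
import numpy as np

def distance_restricted_pairs(G, d):
    """Return a dict mapping each ordered pair (u, v) with distance <= d to that distance.
    Distance-0 pairs (u, u) represent the nodes themselves."""
    pairs = {}
    for u, lengths in nx.all_pairs_shortest_path_length(G, cutoff=d):
        for v, dist in lengths.items():
            pairs[(u, v)] = dist
    return pairs

def drfwl_layer(G, pairs, h):
    """One simplified update: the feature of (u, v) aggregates over intermediate nodes w
    such that both (u, w) and (w, v) are themselves distance-restricted pairs."""
    h_new = {}
    for (u, v) in pairs:
        msg = np.zeros_like(h[(u, v)])
        for w in G.nodes:
            if (u, w) in pairs and (w, v) in pairs:
                msg += h[(u, w)] * h[(w, v)]   # toy multiplicative message
        h_new[(u, v)] = h[(u, v)] + msg        # residual-style update
    return h_new

# Usage on a small graph: a 6-cycle (a benzene-like ring), with d = 2.
G = nx.cycle_graph(6)
d = 2
pairs = distance_restricted_pairs(G, d)
# Initialize each pair feature as a one-hot encoding of its distance (0, 1, or 2).
h = {p: np.eye(d + 1)[dist] for p, dist in pairs.items()}
h = drfwl_layer(G, pairs, h)
print(len(pairs), "distance-restricted pairs; example feature:", h[(0, 2)])
```

Because only pairs within distance d of each other are ever materialized, rather than one extracted subgraph per node, this kind of scheme keeps both memory and time below those of subgraph GNNs, mirroring the complexity argument in the abstract.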
