
Improving the Expressive Power of Graph Neural Network with Tinhofer Algorithm

by   Alan J. X. Guo, et al.

In recent years, Graph Neural Networks (GNNs) have progressed rapidly owing to their power in processing graph-structured data. Most GNNs follow a message passing scheme, and their expressive power is mathematically bounded by the discriminative ability of the Weisfeiler-Lehman (WL) test. Following Tinhofer's research on compact graphs, we propose a variation of the message passing scheme, called the Weisfeiler-Lehman-Tinhofer GNN (WLT-GNN), that theoretically breaks through the limitation of the WL test. In addition, we conduct comparative experiments and ablation studies on several well-known datasets. The results show that the proposed method achieves comparable performance and better expressive power on these datasets.
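To make the WL-test bound concrete, the following is a minimal sketch of 1-dimensional WL color refinement (the test that limits standard message passing GNNs), together with a classic pair of non-isomorphic graphs it cannot distinguish. The function names and graph encodings here are illustrative, not from the paper.

```python
from collections import Counter

def wl_colors(adj, rounds=3):
    """1-dimensional Weisfeiler-Lehman color refinement.

    adj: dict mapping each node to a list of its neighbors.
    Returns the multiset of final node colors as a Counter.
    Different multisets prove non-isomorphism; equal multisets are
    inconclusive -- exactly the limit that bounds the expressive
    power of message passing GNNs.
    """
    colors = {v: 0 for v in adj}  # start from a uniform coloring
    for _ in range(rounds):
        # Each node's new color combines its old color with the
        # sorted multiset of its neighbors' colors (message passing).
        signatures = {
            v: (colors[v], tuple(sorted(colors[u] for u in adj[v])))
            for v in adj
        }
        # Compress signatures back to small integer colors.
        palette = {s: i for i, s in enumerate(sorted(set(signatures.values())))}
        colors = {v: palette[signatures[v]] for v in adj}
    return Counter(colors.values())

# A 6-cycle and two disjoint triangles are both 2-regular, so 1-WL
# gives every node the same color and cannot tell them apart.
cycle6 = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
triangles = {0: [1, 2], 1: [0, 2], 2: [0, 1],
             3: [4, 5], 4: [3, 5], 5: [3, 4]}
print(wl_colors(cycle6) == wl_colors(triangles))  # True: 1-WL fails here
```

Any GNN whose layers only aggregate neighbor messages inherits this failure case, which is why schemes such as the proposed WLT-GNN must add information beyond plain neighborhood aggregation to exceed the 1-WL limit.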

