Rethinking the Expressive Power of GNNs via Graph Biconnectivity

01/23/2023
by   Bohang Zhang, et al.

Designing expressive Graph Neural Networks (GNNs) is a central topic in learning graph-structured data. While numerous approaches have been proposed to improve GNNs in terms of the Weisfeiler-Lehman (WL) test, generally there is still a lack of deep understanding of what additional power they can systematically and provably gain. In this paper, we take a fundamentally different perspective to study the expressive power of GNNs beyond the WL test. Specifically, we introduce a novel class of expressivity metrics via graph biconnectivity and highlight their importance in both theory and practice. As biconnectivity can be easily calculated using simple algorithms that have linear computational costs, it is natural to expect that popular GNNs can learn it easily as well. However, after a thorough review of prior GNN architectures, we surprisingly find that most of them are not expressive for any of these metrics. The only exception is the ESAN framework (Bevilacqua et al., 2022), for which we give a theoretical justification of its power. We proceed to introduce a principled and more efficient approach, called the Generalized Distance Weisfeiler-Lehman (GD-WL), which is provably expressive for all biconnectivity metrics. Practically, we show GD-WL can be implemented by a Transformer-like architecture that preserves expressiveness and enjoys full parallelizability. A set of experiments on both synthetic and real datasets demonstrates that our approach can consistently outperform prior GNN architectures.
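To make the biconnectivity claim concrete, here is a minimal, hedged sketch (not the authors' code). Cut vertices and cut edges can be extracted in linear time with standard DFS-based routines, for which NetworkX already provides articulation_points and bridges, and one GD-WL refinement step can be read as hashing, for each node, the multiset of (distance, colour) pairs over all nodes, using shortest-path distance as one possible choice of generalized distance. The example graph and the gd_wl_step helper are illustrative assumptions, not taken from the paper.

```python
# A minimal sketch, assuming an undirected NetworkX graph and using
# shortest-path distance as one instance of a generalized distance.
import networkx as nx
from collections import Counter

# Biconnectivity metrics: cut vertices and cut edges (bridges).
# Both are computable in linear time by a single DFS (Tarjan-style).
G = nx.Graph([(0, 1), (1, 2), (2, 0), (2, 3), (3, 4)])  # a triangle with a tail
cut_vertices = set(nx.articulation_points(G))  # nodes 2 and 3
cut_edges = set(nx.bridges(G))                 # the bridge edges 2-3 and 3-4

# One GD-WL refinement step (a hedged reading of the abstract): each node's
# new colour hashes the multiset of (distance, old colour) pairs taken over
# all nodes of the graph.
def gd_wl_step(G, colours, dist):
    new_colours = {}
    for v in G.nodes:
        multiset = Counter((dist[v][u], colours[u]) for u in G.nodes)
        new_colours[v] = hash(frozenset(multiset.items()))
    return new_colours

dist = dict(nx.all_pairs_shortest_path_length(G))  # generalized distance choice
colours = {v: 0 for v in G.nodes}                  # uniform initial colouring
for _ in range(3):
    colours = gd_wl_step(G, colours, dist)
```

The explicit Python loop is only for exposition; as the abstract notes, the same refinement can be realized by a Transformer-like architecture that preserves expressiveness while remaining fully parallelizable.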

