Weisfeiler–Lehman goes Dynamic: An Analysis of the Expressive Power of Graph Neural Networks for Attributed and Dynamic Graphs

10/08/2022
by Silvia Beddar-Wiesing, et al.

Graph Neural Networks (GNNs) are a large class of relational models for graph processing. Recent theoretical studies on the expressive power of GNNs have focused on two issues. On the one hand, it has been proven that GNNs are as powerful as the Weisfeiler-Lehman test (1-WL) in their ability to distinguish graphs; moreover, it has been shown that the equivalence enforced by 1-WL coincides with unfolding equivalence. On the other hand, GNNs have turned out to be universal approximators on graphs modulo the constraints enforced by 1-WL/unfolding equivalence. However, these results apply only to Static Undirected Homogeneous Graphs with node attributes. In contrast, real-life applications often involve a variety of graph properties, such as dynamics or node and edge attributes. In this paper, we conduct a theoretical analysis of the expressive power of GNNs for two graph types of particular interest. Dynamic graphs are widely used in modern applications, and their theoretical analysis requires new approaches. The attributed type acts as a standard form for all graph types, since it has been shown that every graph type can be transformed without loss of information into a Static Undirected Homogeneous Graph with attributes on nodes and edges (SAUHG). The study considers generic GNN models and proposes appropriate 1-WL tests for those domains. The results on the expressive power of GNNs are then extended by proving that GNNs have the same capability as the 1-WL test in distinguishing dynamic and attributed graphs, that the 1-WL equivalence coincides with unfolding equivalence, and that GNNs are universal approximators modulo 1-WL/unfolding equivalence. Moreover, the approximation result holds for SAUHGs, which cover most of the graphs used in practical applications, and its proof is constructive, allowing one to deduce hints on the architecture of GNNs that can achieve the desired approximation accuracy.
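To make the 1-WL test referenced above concrete, here is a minimal sketch of 1-WL colour refinement on a static undirected graph with node and edge attributes (the SAUHG setting). The graph encoding and the helper names (wl_refinement, wl_histogram) are illustrative assumptions rather than the paper's construction; in particular, Python hashing is used only as a convenient relabelling function and ignores the possibility of hash collisions.

```python
from collections import Counter


def wl_refinement(nodes, edges, node_attr, edge_attr, max_iters=None):
    """1-WL colour refinement for an undirected graph with node/edge attributes.

    nodes:     iterable of node ids
    edges:     iterable of 2-element frozensets {u, v} (undirected, no self-loops)
    node_attr: dict mapping node -> hashable attribute
    edge_attr: dict mapping frozenset({u, v}) -> hashable attribute
    """
    # Build the adjacency list.
    adj = {v: [] for v in nodes}
    for e in edges:
        u, v = tuple(e)
        adj[u].append(v)
        adj[v].append(u)

    # Initial colour: the node attribute itself.
    colour = {v: hash(("init", node_attr[v])) for v in adj}

    iters = max_iters if max_iters is not None else len(adj)
    for _ in range(iters):
        new_colour = {}
        for v in adj:
            # Multiset of (edge attribute, neighbour colour) pairs,
            # the attributed analogue of the classic 1-WL neighbour multiset.
            neigh = Counter(
                (edge_attr[frozenset((v, u))], colour[u]) for u in adj[v]
            )
            new_colour[v] = hash((colour[v], frozenset(neigh.items())))
        # Stop once the colour partition no longer refines.
        if len(set(new_colour.values())) == len(set(colour.values())):
            return new_colour
        colour = new_colour
    return colour


def wl_histogram(colour):
    # Histogram of stable colours, used to compare two graphs.
    return Counter(colour.values())
```

Two attributed graphs whose stable colour histograms differ are distinguished by this test, and the paper's results state that GNNs can distinguish exactly the same pairs of graphs.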

Related research

05/29/2019
On the equivalence between graph isomorphism testing and function approximation with GNNs

05/27/2019
Incidence Networks for Geometric Deep Learning

06/16/2021
A unifying point of view on expressive power of GNNs

04/06/2022
Graph Neural Networks Designed for Different Graph Types: A Survey

06/30/2023
Generalization Limits of Graph Neural Networks in Identity Effects Learning

12/14/2020
Breaking the Expressive Bottlenecks of Graph Neural Networks

05/31/2023
Explanations as Features: LLM-Based Features for Text-Attributed Graphs
