Walk Message Passing Neural Networks and Second-Order Graph Neural Networks

06/16/2020
by Floris Geerts, et al.

The expressive power of message passing neural networks (MPNNs) is known to match the expressive power of the 1-dimensional Weisfeiler-Leman (1-WL) graph isomorphism test. To boost the expressive power of MPNNs, a number of graph neural network architectures have recently been proposed based on higher-dimensional Weisfeiler-Leman tests. In this paper we consider the 2-dimensional Weisfeiler-Leman (2-WL) test and introduce a new type of MPNN, referred to as ℓ-walk MPNNs, which aggregate features along walks of length ℓ between vertices. We show that 2-walk MPNNs match 2-WL in expressive power. More generally, ℓ-walk MPNNs, for any ℓ ≥ 2, are shown to match the expressive power of the recently introduced ℓ-walk refinement procedure (W[ℓ]). Based on a correspondence between 2-WL and W[ℓ], we observe that ℓ-walk MPNNs and 2-walk MPNNs have the same expressive power, i.e., they can distinguish the same pairs of graphs, but ℓ-walk MPNNs can possibly distinguish pairs of graphs faster than 2-walk MPNNs. Turning to concrete learnable graph neural network (GNN) formalisms that match 2-WL or W[ℓ] in expressive power, we consider second-order graph neural networks that allow for non-linear layers. In particular, to match W[ℓ] in expressive power, we allow ℓ−1 matrix multiplications in each layer. We propose different versions of second-order GNNs depending on the type of features (i.e., coming from a countable set or from an uncountable set), as this affects the number of dimensions needed to represent the features. Our results indicate that increasing the non-linearity of layers by allowing multiple matrix multiplications does not increase expressive power; at best, it leads to a faster distinction of input graphs.
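To make the architectural idea concrete, the following is a minimal sketch, in NumPy, of one layer of a second-order GNN on vertex-pair features that performs ℓ−1 matrix multiplications per layer, in the spirit of the W[ℓ] walk refinement described above. It is not the paper's exact construction: the function name second_order_layer, the weight matrices W1 and W2, the channel-wise einsum contraction, and the ReLU non-linearity are assumptions made for this illustration.

import numpy as np

def second_order_layer(X, W1, W2, ell=2):
    # X: (n, n, d) tensor of vertex-pair features.
    # Linear term: per-pair feature transform.
    linear = X @ W1
    # Walk term: contract the pair tensor with itself (ell - 1) times over the
    # shared vertex index, channel-wise, loosely analogous to aggregating
    # features along walks of length ell between every pair of vertices.
    walk = X
    for _ in range(ell - 1):
        walk = np.einsum('ikd,kjd->ijd', walk, X)
    walk = walk @ W2
    # Point-wise non-linearity (ReLU).
    return np.maximum(linear + walk, 0.0)

# Toy usage: random pair features for a 4-vertex graph.
rng = np.random.default_rng(0)
X = rng.standard_normal((4, 4, 3))            # 4 vertices, 3-dimensional pair features
W1 = rng.standard_normal((3, 3))
W2 = rng.standard_normal((3, 3))
out = second_order_layer(X, W1, W2, ell=3)    # two matrix multiplications in this layer
print(out.shape)                              # (4, 4, 3)

Setting ell=2 reproduces the standard single matrix multiplication per layer; larger ell adds more multiplications, which, per the result above, should not increase expressive power but may distinguish graphs in fewer layers.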


