Walk Message Passing Neural Networks and Second-Order Graph Neural Networks

06/16/2020
by Floris Geerts, et al.

The expressive power of message passing neural networks (MPNNs) is known to match that of the 1-dimensional Weisfeiler-Leman (1-WL) graph isomorphism test. To boost the expressive power of MPNNs, a number of graph neural network architectures have recently been proposed based on higher-dimensional Weisfeiler-Leman tests. In this paper we consider the 2-dimensional (2-WL) test and introduce a new type of MPNNs, referred to as ℓ-walk MPNNs, which aggregate features along walks of length ℓ between vertices. We show that 2-walk MPNNs match 2-WL in expressive power. More generally, ℓ-walk MPNNs, for any ℓ ≥ 2, are shown to match the expressive power of the recently introduced ℓ-walk refinement procedure (W[ℓ]). Based on a correspondence between W[ℓ] and 2-WL, we observe that ℓ-walk MPNNs and 2-walk MPNNs have the same expressive power, i.e., they distinguish the same pairs of graphs, although ℓ-walk MPNNs may distinguish pairs of graphs faster than 2-walk MPNNs. When it comes to concrete, learnable graph neural network (GNN) formalisms that match 2-WL or W[ℓ] in expressive power, we consider second-order GNNs that allow for non-linear layers. In particular, to match W[ℓ] in expressive power, we allow ℓ−1 matrix multiplications in each layer. We propose different versions of second-order GNNs depending on whether the features come from a countable or an uncountable set, since this affects the number of dimensions needed to represent them. Our results indicate that increasing the non-linearity of layers by allowing multiple matrix multiplications does not increase expressive power; at best, it leads to a faster distinction of input graphs.
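To make the layer structure concrete, here is a minimal NumPy sketch of one second-order GNN layer on vertex-pair features, in which walks of length ℓ are aggregated by chaining ℓ−1 channel-wise matrix multiplications. This is an illustrative assumption about the architecture's shape, not the paper's exact construction; the function name second_order_layer and the weights W1, W2, W3 are hypothetical.

import numpy as np

def relu(x):
    # Elementwise non-linearity applied at the end of the layer.
    return np.maximum(x, 0.0)

def second_order_layer(F, W1, W2, W3, ell=2):
    # Hypothetical sketch of one second-order GNN layer.
    # F : (n, n, d) tensor of vertex-pair features F[v, w].
    # W1, W2, W3 : (d, d) learnable weight matrices (illustrative names).
    # ell : walk length; the layer chains ell - 1 matrix multiplications,
    #       so ell = 2 performs a single matrix product.
    linear = F @ W1   # pointwise linear transform of each pair feature
    walk = F @ W2     # features entering the walk aggregation
    acc = walk
    for _ in range(ell - 1):
        # Channel-wise matrix product over the vertex dimensions:
        # the contraction over u sums over intermediate vertices,
        # i.e., it aggregates features along walks v -> u -> w.
        acc = np.einsum('iud,ujd->ijd', acc, walk)
    return relu(linear + acc @ W3)

# Example: a layer matching W[3] performs ell - 1 = 2 multiplications.
n, d = 5, 8
rng = np.random.default_rng(0)
F = rng.normal(size=(n, n, d))
W1, W2, W3 = (0.1 * rng.normal(size=(d, d)) for _ in range(3))
print(second_order_layer(F, W1, W2, W3, ell=3).shape)  # (5, 5, 8)

The sketch rests on the observation that matrix multiplication realizes walk aggregation: the (v, w) entry of a product sums over all intermediate vertices u, so chaining ℓ−1 products collects features along all walks of length ℓ from v to w.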
