From Graph Low-Rank Global Attention to 2-FWL Approximation

06/14/2020
by Omri Puny, et al.

Graph Neural Networks (GNNs) are known to have an expressive power bounded by that of the vertex coloring algorithm (Xu et al., 2019a; Morris et al., 2018). However, for rich node features, such a bound does not exist and GNNs can be shown to be universal, namely, to have the theoretical ability to approximate arbitrary graph functions. It is well known, however, that expressive power alone does not imply good generalization. In an effort to improve the generalization of GNNs, we suggest the Low-Rank Global Attention (LRGA) module, which leverages the efficiency of low-rank matrix-vector multiplication and improves the algorithmic alignment (Xu et al., 2019b) of GNNs with the 2-folklore Weisfeiler-Lehman (2-FWL) algorithm; 2-FWL is a graph isomorphism algorithm that is strictly more powerful than vertex coloring. Concretely, we: (i) formulate 2-FWL using polynomial kernels; (ii) show that LRGA aligns with this 2-FWL formulation; and (iii) bound the sample complexity of the kernel's feature map when learned with a randomly initialized two-layer MLP. The latter means the generalization error can be made arbitrarily small when training LRGA to learn the 2-FWL algorithm. From a practical point of view, augmenting existing GNN layers with LRGA produces state-of-the-art results on most datasets in a standard GNN benchmark.
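To illustrate the low-rank global attention idea described in the abstract, here is a minimal PyTorch sketch (not the authors' reference implementation). The module names, the rank parameter `k`, and the normalization choice are illustrative assumptions; the key point it shows is that global attention can be applied as U (Vᵀ W), costing O(n·k·d) per graph instead of materializing an n×n attention matrix.

```python
import torch
import torch.nn as nn


class LowRankGlobalAttention(nn.Module):
    """Hypothetical low-rank global attention sketch for a single graph."""

    def __init__(self, d_in: int, k: int):
        super().__init__()
        # Per-node maps into a rank-k space (illustrative two-layer MLPs).
        self.u = nn.Sequential(nn.Linear(d_in, k), nn.ReLU(), nn.Linear(k, k))
        self.v = nn.Sequential(nn.Linear(d_in, k), nn.ReLU(), nn.Linear(k, k))
        self.w = nn.Sequential(nn.Linear(d_in, k), nn.ReLU(), nn.Linear(k, k))
        self.t = nn.Sequential(nn.Linear(d_in, k), nn.ReLU(), nn.Linear(k, k))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (n, d_in) node features.
        u, v, w, t = self.u(x), self.v(x), self.w(x), self.t(x)
        # Low-rank "attention": associate the product so the (n, n)
        # matrix u @ v.T is never formed explicitly.
        attn = u @ (v.transpose(0, 1) @ w)                        # (n, k)
        # Row normalization, loosely analogous to a softmax denominator.
        denom = u @ v.sum(dim=0, keepdim=True).transpose(0, 1)    # (n, 1)
        attn = attn / (denom + 1e-6)
        # Concatenate with the input and a local term, since the module
        # is meant to augment, not replace, an existing GNN layer.
        return torch.cat([x, attn, t], dim=-1)                    # (n, d_in + 2k)
```

A usage note under the same assumptions: the output dimension grows to d_in + 2k, so a layer placed after this module would take that as its input width.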
