Technical report: Graph Neural Networks go Grammatical

03/02/2023
by Jason Piquenot, et al.

This paper proposes a new GNN design strategy based on Context-Free Grammars (CFGs) that generate the matrix language MATLANG. The strategy makes it possible to ensure WL expressive power, substructure counting abilities, and spectral properties at the same time. Applying it, we design the Grammatical Graph Neural Network G^2N^2, a provably 3-WL GNN able to count cycles of length up to 6 at edge level and to realize band-pass filters. A large set of experiments covering these properties corroborates the theoretical results.
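To make the MATLANG vocabulary and the notion of edge-level counting concrete, the sketch below evaluates a few standard MATLANG operations (transpose, diagonal, matrix product, pointwise product, trace) on an adjacency matrix and computes the classical edge-level triangle count A^2 ⊙ A. This is only an illustrative NumPy sketch of the ingredients the strategy builds on, not the G^2N^2 architecture itself; the function names are hypothetical and the paper's CFG-derived layers and learnable parameters are omitted.

```python
import numpy as np

def matlang_ops(A: np.ndarray) -> dict:
    """Evaluate a handful of MATLANG operations on an adjacency matrix A.
    Illustrative only; not the paper's layer definition."""
    ones = np.ones((A.shape[0], 1))           # 1(.): all-ones vector
    return {
        "transpose": A.T,                      # (.)^T
        "diag": np.diag(np.diag(A)),           # diag(.): keep the diagonal
        "matmul": A @ A,                       # matrix product
        "hadamard": A * A,                     # pointwise (Schur) product
        "trace": np.trace(A @ A @ A),          # tr(.): here tr(A^3)
        "ones": ones,
    }

def edge_level_triangles(A: np.ndarray) -> np.ndarray:
    """(A^2 ⊙ A)[i, j] = number of triangles containing edge (i, j)."""
    return (A @ A) * A

if __name__ == "__main__":
    # 4-cycle with one chord (0, 2): two triangles sharing that chord.
    A = np.array([[0, 1, 1, 1],
                  [1, 0, 1, 0],
                  [1, 1, 0, 1],
                  [1, 0, 1, 0]])
    print(edge_level_triangles(A))        # entry (0, 2) equals 2
    print(np.trace(A @ A @ A) // 6)       # global triangle count: tr(A^3)/6 = 2
```

Edge-level counts of longer cycles (up to length 6 in the paper) follow the same pattern of combining matrix products and pointwise products, which is what the CFG-based construction formalizes.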
