Invariant Layers for Graphs with Nodes of Different Types

02/27/2023
by Dmitry Rybin, et al.

Neural networks that satisfy invariance with respect to input permutations have been widely studied in the machine learning literature. However, in many applications, only a subset of all input permutations is of interest. For heterogeneous graph data, one can focus on permutations that preserve node types. We fully characterize linear layers invariant to such permutations. We verify experimentally that implementing these layers in graph neural network architectures allows learning important node interactions more effectively than existing techniques. We show that the dimension of the space of these layers is given by a generalization of Bell numbers, extending the work of Maron et al. (2019). We further narrow the invariant network design space by addressing a question about the sizes of tensor layers necessary for function approximation on graph data. Our findings suggest that function approximation on a graph with n nodes can be done with tensors of size ≤ n, which is tighter than the best-known bound ≤ n(n-1)/2. For d × d image data with translation symmetry, our methods give a tight upper bound of 2d - 1 (instead of d^4) on the sizes of invariant tensor generators via a surprising connection to Davenport constants.
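The dimension result has a concrete combinatorial reading: the space of linear functionals on order-k index tensors that are invariant under a permutation group has dimension equal to the number of orbits of that group acting on index k-tuples, and for permutations that preserve node types the group is a product of symmetric groups, one per type. The sketch below is an illustration of this orbit-counting view only (it is not code from the paper); the function name and toy sizes are made up, and with a single node type and n ≥ k it recovers the Bell numbers of Maron et al. (2019).

```python
# Minimal sketch (assumption: illustrative only, not the paper's code).
# Counts orbits of the node-type-preserving permutation group
# S_{n_1} x ... x S_{n_t} acting on index k-tuples; this orbit count is
# the dimension of the space of invariant linear functionals on order-k
# tensors over the nodes.
from itertools import permutations, product

def invariant_layer_dim(type_sizes, k):
    """Number of orbits of the type-preserving group on index k-tuples."""
    # Consecutive blocks of node ids, one block per node type.
    blocks, start = [], 0
    for size in type_sizes:
        blocks.append(list(range(start, start + size)))
        start += size
    nodes = [i for block in blocks for i in block]

    # The group: permute each type's block of nodes independently.
    group = []
    for per_block in product(*(permutations(b) for b in blocks)):
        sigma = {}
        for block, perm in zip(blocks, per_block):
            sigma.update(zip(block, perm))
        group.append(sigma)

    # Count orbits of the coordinate-wise action on k-tuples of node ids.
    seen, orbits = set(), 0
    for idx in product(nodes, repeat=k):
        if idx in seen:
            continue
        orbits += 1
        seen.update(tuple(sigma[i] for i in idx) for sigma in group)
    return orbits

# A single node type with n >= k recovers the Bell numbers:
print(invariant_layer_dim([3], 2))     # 2 = Bell(2)
print(invariant_layer_dim([4], 3))     # 5 = Bell(3)
# Two node types enlarge the invariant basis (a generalized Bell count):
print(invariant_layer_dim([2, 2], 2))  # 6
```

For example, with two node types and order-2 tensors the count is 6 rather than Bell(2) = 2, since each equality pattern of indices is further refined by the types involved; this is the sense in which node types generalize the Bell-number count.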

Related research

Invariant and Equivariant Graph Networks (12/24/2018)
Invariant and equivariant networks have been successfully used for learn...

Implementing graph neural networks with TensorFlow-Keras (03/07/2021)
Graph neural networks are a versatile machine learning architecture that...

Invariant polynomials and machine learning (04/26/2021)
We present an application of invariant polynomials in machine learning. ...

Graph Neural Network Bandits (07/13/2022)
We consider the bandit optimization problem with the reward function def...

A Short Tutorial on The Weisfeiler-Lehman Test And Its Variants (01/18/2022)
Graph neural networks are designed to learn functions on graphs. Typical...

Generalization of an Upper Bound on the Number of Nodes Needed to Achieve Linear Separability (02/10/2018)
An important issue in neural network research is how to choose the numbe...

Symmetry-driven graph neural networks (05/28/2021)
Exploiting symmetries and invariance in data is a powerful, yet not full...
