How Jellyfish Characterise Alternating Group Equivariant Neural Networks

01/24/2023
by   Edward Pearce-Crump, et al.

We provide a full characterisation of all possible alternating group (A_n) equivariant neural networks whose layers are some tensor power of ℝ^n. In particular, we find a basis of matrices for the learnable, linear, A_n-equivariant layer functions between such tensor power spaces in the standard basis of ℝ^n. We also describe how our approach generalises to the construction of neural networks that are equivariant to local symmetries.
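To illustrate the equivariance condition the abstract refers to, here is a minimal sketch (not the paper's jellyfish construction) for the simplest case of order-one tensors: a linear map W on ℝ^n is group equivariant if it commutes with the group's action, i.e. W(Px) = P(Wx) for every permutation matrix P in the group. The layer below uses the well-known two-element basis {I, J} (identity and all-ones matrix) for S_n-equivariant maps ℝ^n → ℝ^n; any such map is in particular A_n-equivariant, while the paper's contribution is the full, generally larger, A_n basis for arbitrary tensor powers. The coefficients a and b are arbitrary illustrative choices.

```python
import itertools
import numpy as np

n = 4
rng = np.random.default_rng(0)

# Candidate equivariant layer W = a*I + b*J, with J the all-ones matrix.
# (a, b are hypothetical learnable weights; any values keep W equivariant.)
a, b = 1.5, -0.3
I = np.eye(n)
J = np.ones((n, n))
W = a * I + b * J

def is_even(perm):
    # A permutation is even (lies in A_n) iff its inversion count is even.
    inv = sum(1 for i in range(len(perm))
              for j in range(i + 1, len(perm))
              if perm[i] > perm[j])
    return inv % 2 == 0

x = rng.standard_normal(n)
for perm in itertools.permutations(range(n)):
    if not is_even(perm):
        continue  # restrict the check to the alternating group A_n
    P = np.eye(n)[list(perm)]  # permutation matrix for this permutation
    # Equivariance: applying W after the group action equals acting after W.
    assert np.allclose(W @ (P @ x), P @ (W @ x))

print("W = a*I + b*J is A_n-equivariant on R^n")
```

The same commutation condition, imposed between k-th and l-th tensor powers of ℝ^n, is what the paper's basis of matrices solves in general.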


