Equivariant and Invariant Reynolds Networks

10/15/2021
by Akiyoshi Sannai, et al.

Invariant and equivariant networks are useful for learning data with symmetry, including images, sets, point clouds, and graphs. In this paper, we consider invariant and equivariant networks for symmetries of finite groups. Such networks have been constructed by various researchers using Reynolds operators. However, Reynolds operators are computationally expensive when the order of the group is large because they sum over the whole group, which poses an implementation difficulty. To overcome this difficulty, we represent the Reynolds operator as a sum over a subset instead of a sum over the whole group. We call such a subset a Reynolds design, and we call an operator defined by a sum over a Reynolds design a reductive Reynolds operator. For example, for a graph with n nodes, the computational complexity of the reductive Reynolds operator is O(n^2), while that of the full Reynolds operator is O(n!). We construct learning models based on the reductive Reynolds operator, called equivariant and invariant Reynolds networks (ReyNets), and prove that they have the universal approximation property. Reynolds designs for equivariant ReyNets are derived from combinatorial observations with Young diagrams, while Reynolds designs for invariant ReyNets are derived from invariants called Reynolds dimensions defined on the set of invariant polynomials. Numerical experiments show that the performance of our models is comparable to state-of-the-art methods.
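To make the core idea concrete, the sketch below contrasts the full Reynolds operator, an average of a function over every element of the symmetric group S_n, with a reduced sum over a small subset of permutations. This is only a minimal illustration of the averaging idea, not the paper's implementation: the function names `reynolds` and `reduced_reynolds`, the toy feature map `f`, and the choice of cyclic shifts as the subset are illustrative assumptions; the paper derives proper Reynolds designs from Young diagrams and invariant theory.

```python
# Minimal sketch (not the authors' code) of group averaging vs. a reduced sum.
import itertools
import numpy as np

def f(x: np.ndarray) -> float:
    """Toy non-symmetric feature map on a length-n vector (illustrative)."""
    return float(np.dot(x, np.arange(len(x))))

def reynolds(f, x: np.ndarray) -> float:
    """Full Reynolds operator: average f over all n! permutations of x."""
    perms = list(itertools.permutations(range(len(x))))
    return sum(f(x[list(p)]) for p in perms) / len(perms)  # O(n!) terms

def reduced_reynolds(f, x: np.ndarray, design) -> float:
    """Reduced Reynolds operator: average f over a chosen subset of
    permutations (a 'Reynolds design' in the paper's terminology)."""
    return sum(f(x[list(p)]) for p in design) / len(design)

x = np.array([3.0, 1.0, 2.0, 5.0])
# Example subset: the n cyclic shifts -- purely illustrative, O(n) terms.
n = len(x)
cyclic = [tuple((i + k) % n for i in range(n)) for k in range(n)]
print(reynolds(f, x), reduced_reynolds(f, x, cyclic))
```

The output of the full operator is invariant under any permutation of the input by construction; the point of a Reynolds design is to retain that invariance (or equivariance) while summing over far fewer group elements.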
