Equivariance with Learned Canonicalization Functions

11/11/2022
by Sékou-Oumar Kaba, et al.

Symmetry-based neural networks often constrain the architecture in order to achieve invariance or equivariance to a group of transformations. In this paper, we propose an alternative that avoids this architectural constraint by learning to produce a canonical representation of the data. These canonicalization functions can readily be plugged into non-equivariant backbone architectures, and we offer explicit ways to implement them for many groups of interest. We show that this approach enjoys universality while providing interpretable insights. Our main hypothesis is that training a neural network to perform canonicalization is better than relying on predefined heuristics. Our results confirm that learning the canonicalization function leads to better results and that the approach achieves excellent performance in practice.
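In spirit, the approach works like this: a canonicalization function predicts a group element from the input, the inverse of that element maps the input to a canonical pose, and any unconstrained backbone is then applied to the result. The sketch below is a minimal NumPy illustration for 2D rotations of a point cloud; `predict_angle` and `backbone` are hypothetical stand-ins (a fixed centroid heuristic and a linear readout), not the paper's learned architectures.

```python
import numpy as np

def predict_angle(points):
    """Stand-in for a learned canonicalization network (hypothetical).

    Returns the angle of the centroid; because the centroid rotates with
    the input, this prediction is equivariant, which is what makes the
    resulting canonical pose invariant to input rotations."""
    m = points.mean(axis=0)
    return np.arctan2(m[1], m[0])

def canonicalize(points, predict_fn):
    """Apply the inverse of the predicted group element to reach a canonical pose."""
    theta = predict_fn(points)
    c, s = np.cos(-theta), np.sin(-theta)
    rot_inv = np.array([[c, -s],
                        [s,  c]])
    return points @ rot_inv.T

def backbone(points):
    """Arbitrary non-equivariant readout, standing in for any backbone network."""
    return float(2.0 * points[:, 0].sum() + points[:, 1].sum())

def invariant_model(points):
    """Canonicalize, then apply the unconstrained backbone.

    The composition is rotation-invariant even though the backbone is not."""
    return backbone(canonicalize(points, predict_angle))
```

The key requirement is that the canonicalization function transforms consistently with the input (here, rotating the point cloud shifts the predicted angle by the same amount), so every rotated copy of an input lands on the same canonical pose before the backbone sees it.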



Related research

02/01/2023 · Generative Adversarial Symmetry Discovery
Despite the success of equivariant neural networks in scientific applica...

10/11/2022 · Architectural Optimization over Subgroups for Equivariant Neural Networks
Incorporating equivariance to symmetry groups as a constraint during neu...

04/20/2021 · Neural Networks for Learning Counterfactual G-Invariances from Single Environments
Despite – or maybe because of – their astonishing capacity to fit data, ...

05/30/2022 · Testing for Geometric Invariance and Equivariance
Invariant and equivariant models incorporate the symmetry of an object t...

06/07/2021 · Encoding Involutory Invariance in Neural Networks
In certain situations, Neural Networks (NN) are trained upon data that o...

09/14/2021 · Nonlinearities in Steerable SO(2)-Equivariant CNNs
Invariance under symmetry is an important problem in machine learning. O...
