Architectural Optimization over Subgroups for Equivariant Neural Networks

10/11/2022
by   Kaitlin Maile, et al.

Incorporating equivariance to symmetry groups as a constraint during neural network training can improve performance and generalization for tasks exhibiting those symmetries, but such symmetries are often not perfectly nor explicitly present. This motivates algorithmically optimizing the architectural constraints imposed by equivariance. We propose the equivariance relaxation morphism, which preserves functionality while reparameterizing a group equivariant layer to operate with equivariance constraints on a subgroup, as well as the [G]-mixed equivariant layer, which mixes layers constrained to different groups to enable within-layer equivariance optimization. We further present evolutionary and differentiable neural architecture search (NAS) algorithms that utilize these mechanisms respectively for equivariance-aware architectural optimization. Experiments across a variety of datasets show the benefit of dynamically constrained equivariance to find effective architectures with approximate equivariance.
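The two mechanisms in the abstract can be illustrated with a minimal NumPy sketch. This is a hypothetical toy example, not the authors' implementation: the C4-equivariant branch is realized by symmetrizing a kernel over 90-degree rotations, the trivial-group branch uses the kernel unconstrained, and a [G]-mixed layer combines the two with softmax mixing weights (`alpha` is an assumed name for the architecture parameters).

```python
import numpy as np

def corr2d(x, k):
    """'Same'-padded 2D cross-correlation on a square, single-channel input."""
    kh, kw = k.shape
    ph, pw = kh // 2, kw // 2
    xp = np.pad(x, ((ph, ph), (pw, pw)))
    out = np.empty_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = np.sum(xp[i:i + kh, j:j + kw] * k)
    return out

def c4_symmetrize(k):
    """Average a kernel over the four 90-degree rotations; correlating with
    the result commutes with C4 rotations of a square input."""
    return np.mean([np.rot90(k, r) for r in range(4)], axis=0)

def mixed_layer(x, k, alpha):
    """Toy [G]-mixed layer: softmax-weighted sum of a C4-equivariant branch
    and an unconstrained (trivial-group) branch sharing the same kernel k."""
    w = np.exp(alpha) / np.sum(np.exp(alpha))  # learned mixing weights
    return w[0] * corr2d(x, c4_symmetrize(k)) + w[1] * corr2d(x, k)
```

In this picture, the equivariance relaxation morphism corresponds to replacing the constrained kernel `c4_symmetrize(k)` with a free kernel initialized to that same symmetrized value: the layer computes the identical function at the moment of the swap, but subsequent training is only constrained to the subgroup (here, the trivial group), so equivariance can be optimized rather than fixed.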


