Group Equivariant Neural Architecture Search via Group Decomposition and Reinforcement Learning

04/10/2021
by Sourya Basu, et al.

Recent works show that including group equivariance as an inductive bias improves neural network performance for both classification and generation tasks. Designing group-equivariant neural networks is, however, challenging when the group of interest is large and unknown. Moreover, inducing equivariance can significantly reduce the number of independent parameters in a network with fixed feature size, affecting its overall performance. We address these problems by proving a new group-theoretic result in the context of equivariant neural networks: a network is equivariant to a large group if and only if it is equivariant to the smaller groups from which it is constructed. We also design an algorithm for constructing equivariant networks with significantly reduced computational complexity. Further, leveraging our theoretical result, we use deep Q-learning to search for group-equivariant networks that maximize performance, over a search space significantly smaller than that of naive approaches, yielding what we call autoequivariant networks (AENs). To evaluate AENs, we construct and release new benchmark datasets, G-MNIST and G-Fashion-MNIST, obtained by applying group transformations to MNIST and Fashion-MNIST respectively. We show that AENs strike the right balance between group equivariance and number of parameters, and thereby consistently achieve good task performance.
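The decomposition result can be illustrated with a small, hypothetical check (not the authors' code): if a network is equivariant to each of the smaller groups that together generate a larger group, the theorem says it is equivariant to the larger group, so it suffices to test only the generators of the factors. The sketch below assumes a PyTorch image classifier and, for simplicity, tests invariance, i.e. equivariance with a trivial action on the output; the helper names rot90, hflip, is_invariant, and invariant_to_product_group are illustrative, not from the paper.

```python
import torch

def rot90(x):
    # Generator of the C4 rotation group: 90-degree rotation of the image plane.
    return torch.rot90(x, k=1, dims=(-2, -1))

def hflip(x):
    # Generator of the Z2 reflection group: horizontal flip.
    return torch.flip(x, dims=(-1,))

def is_invariant(net, x, g, tol=1e-5):
    """Check f(g.x) == f(x) on a batch x for one group generator g."""
    with torch.no_grad():
        return torch.allclose(net(g(x)), net(x), atol=tol)

def invariant_to_product_group(net, x):
    # By the decomposition result, checking the generators of the two factor
    # groups suffices for the larger group they generate together.
    return all(is_invariant(net, x, g) for g in (rot90, hflip))

# Usage with any classifier `net` taking (N, C, H, W) tensors, e.g. MNIST-sized inputs:
# x = torch.randn(8, 1, 28, 28)
# print(invariant_to_product_group(net, x))
```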


Related research

06/16/2020  Fine-Tuning DARTS for Image Classification
Neural Architecture Search (NAS) has gained attraction due to superior c...

03/09/2021  Enhancing sensor resolution improves CNN accuracy given the same number of parameters or FLOPS
High image resolution is critical to obtain a good performance in many c...

07/18/2021  A Novel Evolutionary Algorithm for Hierarchical Neural Architecture Search
In this work, we propose a novel evolutionary algorithm for neural archi...

11/21/2020  Neural Group Testing to Accelerate Deep Learning
Recent advances in deep learning have made the use of large, deep neural...

05/25/2023  Towards Automatic Neural Architecture Search within General Super-Networks
Existing neural architecture search (NAS) methods typically rely on pre-...

11/27/2017  Context-modulation of hippocampal dynamics and deep convolutional networks
Complex architectures of biological neural circuits, such as parallel pr...

06/03/2018  An Aggressive Genetic Programming Approach for Searching Neural Network Structure Under Computational Constraints
Recently, there emerged revived interests of designing automatic program...
