A Classification of G-invariant Shallow Neural Networks

05/18/2022
by Devanshu Agrawal et al.

When trying to fit a deep neural network (DNN) to a G-invariant target function with respect to a group G, it only makes sense to constrain the DNN to be G-invariant as well. However, there can be many different ways to do this, thus raising the problem of "G-invariant neural architecture design": What is the optimal G-invariant architecture for a given problem? Before we can consider the optimization problem itself, we must understand the search space, the architectures in it, and how they relate to one another. In this paper, we take a first step towards this goal; we prove a theorem that gives a classification of all G-invariant single-hidden-layer or "shallow" neural network (G-SNN) architectures with ReLU activation for any finite orthogonal group G. The proof is based on a correspondence of every G-SNN to a signed permutation representation of G acting on the hidden neurons. The classification is equivalently given in terms of the first cohomology classes of G, thus admitting a topological interpretation. Based on a code implementation, we enumerate the G-SNN architectures for some example groups G and visualize their structure. We draw the network morphisms between the enumerated architectures that can be leveraged during neural architecture search (NAS). Finally, we prove that architectures corresponding to inequivalent cohomology classes in a given cohomology ring coincide in function space only when their weight matrices are zero, and we discuss the implications of this in the context of NAS.
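To make the construction concrete, the following minimal sketch (not the authors' code) builds a G-invariant shallow ReLU network for the simplest case of the correspondence, where G acts on the hidden neurons by an ordinary (unsigned) permutation representation: each free neuron (w, b, c) is replicated across the orbit of G, so applying g to the input merely permutes the hidden units and leaves the output unchanged. The group here, Z/2 acting on R^2 by coordinate swap, and all variable names are illustrative assumptions.

```python
import numpy as np

relu = lambda z: np.maximum(z, 0.0)

# Example finite orthogonal group: G = Z/2 acting on R^2 by coordinate swap.
G = [np.eye(2), np.array([[0.0, 1.0], [1.0, 0.0]])]

rng = np.random.default_rng(0)
W = rng.standard_normal((3, 2))   # one free weight vector per neuron orbit
b = rng.standard_normal(3)        # biases, shared across each orbit
c = rng.standard_normal(3)        # output coefficients, shared across each orbit

def f(x):
    # Orbit weight sharing: each free neuron is replicated over G, giving
    # hidden weights w g for every g in G. Replacing x by g0 x only
    # reindexes the sum over G, so f is G-invariant by construction.
    return sum(c @ relu(W @ (g @ x) + b) for g in G)

x = rng.standard_normal(2)
print(f(x))          # some value
print(f(G[1] @ x))   # same value up to floating-point error
```

The signed permutation representations classified in the paper generalize this: group elements may also flip the signs of hidden pre-activations, which constrains the admissible biases and output weights beyond simple orbit sharing.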

