Densely Connected G-invariant Deep Neural Networks with Signed Permutation Representations

03/08/2023
by Devanshu Agrawal, et al.

We introduce and investigate, for finite groups G, G-invariant deep neural network (G-DNN) architectures with ReLU activation that are densely connected – i.e., include all possible skip connections. In contrast to other G-invariant architectures in the literature, the preactivations of the G-DNNs presented here are able to transform by signed permutation representations (signed perm-reps) of G. Moreover, the individual layers of the G-DNNs are not required to be G-equivariant; instead, the preactivations are constrained to be G-equivariant functions of the network input in a way that couples weights across all layers. The result is a richer family of G-invariant architectures than has previously appeared in the literature. We derive an efficient implementation of G-DNNs after a reparameterization of weights, as well as necessary and sufficient conditions for an architecture to be "admissible" – i.e., nondegenerate and inequivalent to smaller architectures. We include code that allows a user to build a G-DNN interactively, layer by layer, with the final architecture guaranteed to be admissible. Finally, we apply G-DNNs to two example problems – (1) multiplication in {-1, 1} (with theoretical guarantees) and (2) 3D object classification – finding that the inclusion of signed perm-reps significantly boosts predictive performance relative to baselines with only ordinary (i.e., unsigned) perm-reps.
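To make the interplay between ReLU and signed perm-reps concrete, here is a minimal NumPy sketch, not the paper's released code: for example problem (1) with two inputs, the group G = Z_2 acting by x -> -x, the shared weight w = (1, 1), and the ansatz f(x) = |w . x| - 1 are all illustrative assumptions. The single preactivation z(x) = w . x satisfies z(-x) = -z(x), i.e., it transforms by a one-dimensional signed perm-rep of G, and the even combination ReLU(z) + ReLU(-z) = |z| yields a G-invariant output that computes multiplication on {-1, 1}^2.

```python
# Illustrative sketch (assumed construction, not the authors' implementation):
# a ReLU network whose preactivation transforms by a signed perm-rep of Z_2.
import numpy as np

def relu(t):
    return np.maximum(t, 0.0)

w = np.array([1.0, 1.0])  # assumed shared weight; preactivation z(x) = w . x

def f(x):
    z = w @ x                        # z(-x) = -z(x): signed (1-dim) perm-rep
    return relu(z) + relu(-z) - 1.0  # |z| - 1, invariant under z -> -z

# Check: f computes x1 * x2 on {-1, 1}^2 and is invariant under x -> -x.
for x1 in (-1.0, 1.0):
    for x2 in (-1.0, 1.0):
        x = np.array([x1, x2])
        assert np.isclose(f(x), x1 * x2)  # multiplication in {-1, 1}
        assert np.isclose(f(-x), f(x))    # G-invariance of the output
print("f computes x1*x2 on {-1,1}^2 and is invariant under x -> -x")
```

Note that no single ReLU unit here is equivariant on its own; invariance comes from pairing ReLU(z) with ReLU(-z), which is the role the signed perm-rep structure plays in the abstract's description.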

