Permutation Equivariant Neural Functionals

02/27/2023
by Allan Zhou, et al.

This work studies the design of neural networks that can process the weights or gradients of other neural networks, which we refer to as neural functional networks (NFNs). Despite a wide range of potential applications, including learned optimization, processing implicit neural representations, network editing, and policy evaluation, there are few unifying principles for designing effective architectures that process the weights of other networks. We approach the design of neural functionals through the lens of symmetry, in particular by focusing on the permutation symmetries that arise in the weights of deep feedforward networks because hidden layer neurons have no inherent order. We introduce a framework for building permutation equivariant neural functionals, whose architectures encode these symmetries as an inductive bias. The key building blocks of this framework are NF-Layers (neural functional layers) that we constrain to be permutation equivariant through an appropriate parameter sharing scheme. In our experiments, we find that permutation equivariant neural functionals are effective on a diverse set of tasks that require processing the weights of MLPs and CNNs, such as predicting classifier generalization, producing "winning ticket" sparsity masks for initializations, and editing the weights of implicit neural representations (INRs). In addition, we provide code for our models and experiments at https://github.com/AllanYangZhou/nfn.
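To make the symmetry concrete, here is a minimal NumPy sketch (not taken from the paper's codebase) of the permutation symmetry the abstract describes: because hidden neurons of a feedforward network have no inherent order, permuting the rows of the first weight matrix and bias together with the columns of the second weight matrix leaves the network's function unchanged. NFN architectures are built so their outputs respect exactly this symmetry in the input weights.

```python
import numpy as np

rng = np.random.default_rng(0)

# A small two-layer MLP with ReLU hidden units (illustrative sizes).
d_in, d_hidden, d_out = 3, 5, 2
W1, b1 = rng.normal(size=(d_hidden, d_in)), rng.normal(size=d_hidden)
W2, b2 = rng.normal(size=(d_out, d_hidden)), rng.normal(size=d_out)

def mlp(x, W1, b1, W2, b2):
    h = np.maximum(W1 @ x + b1, 0.0)  # ReLU hidden layer
    return W2 @ h + b2

# Apply a random permutation of the hidden neurons to the weights:
# permute the rows of W1 and entries of b1, and the columns of W2.
perm = rng.permutation(d_hidden)
W1p, b1p, W2p = W1[perm], b1[perm], W2[:, perm]

# The permuted weights define the same function.
x = rng.normal(size=d_in)
assert np.allclose(mlp(x, W1, b1, W2, b2), mlp(x, W1p, b1p, W2p, b2))
```

An NF-Layer must map any two weight settings related by such a permutation to correspondingly permuted (equivariant) or identical (invariant) outputs, which is what the paper's parameter-sharing scheme enforces.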

Related research:

Neural Functional Transformers (05/22/2023): The recent success of neural networks as implicit representation of data...

Equivariant Architectures for Learning in Deep Weight Spaces (01/30/2023): Designing machine learning architectures for processing neural networks ...

Hidden symmetries of ReLU networks (06/09/2023): The parameter space for any fixed architecture of feedforward ReLU neura...

Modularity based linkage model for neuroevolution (06/02/2023): Crossover between neural networks is considered disruptive due to the st...

Are Neural Nets Modular? Inspecting Functional Modularity Through Differentiable Weight Masks (10/05/2020): Neural networks (NNs) whose subnetworks implement reusable functions are...

Parameter-Efficient Masking Networks (10/13/2022): A deeper network structure generally handles more complicated non-linear...

Densely Connected G-invariant Deep Neural Networks with Signed Permutation Representations (03/08/2023): We introduce and investigate, for finite groups G, G-invariant deep neur...
