Any-dimensional equivariant neural networks

06/10/2023
by Eitan Levin, et al.

Traditional supervised learning aims to learn an unknown mapping by fitting a function to a set of input-output pairs with a fixed dimension. The fitted function is then defined on inputs of the same dimension. However, in many settings, the unknown mapping takes inputs in any dimension; examples include graph parameters defined on graphs of any size and physics quantities defined on an arbitrary number of particles. We leverage a newly-discovered phenomenon in algebraic topology, called representation stability, to define equivariant neural networks that can be trained with data in a fixed dimension and then extended to accept inputs in any dimension. Our approach is user-friendly, requiring only the network architecture and the groups for equivariance, and can be combined with any training procedure. We provide a simple open-source implementation of our methods and offer preliminary numerical experiments.
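The paper's construction relies on representation stability; as a much simpler illustration of the "train in one dimension, evaluate in any dimension" idea (not the authors' method or their released code), the sketch below shows a permutation-equivariant linear layer whose two scalar parameters are independent of the input dimension, so the same trained weights apply to inputs of any size. The class name `PermutationEquivariantLinear` and the particular parameterization are assumptions made for this example only.

```python
import numpy as np


class PermutationEquivariantLinear:
    """Illustrative permutation-equivariant linear layer with two scalar parameters.

    For an input x of any dimension n, the layer computes
        y = lam * x + gam * mean(x) * ones(n),
    which commutes with every permutation of the n coordinates. Because the
    parameters (lam, gam) do not depend on n, a layer fitted on inputs of one
    dimension can be applied unchanged to inputs of any other dimension.
    """

    def __init__(self, lam: float = 1.0, gam: float = 0.0):
        self.lam = lam
        self.gam = gam

    def __call__(self, x: np.ndarray) -> np.ndarray:
        # x has shape (n,) for any n; the output has the same shape.
        return self.lam * x + self.gam * x.mean() * np.ones_like(x)


layer = PermutationEquivariantLinear(lam=2.0, gam=-1.0)
print(layer(np.array([1.0, 2.0, 3.0])))             # n = 3
print(layer(np.array([1.0, 2.0, 3.0, 4.0, 5.0])))   # same parameters, n = 5
```

The point of the parameterization is that it is equivariant for every n simultaneously, which is the property that lets the dimension at training time differ from the dimension at evaluation time; the paper generalizes this phenomenon to other groups and architectures via representation stability.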

Related research

12/30/2021 · JacNet: Learning Functions with Structured Jacobians
Neural networks are trained to learn an approximate mapping from an inpu...

05/23/2023 · Mind the spikes: Benign overfitting of kernels and neural networks in fixed dimension
The success of over-parameterized neural networks trained to near-zero t...

01/15/2022 · Hyperplane bounds for neural feature mappings
Deep learning methods minimise the empirical risk using loss functions s...

05/07/2020 · Model Reduction and Neural Networks for Parametric PDEs
We develop a general framework for data-driven approximation of input-ou...

07/11/2023 · Fundamental limits of overparametrized shallow neural networks for supervised learning
We carry out an information-theoretical analysis of a two-layer neural n...

10/21/2019 · Coercing Machine Learning to Output Physically Accurate Results
Many machine/deep learning artificial neural networks are trained to sim...

06/02/2022 · Invertible Neural Networks for Graph Prediction
In this work, we address conditional generation using deep invertible ne...
