Universal approximations of invariant maps by neural networks

04/26/2018
by   Dmitry Yarotsky, et al.

We describe generalizations of the universal approximation theorem for neural networks to maps invariant or equivariant with respect to linear representations of groups. Our goal is to establish network-like computational models that are both invariant/equivariant and provably complete in the sense of their ability to approximate any continuous invariant/equivariant map. Our contribution is threefold. First, in the general case of compact groups we propose a construction of a complete invariant/equivariant network using an intermediate polynomial layer. We invoke classical theorems of Hilbert and Weyl to justify and simplify this construction; in particular, we describe an explicit complete ansatz for the approximation of permutation-invariant maps. Second, we consider groups of translations and prove several versions of the universal approximation theorem for convolutional networks in the limit of continuous signals on Euclidean spaces. Finally, we consider 2D signal transformations equivariant with respect to the group SE(2) of rigid Euclidean motions. In this case we introduce the "charge-conserving convnet", a convnet-like computational model based on the decomposition of the feature space into isotypic representations of SO(2). We prove this model to be a universal approximator for continuous SE(2)-equivariant signal transformations.
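The permutation-invariant case can be illustrated with a minimal sketch: a pointwise feature map followed by sum-pooling yields a network whose output is unchanged under any reordering of the input elements, which is the structural idea behind complete ansatzes of this kind. This is an illustrative toy model, not the paper's construction; all function names and weight shapes below are assumptions.

```python
import numpy as np

def phi(X, W1, b1):
    # Pointwise feature map applied independently to each set element.
    return np.tanh(X @ W1 + b1)

def invariant_net(X, params):
    # Sum-pooling over elements makes the output permutation-invariant,
    # since summation is a symmetric function of the rows of X.
    W1, b1, W2, b2 = params
    pooled = phi(X, W1, b1).sum(axis=0)
    return pooled @ W2 + b2

rng = np.random.default_rng(0)
params = (rng.normal(size=(3, 8)), rng.normal(size=8),
          rng.normal(size=(8, 1)), rng.normal(size=1))

X = rng.normal(size=(5, 3))        # a "set" of 5 points in R^3
perm = rng.permutation(5)

y1 = invariant_net(X, params)
y2 = invariant_net(X[perm], params)
assert np.allclose(y1, y2)         # output is invariant under permutation
```

The design choice here is the standard one: invariance is enforced architecturally by the symmetric pooling operation, so it holds exactly for any weights rather than being learned approximately.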

Related research

research · 03/05/2019
Universal approximations of permutation invariant/equivariant functions by deep neural networks
In this paper, we develop a theory of the relationship between permutatio...

research · 12/27/2020
Universal Approximation Theorem for Equivariant Maps by Group CNNs
Group symmetry is inherent in a wide variety of data distributions. Data...

research · 09/29/2022
Equivariant maps from invariant functions
In equivariant machine learning the idea is to restrict the learning to ...

research · 01/27/2019
On the Universality of Invariant Networks
Constraining linear layers in neural networks to respect symmetry transf...

research · 09/07/2022
Bispectral Neural Networks
We present a novel machine learning architecture, Bispectral Neural Netw...

research · 10/02/2022
Deep Invertible Approximation of Topologically Rich Maps between Manifolds
How can we design neural networks that allow for stable universal approx...

research · 04/24/2023
A Transfer Principle: Universal Approximators Between Metric Spaces From Euclidean Universal Approximators
We build universal approximators of continuous maps between arbitrary Po...
