Approximation capabilities of measure-preserving neural networks

06/21/2021
by   Aiqing Zhu, et al.

Measure-preserving neural networks are well-developed invertible models; however, their approximation capabilities remain unexplored. This paper rigorously establishes general sufficient conditions for approximating measure-preserving maps using measure-preserving neural networks. It is shown that for a compact set U ⊂ ℝ^D with D ≥ 2, every measure-preserving map ψ: U → ℝ^D that is injective and bounded can be approximated in the L^p-norm by measure-preserving neural networks. In particular, differentiable maps whose Jacobians have determinant ±1 are measure-preserving, injective, and bounded on U, and therefore possess this approximation property.
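The determinant criterion in the abstract can be checked numerically. Below is a minimal sketch, assuming a hypothetical example map (not taken from the paper): the shear ψ(x, y) = (x + sin y, y) is differentiable and injective, and its Jacobian determinant is identically 1, so by the criterion above it is measure-preserving on any compact U ⊂ ℝ².

```python
import numpy as np

def psi(p):
    # Hypothetical illustrative map: a shear in R^2 with unit Jacobian determinant.
    x, y = p
    return np.array([x + np.sin(y), y])

def jacobian_det(p, h=1e-6):
    # Approximate the 2x2 Jacobian of psi at p via central finite differences,
    # then return its determinant.
    J = np.empty((2, 2))
    for j in range(2):
        e = np.zeros(2)
        e[j] = h
        J[:, j] = (psi(p + e) - psi(p - e)) / (2 * h)
    return np.linalg.det(J)

# The determinant is 1 at every sampled point, consistent with the
# analytic Jacobian [[1, cos(y)], [0, 1]].
for p in [np.array([0.3, -1.2]), np.array([2.0, 0.7])]:
    print(round(jacobian_det(p), 6))
```

Because the determinant equals +1 everywhere, the map preserves Lebesgue measure; composing such simple volume-preserving layers is the same structural idea behind measure-preserving network architectures.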


