
Neural Network Approximation of Lipschitz Functions in High Dimensions with Applications to Inverse Problems

by   Santhosh Karnik, et al.
Michigan State University

The remarkable successes of neural networks in a huge variety of inverse problems have fueled their adoption in disciplines ranging from medical imaging to seismic analysis over the past decade. However, the high dimensionality of such inverse problems has simultaneously left current theory, which predicts that networks should scale exponentially in the dimension of the problem, unable to explain why the seemingly small networks used in these settings work as well as they do in practice. To reduce this gap between theory and practice, a general method for bounding the complexity required for a neural network to approximate a Lipschitz function on a high-dimensional set with a low-complexity structure is provided herein. The approach is based on the observation that the existence of a linear Johnson-Lindenstrauss embedding 𝐀 ∈ ℝ^{d×D} of a given high-dimensional set 𝒮 ⊂ ℝ^D into a low-dimensional cube [-M,M]^d implies that for any Lipschitz function f : 𝒮 → ℝ^p, there exists a Lipschitz function g : [-M,M]^d → ℝ^p such that g(𝐀𝐱) = f(𝐱) for all 𝐱 ∈ 𝒮. Hence, if one has a neural network which approximates g : [-M,M]^d → ℝ^p, then a layer can be added which implements the JL embedding 𝐀, yielding a neural network which approximates f : 𝒮 → ℝ^p. By pairing JL embedding results with results on the approximation of Lipschitz functions by neural networks, one then obtains bounds on the complexity required for a neural network to approximate Lipschitz functions on high-dimensional sets. The end result is a general theoretical framework which can be used to better explain the observed empirical successes of smaller networks in a wider variety of inverse problems than current theory allows.
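The composition trick described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's construction: it draws a scaled Gaussian matrix as a linear JL embedding 𝐀, checks that a pairwise distance in a (hypothetical) low-complexity set 𝒮 is roughly preserved, and then prepends 𝐀 as a linear layer to a stand-in function `g` to obtain an approximation of f on 𝒮. The set 𝒮, the dimensions, and `g` are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

D, d = 1000, 20   # ambient and embedded dimensions (illustrative)
n = 50            # number of sample points drawn from S

# Hypothetical low-complexity set S: points on a 5-dimensional
# subspace of R^D (a stand-in for any set with low-complexity structure)
basis = rng.standard_normal((D, 5))
S = rng.standard_normal((n, 5)) @ basis.T          # n points in R^D

# Linear JL embedding A in R^{d x D}: a scaled Gaussian matrix,
# which preserves pairwise distances with high probability
A = rng.standard_normal((d, D)) / np.sqrt(d)

# Check near-isometry on one pair: ||A x - A y|| should be close to ||x - y||
orig_dist = np.linalg.norm(S[0] - S[1])
emb_dist = np.linalg.norm(S[0] @ A.T - S[1] @ A.T)
ratio = emb_dist / orig_dist                       # close to 1

# Stand-in for a trained network approximating g on [-M, M]^d;
# any network on the low-dimensional cube would play this role
def g(z):
    return np.tanh(z).sum(axis=-1, keepdims=True)

# f(x) is approximated by g(A x): the JL embedding A is simply
# an extra (untrained) linear first layer prepended to the network for g
def f_approx(x):
    return g(x @ A.T)
```

Here `f_approx` has the dimension-dependence of a network on ℝ^d rather than ℝ^D, which is the mechanism the abstract uses to explain why small networks suffice on low-complexity high-dimensional sets.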



∙ 06/02/2022
Deep neural networks can stably solve high-dimensional, noisy, non-linear inverse problems
We study the problem of reconstructing solutions of inverse problems whe...

∙ 03/23/2022
Stability properties for a class of inverse problems
We establish Lipschitz stability properties for a class of inverse probl...

∙ 01/06/2020
An artificial neural network approximation for Cauchy inverse problems
A novel artificial neural network method is proposed for solving Cauchy ...

∙ 03/06/2023
Expressivity of Shallow and Deep Neural Networks for Polynomial Approximation
We analyze the number of neurons that a ReLU neural network needs to app...

∙ 02/26/2020
Numerical Solution of Inverse Problems by Weak Adversarial Networks
We consider a weak adversarial network approach to numerically solve a c...

∙ 05/29/2022
Continuous Generative Neural Networks
In this work, we present and study Continuous Generative Neural Networks...

∙ 09/07/2020
Stabilizing Invertible Neural Networks Using Mixture Models
In this paper, we analyze the properties of invertible neural networks, ...