Arbitrary-Depth Universal Approximation Theorems for Operator Neural Networks

09/23/2021
by Annan Yu et al.

The standard Universal Approximation Theorem for operator neural networks (NNs) holds for arbitrary width and bounded depth. Here, we prove that operator NNs of bounded width and arbitrary depth are universal approximators for continuous nonlinear operators. In our main result, we prove that for non-polynomial activation functions that are continuously differentiable at a point with a nonzero derivative, one can construct an operator NN of width five, whose inputs are real numbers with finite decimal representations, that is arbitrarily close to any given continuous nonlinear operator. We derive an analogous result for non-affine polynomial activation functions. We also show that depth has theoretical advantages by constructing operator ReLU NNs of depth 2k^3+8 and constant width that cannot be well-approximated by any operator ReLU NN of depth k, unless its width is exponential in k.
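The bounded-width, arbitrary-depth regime described above can be illustrated with a minimal numpy sketch: a fully connected ReLU network whose hidden layers all have width five, while the depth is a free parameter. The weights here are random placeholders for illustration only; they are not the explicit construction from the theorem, and the helper names (`deep_narrow_net`, `relu`) are hypothetical.

```python
import numpy as np

def relu(x):
    # ReLU activation, applied elementwise
    return np.maximum(x, 0.0)

def deep_narrow_net(x, weights, biases):
    """Evaluate a fixed-width, fully connected ReLU network.

    Depth grows with the number of (W, b) layer pairs while the hidden
    width stays constant -- the bounded-width, arbitrary-depth regime.
    """
    h = x
    for W, b in zip(weights[:-1], biases[:-1]):
        h = relu(W @ h + b)
    # Affine output layer, no activation
    return weights[-1] @ h + biases[-1]

# Illustrative instance: width-5 hidden layers, depth chosen freely.
rng = np.random.default_rng(0)
width, depth = 5, 12
dims = [1] + [width] * depth + [1]
weights = [0.5 * rng.standard_normal((dims[i + 1], dims[i]))
           for i in range(len(dims) - 1)]
biases = [0.1 * rng.standard_normal(dims[i + 1])
          for i in range(len(dims) - 1)]

y = deep_narrow_net(np.array([0.3]), weights, biases)
```

Increasing `depth` adds layers without widening the network, which is the expressivity axis the depth-separation result (depth 2k^3+8 versus depth k) concerns.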


