Lower bounds for artificial neural network approximations: A proof that shallow neural networks fail to overcome the curse of dimensionality

03/07/2021
by Philipp Grohs, et al.

Artificial neural networks (ANNs) have become a very powerful tool for approximating high-dimensional functions. In particular, deep ANNs, consisting of a large number of hidden layers, have been used with great success in a range of practically relevant computational problems involving high-dimensional input data, from classification tasks in supervised learning to optimal decision problems in reinforcement learning. There is also a body of mathematical results in the scientific literature which studies the approximation capacities of ANNs for high-dimensional target functions. In particular, a series of such results shows that sufficiently deep ANNs can overcome the curse of dimensionality in the approximation of certain target function classes, in the sense that the number of parameters of the approximating ANNs grows at most polynomially in the dimension d ∈ ℕ of the target functions under consideration. In the proofs of several of these high-dimensional approximation results it is crucial that the involved ANNs are sufficiently deep, consisting of a number of hidden layers which grows with the dimension of the considered target functions. The topic of this work is to examine more closely the role of depth in the approximation of high-dimensional target functions. In particular, the main result of this work proves that there exists a concretely specified sequence of functions which can be approximated without the curse of dimensionality by sufficiently deep ANNs, but which cannot be approximated without the curse of dimensionality if the involved ANNs are shallow or not deep enough.
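To make the notion of parameter count concrete: for a fully connected ANN with layer dimensions l_0, l_1, ..., l_L, the number of parameters (weights plus biases) is the sum over k of l_k(l_{k-1} + 1). The following Python sketch is purely illustrative and not taken from the paper; the helper num_parameters and the chosen widths (hidden layers of width 4d for the deep network, a single hidden layer of width 2^d for the shallow one) are hypothetical choices that merely contrast polynomial versus exponential growth in the input dimension d.

# Illustrative sketch (not from the paper): parameter count of a fully
# connected ANN with layer dimensions [l_0, l_1, ..., l_L] is
# sum_k l_k * (l_{k-1} + 1), i.e. weights plus biases per layer.

def num_parameters(layer_dims):
    """Number of weights and biases of a fully connected ANN."""
    return sum(l_out * (l_in + 1)
               for l_in, l_out in zip(layer_dims, layer_dims[1:]))

for d in (1, 2, 4, 8, 16):
    # Deep ANN: depth and width growing polynomially in d (hypothetical
    # architecture, chosen only to illustrate polynomial growth).
    deep = [d] + [4 * d] * d + [1]
    # Shallow ANN: one hidden layer whose width grows exponentially in d,
    # illustrating the kind of blow-up a lower bound for shallow networks
    # expresses.
    shallow = [d, 2 ** d, 1]
    print(f"d={d:2d}  deep params={num_parameters(deep):8d}  "
          f"shallow params={num_parameters(shallow):10d}")

Running this shows the deep architecture's parameter count growing like a polynomial in d while the shallow one's grows exponentially, which is the quantitative contrast the main result formalizes.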


