Limitations on approximation by deep and shallow neural networks

11/30/2022
by Guergana Petrova, et al.

We prove Carl-type inequalities for the error of approximation of compact sets K by deep and shallow neural networks. These inequalities in turn give lower bounds on how well functions in K can be approximated when the approximants are required to be outputs of such networks. Our results are obtained as a byproduct of the study of the recently introduced Lipschitz widths.
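For readers unfamiliar with the terminology, a Carl-type inequality controls entropy numbers by approximation quantities, which is what turns an approximation upper bound into a lower bound. The sketch below is background from the widths/entropy literature rather than a statement taken from this page: the notation (e_k for entropy numbers, d_k for Kolmogorov widths, E_k and Sigma_k for the network approximation error and output set) is assumed, and the paper's actual inequalities carry logarithmic corrections that are omitted here.

```latex
% Classical Carl-type inequality for a compact set K in a Banach space X:
% for every alpha > 0 there is a constant C_alpha such that
\[
  \sup_{1 \le k \le n} k^{\alpha}\, e_k(K)_X
  \;\le\; C_{\alpha} \sup_{1 \le k \le n} k^{\alpha}\, d_k(K)_X ,
  \qquad n \ge 1,
\]
% where e_k(K)_X are the entropy numbers of K and d_k(K)_X its
% Kolmogorov widths. The neural-network analogue replaces d_k by
\[
  E_k(K)_X \;:=\; \sup_{f \in K}\, \inf_{g \in \Sigma_k} \|f - g\|_X ,
\]
% the worst-case error of approximating K by the set Sigma_k of outputs
% of (deep or shallow) networks with k parameters. Read contrapositively:
% if e_n(K)_X decays no faster than n^{-alpha}, then E_n(K)_X cannot
% decay faster than (roughly) n^{-alpha} either, which is the advertised
% lower bound on approximation by such networks.
%
% The Lipschitz width (assumed form, following its introduction by the
% same authors) measures approximation by gamma-Lipschitz parametrizations
% Phi from an n-dimensional unit ball into X:
\[
  d_n^{\gamma}(K)_X \;:=\; \inf_{\Phi}\; \sup_{f \in K}\;
  \inf_{\|y\| \le 1} \|f - \Phi(y)\|_X .
\]
```

Since a network with k parameters is itself a Lipschitz parametrization of its output set (under suitable bounds on the weights), estimates for Lipschitz widths transfer to the network error E_k, which is presumably how the byproduct mentioned in the abstract arises.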


Related research

10/03/2016 · Error bounds for approximations with deep ReLU networks
We study expressive power of shallow and deep neural networks with piece...

03/06/2023 · Expressivity of Shallow and Deep Neural Networks for Polynomial Approximation
We analyze the number of neurons that a ReLU neural network needs to app...

07/28/2023 · Optimal Approximation of Zonoids and Uniform Approximation by Shallow Neural Networks
We study the following two related problems. The first is to determine t...

03/29/2023 · Optimal approximation of C^k-functions using shallow complex-valued neural networks
We prove a quantitative result for the approximation of functions of reg...

08/07/2023 · Tractability of approximation by general shallow networks
In this paper, we present a sharper version of the results in the paper ...

08/26/2019 · Dimension independent bounds for general shallow networks
This paper proves an abstract theorem addressing in a unified manner two...

12/10/2020 · The Representation Power of Neural Networks: Breaking the Curse of Dimensionality
In this paper, we analyze the number of neurons and training parameters ...
