Expressivity of Deep Neural Networks

07/09/2020
by Ingo Gühring et al.

In this review paper, we give a comprehensive overview of the large variety of approximation results for neural networks. We discuss approximation rates for classical function spaces as well as the benefits of deep neural networks over shallow ones for specifically structured function classes. While the main body of existing results concerns general feedforward architectures, we also survey approximation results for convolutional, residual, and recurrent neural networks.
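As a small illustration of the kind of expressivity result surveyed here (not taken from the paper itself), the sketch below realizes a piecewise-linear interpolant of x² as a one-hidden-layer ReLU network: the hidden units encode the slope changes at the knots. The target function, interval, and knot count are arbitrary choices for illustration.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Hypothetical target and knots, chosen for illustration only
f = lambda x: x ** 2
knots = np.linspace(0.0, 1.0, 11)          # 10 linear pieces on [0, 1]
slopes = np.diff(f(knots)) / np.diff(knots)
# Hidden-layer coefficients: first unit carries the initial slope,
# each later unit carries the change of slope at its knot
coeffs = np.concatenate(([slopes[0]], np.diff(slopes)))

def shallow_relu_net(x):
    # One hidden layer, one ReLU unit per knot (except the last):
    # g(x) = f(t_0) + sum_i coeffs[i] * relu(x - t_i)
    return f(knots[0]) + relu(np.subtract.outer(x, knots[:-1])) @ coeffs

x = np.linspace(0.0, 1.0, 1001)
err = np.max(np.abs(shallow_relu_net(x) - f(x)))
# Classical interpolation bound: err <= max|f''| * h^2 / 8 = 2 * 0.01 / 8
```

With h = 0.1 the uniform error is at most 0.0025, matching the O(h²) rate of piecewise-linear approximation; deep ReLU networks can improve on such shallow rates for smooth targets, which is one of the depth-separation themes the review covers.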

Related research

05/25/2020: Approximation in shift-invariant spaces with deep ReLU neural networks. "We construct deep ReLU neural networks to approximate functions in dilat..."

01/31/2021: The Connection Between Approximation, Depth Separation and Learnability in Neural Networks. "Several recent works have shown separation results between deep neural n..."

09/22/2022: Vanilla feedforward neural networks as a discretization of dynamic systems. "Deep learning has made significant applications in the field of data sci..."

06/30/2020: Deep neural networks for the evaluation and design of photonic devices. "The data sciences revolution is poised to transform the way photonic sys..."

09/27/2022: A Derivation of Feedforward Neural Network Gradients Using Fréchet Calculus. "We present a derivation of the gradients of feedforward neural networks ..."

07/08/2020: Approximation with Neural Networks in Variable Lebesgue Spaces. "This paper concerns the universal approximation property with neural net..."

08/07/2023: Tractability of approximation by general shallow networks. "In this paper, we present a sharper version of the results in the paper ..."
