Approximation spaces of deep neural networks

05/03/2019
by Rémi Gribonval, et al.

We study the expressivity of deep neural networks. Measuring a network's complexity by its number of connections or by its number of neurons, we consider the class of functions for which the error of best approximation with networks of a given complexity decays at a certain rate when increasing the complexity budget. Using results from classical approximation theory, we show that this class can be endowed with a (quasi)-norm that makes it a linear function space, called approximation space. We establish that allowing the networks to have certain types of "skip connections" does not change the resulting approximation spaces. We also discuss the role of the network's nonlinearity (also known as activation function) on the resulting spaces, as well as the role of depth. For the popular ReLU nonlinearity and its powers, we relate the newly constructed spaces to classical Besov spaces. The established embeddings highlight that some functions of very low Besov smoothness can nevertheless be well approximated by neural networks, if these networks are sufficiently deep.
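To make the construction concrete, the standard recipe from classical approximation theory can be sketched as follows. The notation below (Σ_n, E_n, α, q) is the generic approximation-theory convention (as in DeVore and Lorentz), not necessarily the symbols used in the paper itself; here Σ_n stands for the set of functions realized by networks of complexity at most n.

```latex
% Error of best approximation of f from the family \Sigma_n
% (networks with at most n connections or neurons):
E_n(f)_X = \inf_{g \in \Sigma_n} \| f - g \|_X .

% Approximation class: functions whose best-approximation error
% decays at rate n^{-\alpha}, with a fine-tuning index q:
A^\alpha_q(X) = \bigl\{ f \in X : \| f \|_{A^\alpha_q} < \infty \bigr\},

% with the associated (quasi-)norm
\| f \|_{A^\alpha_q}
  = \Bigl( \sum_{n \ge 1} \bigl[ n^\alpha \, E_n(f)_X \bigr]^q \, \tfrac{1}{n} \Bigr)^{1/q}
  \quad (0 < q < \infty),
\qquad
\| f \|_{A^\alpha_\infty} = \sup_{n \ge 1} \, n^\alpha E_n(f)_X .
```

Because the sets Σ_n realized by neural networks are not linear subspaces, it is not obvious that A^α_q(X) is a linear space with a (quasi-)norm; establishing this for network classes is part of what the paper proves.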

Related research

- 11/25/2022 · Optimal Approximation Rates for Deep ReLU Neural Networks on Sobolev Spaces: "We study the problem of how efficiently, in terms of the number of param..."
- 10/28/2021 · Sobolev-type embeddings for neural network approximation spaces: "We consider neural network approximation spaces that classify functions ..."
- 05/25/2020 · Approximation in shift-invariant spaces with deep ReLU neural networks: "We construct deep ReLU neural networks to approximate functions in dilat..."
- 04/06/2021 · Proof of the Theory-to-Practice Gap in Deep Learning via Sampling Complexity bounds for Neural Network Approximation Spaces: "We study the computational complexity of (deterministic or randomized) a..."
- 01/27/2021 · Partition of unity networks: deep hp-approximation: "Approximation theorists have established best-in-class optimal approxima..."
- 01/28/2021 · Approximation with Tensor Networks. Part III: Multivariate Approximation: "We study the approximation of multivariate functions with tensor network..."
- 05/07/2021 · What Kinds of Functions do Deep Neural Networks Learn? Insights from Variational Spline Theory: "We develop a variational framework to understand the properties of funct..."
