Proof of the Theory-to-Practice Gap in Deep Learning via Sampling Complexity Bounds for Neural Network Approximation Spaces

04/06/2021
by Philipp Grohs, et al.

We study the computational complexity of (deterministic or randomized) algorithms based on point samples for approximating or integrating functions that can be well approximated by neural networks. Such algorithms (most prominently stochastic gradient descent and its variants) are used extensively in the field of deep learning. One of the most important open problems in this field is whether theoretically provable neural network approximation rates can actually be realized by such algorithms. We answer this question in the negative by proving hardness results for the problems of approximation and integration on a novel class of neural network approximation spaces. In particular, our results confirm a conjectured and empirically observed theory-to-practice gap in deep learning. We complement the hardness results by showing that approximation rates of a comparable order of convergence are (at least theoretically) achievable.
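To make the objects in the abstract concrete, here is a schematic formalization in the spirit of the paper; the space, norm, and exponents below are illustrative assumptions, not the paper's exact definitions. A neural network approximation space A^alpha collects the functions that networks \Phi_n with at most n (suitably bounded) weights approximate at rate n^{-\alpha}, with (quasi-)norm

    \| f \|_{A^\alpha} := \| f \|_{L^\infty} + \sup_{n \in \mathbb{N}} \, n^{\alpha} \inf_{\Phi_n} \| f - \Phi_n \|_{L^\infty},

and the hardness results concern the minimax error of algorithms that see only m point samples,

    e_m := \inf_{A} \; \sup_{\| f \|_{A^\alpha} \le 1} \; \| f - A\big( f(x_1), \ldots, f(x_m) \big) \|,

where the infimum runs over all deterministic (or, with an expectation added, randomized) maps A from samples to functions, and an analogous quantity with \int f in place of f covers numerical integration. Note that training a network on the pairs (x_i, f(x_i)) by stochastic gradient descent and returning the trained network is one such map A, which is why lower bounds on e_m apply to it. In this language, a theory-to-practice gap means that the expressivity exponent \alpha can be made arbitrarily large while e_m still decays no faster than a fixed power of m whose exponent stays bounded as \alpha grows (the precise quantitative statement is in the paper).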


Related research

06/30/2020 · Approximation Rates for Neural Networks with Encodable Weights in Smoothness Spaces
We examine the necessary and sufficient complexity of neural networks to...

07/14/2021 · Continuous vs. Discrete Optimization of Deep Neural Networks
Existing analyses of optimization in deep learning are either continuous...

05/03/2019 · Approximation spaces of deep neural networks
We study the expressivity of deep neural networks. Measuring a network's...

10/03/2022 · Limitations of neural network training due to numerical instability of backpropagation
We study the training of deep neural networks by gradient descent where...

08/18/2020 · When Hardness of Approximation Meets Hardness of Learning
A supervised learning algorithm has access to a distribution of labeled...

05/31/2020 · Neural Networks with Small Weights and Depth-Separation Barriers
In studying the expressiveness of neural networks, an important question...

12/20/2020 · Recent advances in deep learning theory
Deep learning is usually described as an experiment-driven field under c...
