The gap between theory and practice in function approximation with deep neural networks

01/16/2020
by Ben Adcock, et al.

Deep learning (DL) is transforming whole industries as complicated decision-making processes are increasingly automated by deep neural networks (DNNs) trained on real-world data. Driven in part by a rapidly expanding literature on DNN approximation theory showing that DNNs can approximate a rich variety of functions, these tools are increasingly being considered for problems in scientific computing. Yet, in contrast to more traditional algorithms in this field, relatively little is known about DNNs from the standpoint of numerical analysis, namely their stability, accuracy, computational efficiency, and sample complexity. In this paper we introduce a computational framework for examining DNNs in practice, and use it to study their empirical performance with regard to these issues. We examine the performance of DNNs of different widths and depths on a variety of test functions in various dimensions, including smooth and piecewise smooth functions. We also compare DL against best-in-class methods for smooth function approximation based on compressed sensing. Our main conclusion is that there is a crucial gap between the approximation theory of DNNs and their practical performance: trained DNNs perform relatively poorly on functions for which there are strong approximation results (e.g., smooth functions), yet perform well, relative to best-in-class methods, on other functions. Finally, we present a novel practical existence theorem, which asserts the existence of a DNN architecture and training procedure offering the same performance as current best-in-class schemes. This result indicates the potential for practical DNN approximation and the need for future research into practical architecture design and training strategies.
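
As a concrete illustration of the kind of experiment such a framework runs, the following is a minimal sketch, assuming PyTorch: a fully connected ReLU network of a given width and depth is trained on random samples of a smooth target function, and its relative L2 error is measured on held-out points. The target function, network sizes, sample counts, and optimizer settings here are illustrative assumptions, not the authors' exact configuration.

    # Minimal sketch (assumption: PyTorch available): fit a smooth function
    # from random samples with a fully connected ReLU network.
    import torch
    import torch.nn as nn

    torch.manual_seed(0)
    d = 2                        # input dimension (illustrative)
    n_train, n_test = 500, 5000  # sample counts (illustrative)

    def f(x):
        # Smooth test function on [-1, 1]^d (illustrative choice).
        return torch.exp(-x.pow(2).sum(dim=1, keepdim=True))

    def make_mlp(width=50, depth=4):
        # 'depth' hidden layers of size 'width', ReLU activations.
        layers = [nn.Linear(d, width), nn.ReLU()]
        for _ in range(depth - 1):
            layers += [nn.Linear(width, width), nn.ReLU()]
        return nn.Sequential(*layers, nn.Linear(width, 1))

    x_train = 2 * torch.rand(n_train, d) - 1  # uniform samples on [-1, 1]^d
    y_train = f(x_train)
    x_test = 2 * torch.rand(n_test, d) - 1
    y_test = f(x_test)

    model = make_mlp()
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()

    for _ in range(5000):  # full-batch gradient steps
        opt.zero_grad()
        loss_fn(model(x_train), y_train).backward()
        opt.step()

    with torch.no_grad():
        err = torch.linalg.norm(model(x_test) - y_test) / torch.linalg.norm(y_test)
    print(f"relative L2 test error: {err.item():.2e}")

Sweeping the width, depth, and number of training samples, and swapping in piecewise smooth or higher-dimensional targets, yields the kind of accuracy-versus-samples comparison the paper reports.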

Related research

Near-optimal learning of Banach-valued, high-dimensional functions via deep neural networks (11/22/2022)
The past decade has seen increasing interest in applying Deep Learning ...

Deep Neural Networks Are Effective At Learning High-Dimensional Hilbert-Valued Functions From Limited Data (12/11/2020)
The accurate approximation of scalar-valued functions from sample points ...

SyReNN: A Tool for Analyzing Deep Neural Networks (01/09/2021)
Deep Neural Networks (DNNs) are rapidly gaining popularity in a variety ...

Deep Neural Networks Learn Non-Smooth Functions Effectively (02/13/2018)
We theoretically discuss why deep neural networks (DNNs) perform better ...

Split-Et-Impera: A Framework for the Design of Distributed Deep Learning Applications (03/22/2023)
Many recent pattern recognition applications rely on complex distributed ...

CAS4DL: Christoffel Adaptive Sampling for function approximation via Deep Learning (08/25/2022)
The problem of approximating smooth, multivariate functions from sample ...

Constrained Empirical Risk Minimization: Theory and Practice (02/09/2023)
Deep Neural Networks (DNNs) are widely used for their ability to effectively ...
