Analysis of Deep Neural Networks with Quasi-optimal Polynomial Approximation Rates

12/04/2019
by Joseph Daws, et al.

We show the existence of a deep neural network capable of approximating a wide class of high-dimensional functions. The construction of the proposed neural network is based on a quasi-optimal polynomial approximation. We show that this network achieves an error rate that is sub-exponential in the number of polynomial functions, M, used in the polynomial approximation. The complexity of the network that achieves this sub-exponential rate is shown to be algebraic in M.
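
Read together, the two claims pair an approximation error that decays sub-exponentially in M with a network size that grows only algebraically in M. As a hedged sketch of the shape of such a statement (the constants C, C', c and the exponents theta, s below are illustrative placeholders, not values taken from the paper), writing \Phi_M for the network built from an M-term quasi-optimal polynomial approximation of a target function f:

% Illustrative form only; constants and exponents are placeholders, not the paper's stated values.
\| f - \Phi_M \| \;\le\; C\, e^{-c\, M^{\theta}}, \qquad 0 < \theta < 1
\quad \text{(error sub-exponential in } M\text{)},

\operatorname{size}(\Phi_M) \;\le\; C'\, M^{s}, \qquad s > 0
\quad \text{(network complexity algebraic in } M\text{)}.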

research · 07/01/2018
Exponential Convergence of the Deep Neural Network Approximation for Analytic Functions
We prove that for analytic functions in low dimension, the convergence r...

research · 11/07/2019
ChebNet: Efficient and Stable Constructions of Deep Neural Networks with Rectified Power Units using Chebyshev Approximations
In a recent paper [B. Li, S. Tang and H. Yu, arXiv:1903.05858, to appear ...

research · 01/08/2019
Deep Neural Network Approximation Theory
Deep neural networks have become state-of-the-art technology for a wide ...

research · 04/29/2021
Efficient Spectral Methods for Quasi-Equilibrium Closure Approximations of Symmetric Problems on Unit Circle and Sphere
Quasi-equilibrium approximation is a widely used closure approximation a...

research · 09/09/2020
1-Dimensional polynomial neural networks for audio signal related problems
In addition to being extremely non-linear, modern problems require milli...

research · 01/27/2021
Partition of unity networks: deep hp-approximation
Approximation theorists have established best-in-class optimal approxima...

research · 04/29/2021
Fast Multiscale Diffusion on Graphs
Diffusing a graph signal at multiple scales requires computing the actio...
