Deep Network Approximation with Discrepancy Being Reciprocal of Width to Power of Depth

06/22/2020
by Zuowei Shen, et al.

A new network with super approximation power is introduced. This network is built with Floor (⌊x⌋) and ReLU (max{0,x}) activation functions, and hence we call such networks Floor-ReLU networks. It is shown by construction that Floor-ReLU networks with width max{d, 5N+13} and depth 64dL+3 can pointwise approximate a Lipschitz continuous function f on [0,1]^d with an exponential approximation rate 3μ√(d) N^-√(L), where μ is the Lipschitz constant of f. More generally, for an arbitrary continuous function f on [0,1]^d with a modulus of continuity ω_f(·), the constructive approximation rate is ω_f(√(d) N^-√(L)) + 2ω_f(√(d)) N^-√(L). As a consequence, this new network overcomes the curse of dimensionality in approximation power, since this approximation order is essentially √(d) times a function of N and L that is independent of d.
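To make the stated sizes and rates concrete, here is a minimal Python sketch (not from the paper; all function names are ours) that evaluates the width and depth of the constructed network and the two error bounds quoted above.

```python
import math

def floor_relu_size(d: int, N: int, L: int) -> tuple[int, int]:
    """Width and depth of the constructed Floor-ReLU network,
    as stated in the abstract: width max{d, 5N+13}, depth 64dL+3."""
    return max(d, 5 * N + 13), 64 * d * L + 3

def lipschitz_bound(mu: float, d: int, N: int, L: int) -> float:
    """Pointwise error bound 3*mu*sqrt(d)*N^(-sqrt(L)) for a Lipschitz
    function with constant mu on [0,1]^d."""
    return 3 * mu * math.sqrt(d) * N ** (-math.sqrt(L))

def modulus_bound(omega, d: int, N: int, L: int) -> float:
    """General rate omega(sqrt(d)*N^(-sqrt(L))) + 2*omega(sqrt(d))*N^(-sqrt(L))
    for a continuous f with modulus of continuity omega."""
    r = N ** (-math.sqrt(L))
    return omega(math.sqrt(d) * r) + 2.0 * omega(math.sqrt(d)) * r

if __name__ == "__main__":
    d, N, L, mu = 10, 4, 9, 1.0
    width, depth = floor_relu_size(d, N, L)          # (33, 5763)
    print(width, depth)
    print(lipschitz_bound(mu, d, N, L))              # 3*sqrt(10)/64 ≈ 0.1483
    # For omega(r) = mu*r, the general bound reduces to the Lipschitz one:
    # sqrt(d)*r + 2*sqrt(d)*r = 3*sqrt(d)*N^(-sqrt(L)).
    print(modulus_bound(lambda r: mu * r, d, N, L))  # same value ≈ 0.1483
```

Note how the exponent −√(L) makes the error decay super-polynomially in the depth parameter L for any fixed width parameter N ≥ 2, which is the sense in which the rate is "exponential".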

Related Research

Neural Network Approximation: Three Hidden Layers Are Enough (10/25/2020)
Optimal Approximation Rate of ReLU Networks in terms of Width and Depth (02/28/2021)
Deep Neural Networks with ReLU-Sine-Exponential Activations Break Curse of Dimensionality on Hölder Class (02/28/2021)
On Enhancing Expressive Power via Compositions of Single Fixed-Size ReLU Network (01/29/2023)
Deep Operator Network Approximation Rates for Lipschitz Operators (07/19/2023)
Deep Network Approximation Characterized by Number of Neurons (06/13/2019)
ReLU Network Approximation in Terms of Intrinsic Parameters (11/15/2021)
