ReLU Network Approximation in Terms of Intrinsic Parameters

11/15/2021

by Zuowei Shen, et al.

This paper studies the approximation error of ReLU networks in terms of the number of intrinsic parameters (i.e., those depending on the target function f). First, we prove by construction that, for any Lipschitz continuous function f on [0,1]^d with a Lipschitz constant λ>0, a ReLU network with n+2 intrinsic parameters can approximate f with an exponentially small error 5λ√d · 2^{-n} measured in the L^p-norm for p ∈ [1,∞). More generally, for an arbitrary continuous function f on [0,1]^d with a modulus of continuity ω_f(·), the approximation error is ω_f(√d · 2^{-n}) + 2^{-n+2} · ω_f(√d). Next, we extend these two results from the L^p-norm to the L^∞-norm at a price of 3^d · n + 2 intrinsic parameters. Finally, by using a high-precision binary representation and the bit extraction technique via a fixed ReLU network independent of the target function, we design, theoretically, a ReLU network with only three intrinsic parameters to approximate Hölder continuous functions with an arbitrarily small error.
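The 2^{-n} rate above can be illustrated without the paper's network construction: a toy piecewise-constant approximation on 2^n uniform pieces already achieves error at most λ·2^{-(n+1)} for a λ-Lipschitz function in one dimension, so the error halves each time n grows by one. The sketch below (the function sin, with λ = 1 on [0,1], and all helper names are illustrative, not from the paper) checks this numerically:

```python
import math

def piecewise_constant_approx(f, n):
    # Split [0,1] into 2^n uniform pieces; use the value of f at each midpoint.
    m = 2 ** n
    vals = [f((k + 0.5) / m) for k in range(m)]
    return lambda x: vals[min(int(x * m), m - 1)]

def max_error(f, g, samples=10_000):
    # Sampled sup-norm error on [0,1].
    return max(abs(f(x) - g(x)) for x in (i / samples for i in range(samples + 1)))

lam = 1.0  # Lipschitz constant of sin on [0,1]
for n in range(1, 8):
    g = piecewise_constant_approx(math.sin, n)
    err = max_error(math.sin, g)
    # Lipschitz bound: |f(x) - f(midpoint)| <= lam * (interval width)/2 = lam * 2^{-(n+1)}
    assert err <= lam * 2 ** -(n + 1) + 1e-9
```

The paper's point is that a ReLU network can realize this exponential decay while charging only n+2 parameters to the target function itself; the remaining architecture is fixed in advance.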


research · 02/28/2021
Optimal Approximation Rate of ReLU Networks in terms of Width and Depth
This paper concentrates on the approximation power of deep feed-forward ...

research · 01/29/2023
On Enhancing Expressive Power via Compositions of Single Fixed-Size ReLU Network
This paper studies the expressive power of deep neural networks from the...

research · 06/22/2020
Deep Network Approximation with Discrepancy Being Reciprocal of Width to Power of Depth
A new network with super approximation power is introduced. This network...

research · 06/13/2019
Deep Network Approximation Characterized by Number of Neurons
This paper quantitatively characterizes the approximation power of deep ...

research · 09/28/2020
Learning Deep ReLU Networks Is Fixed-Parameter Tractable
We consider the problem of learning an unknown ReLU network with respect...

research · 07/25/2023
Differential approximation of the Gaussian by short cosine sums with exponential error decay
In this paper we propose a method to approximate the Gaussian function o...

research · 05/13/2020
The effect of Target Normalization and Momentum on Dying ReLU
Optimizing parameters with momentum, normalizing data values, and using ...
