Function approximation by deep neural networks with parameters {0, ±1/2, ±1, 2}

03/15/2021
by Aleksandr Beknazaryan et al.

In this paper it is shown that C_β-smooth functions can be approximated by neural networks whose parameters are restricted to the set {0, ±1/2, ±1, 2}. The depth, width, and number of active parameters of the constructed networks have, up to a logarithmic factor, the same dependence on the approximation error as networks with parameters in [-1,1]. In particular, nonparametric regression estimation with the constructed networks attains the same convergence rate as with sparse networks with parameters in [-1,1].
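The parameter set {0, ±1/2, ±1, 2} can be made concrete with a small sketch. The following is an illustrative example, not the paper's construction: it simply maps real-valued weights onto the restricted set by nearest-element rounding, to show what "parameters in {0, ±1/2, ±1, 2}" means for a weight matrix. The function names and the rounding rule are assumptions for illustration only.

```python
# Illustrative sketch (not the paper's construction): the paper builds networks
# whose weights lie in {0, +-1/2, +-1, 2} directly; here we only show how a
# real-valued weight matrix could be projected onto that set.

PARAM_SET = [0.0, 0.5, -0.5, 1.0, -1.0, 2.0]

def quantize(w):
    """Map a single weight to the closest element of PARAM_SET."""
    return min(PARAM_SET, key=lambda p: abs(p - w))

def quantize_layer(weights):
    """Quantize a 2D weight matrix (list of lists) element-wise."""
    return [[quantize(w) for w in row] for row in weights]

# Example: each entry snaps to its nearest allowed parameter.
print(quantize_layer([[0.1, 0.7], [1.6, -0.9]]))
# -> [[0.0, 0.5], [2.0, -1.0]]
```

Note that such a projection alone does not control the approximation error; the point of the paper is that networks built from this fixed parameter set can match the approximation power of networks with parameters in [-1,1], up to a logarithmic factor in depth, width, and active-parameter count.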


