Rates of Approximation by ReLU Shallow Neural Networks

07/24/2023
by Tong Mao, et al.

Neural networks activated by the rectified linear unit (ReLU) play a central role in the recent development of deep learning. The topic of approximating functions from Hölder spaces by these networks is crucial for understanding the efficiency of the induced learning algorithms. Although the topic has been well investigated in the setting of deep neural networks with many layers of hidden neurons, it remains open for shallow networks with only one hidden layer. In this paper, we provide rates of uniform approximation by these networks. We show that ReLU shallow neural networks with m hidden neurons can uniformly approximate functions from the Hölder space W_∞^r([-1, 1]^d) at the rate O((log m)^(1/2+d) m^(-(r/d)·((d+2)/(d+4)))) when r < d/2 + 2. This rate is very close to the optimal one O(m^(-r/d)), in the sense that (d+2)/(d+4) is close to 1 when the dimension d is large.
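The statement concerns uniform (sup-norm) approximation of smooth functions by networks with a single hidden layer of m ReLU units. As a purely illustrative sketch (assuming Python with NumPy; random hidden weights with a least-squares fit of the output layer, which is not the construction analyzed in the paper), the snippet below fits such networks of increasing width m to a smooth target on [-1, 1] and prints the uniform error on a grid:

```python
# Illustrative sketch only: approximate a smooth target on [-1, 1] with a
# one-hidden-layer ReLU network of m neurons. Hidden weights and biases are
# sampled at random and only the output layer is fitted by least squares.
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def fit_shallow_relu(x, y, m, rng):
    """Fit f(x) ~ sum_k c_k * relu(w_k * x + b_k) with random (w_k, b_k)."""
    w = rng.uniform(-1.0, 1.0, size=m)                 # hidden-layer weights
    b = rng.uniform(-1.0, 1.0, size=m)                 # hidden-layer biases
    features = relu(np.outer(x, w) + b)                # shape (n_samples, m)
    c, *_ = np.linalg.lstsq(features, y, rcond=None)   # output-layer weights
    return lambda t: relu(np.outer(t, w) + b) @ c

rng = np.random.default_rng(0)
x = np.linspace(-1.0, 1.0, 2000)
target = np.cos(np.pi * x)                             # a smooth (Hölder) target

for m in (8, 32, 128, 512):
    net = fit_shallow_relu(x, target, m, rng)
    err = np.max(np.abs(net(x) - target))              # sup-norm error on the grid
    print(f"m = {m:4d}   sup-error ~ {err:.2e}")
```

On a typical run the printed sup-norm error shrinks as m grows; the rate at which such an error can decrease in m for Hölder functions on [-1, 1]^d is exactly what the paper quantifies.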


Related research

04/04/2023
Optimal rates of approximation by shallow ReLU^k neural networks and applications to nonparametric regression
We study the approximation capacity of some variation spaces correspondi...

10/13/2016
Why Deep Neural Networks for Function Approximation?
Recently there has been much interest in understanding why deep neural n...

02/15/2019
Efficient Deep Learning of GMMs
We show that a collection of Gaussian mixture models (GMMs) in R^n can b...

06/18/2022
Piecewise Linear Neural Networks and Deep Learning
As a powerful modelling method, PieceWise Linear Neural Networks (PWLNNs...

07/28/2023
Optimal Approximation of Zonoids and Uniform Approximation by Shallow Neural Networks
We study the following two related problems. The first is to determine t...

11/03/2021
Regularization by Misclassification in ReLU Neural Networks
We study the implicit bias of ReLU neural networks trained by a variant ...
