High-Order Approximation Rates for Neural Networks with ReLU^k Activation Functions

12/14/2020
by   Jonathan W. Siegel, et al.

We study the approximation properties of shallow neural networks (NN) whose activation function is a power of the rectified linear unit (ReLU^k). Specifically, we consider how the approximation rate depends on the dimension and on the smoothness of the function to be approximated. Like the finite element method, such networks represent piecewise polynomial functions; however, we show that for sufficiently smooth functions, the approximation properties of shallow ReLU^k networks are much better than those of finite elements or wavelets, and that they overcome the curse of dimensionality more effectively than the sparse grid method. Specifically, for a sufficiently smooth function f, there exists a ReLU^k-NN with n neurons which approximates f in L^2([0,1]^d) with O(n^-(k+1)log(n)) error. Finally, we prove lower bounds showing that the approximation rates attained are optimal under the given assumptions.
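The abstract's central object, a shallow ReLU^k network, is just a linear combination of ridge functions max(0, w·x + b)^k. The following minimal sketch (not the paper's construction; all parameter choices here are illustrative assumptions) fits such a network to a smooth 1-D function by sampling the inner weights and biases at random and solving a least-squares problem for the outer coefficients, then reports the discrete L^2([0,1]) error:

```python
import numpy as np

def relu_k(x, k):
    """ReLU^k activation: max(0, x)^k."""
    return np.maximum(x, 0.0) ** k

def fit_shallow_relu_k(f, n_neurons, k=2, n_samples=2000, seed=0):
    """Fit the outer coefficients of an n-neuron ReLU^k network to f on [0, 1].

    Inner parameters are sampled at random (an illustrative choice, not the
    paper's construction); outer coefficients come from least squares.
    """
    rng = np.random.default_rng(seed)
    x = np.linspace(0.0, 1.0, n_samples)
    # Random inner parameters: directions w in {-1, +1}, biases b in [-1, 1].
    w = rng.choice([-1.0, 1.0], size=n_neurons)
    b = rng.uniform(-1.0, 1.0, size=n_neurons)
    features = relu_k(np.outer(x, w) + b, k)     # shape (n_samples, n_neurons)
    coef, *_ = np.linalg.lstsq(features, f(x), rcond=None)
    residual = features @ coef - f(x)
    return np.sqrt(np.mean(residual ** 2))       # discrete L^2([0,1]) error

f = lambda x: np.sin(2.0 * np.pi * x)
for n in (8, 32, 128):
    print(f"n = {n:4d}  L2 error = {fit_shallow_relu_k(f, n):.2e}")
```

Each neuron contributes a truncated power function, so the fitted model is a piecewise polynomial of degree k with breakpoints at -b/w, which is the finite element analogy the abstract draws; the error should shrink as n grows, though this random-feature fit makes no claim to achieve the O(n^-(k+1)log(n)) rate proved in the paper.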


Related research

06/28/2021 - Sharp Lower Bounds on the Approximation Rate of Shallow Neural Networks
We consider the approximation rates of shallow neural networks with resp...

04/24/2022 - Piecewise-Linear Activations or Analytic Activation Functions: Which Produce More Expressive Neural Networks?
Many currently available universal approximation theorems affirm that de...

01/30/2023 - Optimal Approximation Complexity of High-Dimensional Functions with Neural Networks
We investigate properties of neural networks that use both ReLU and x^2 ...

05/22/2023 - DeepBern-Nets: Taming the Complexity of Certifying Neural Networks using Bernstein Polynomial Activations and Precise Bound Propagation
Formal certification of Neural Networks (NNs) is crucial for ensuring th...

12/31/2022 - Smooth Mathematical Function from Compact Neural Networks
This is paper for the smooth function approximation by neural networks (...

12/28/2020 - Neural Network Approximation
Neural Networks (NNs) are the method of choice for building learning alg...

07/30/2020 - Approximation of Smoothness Classes by Deep ReLU Networks
We consider approximation rates of sparsely connected deep rectified lin...
