Optimal Approximation Complexity of High-Dimensional Functions with Neural Networks

01/30/2023
by Vincent P. H. Goverse, et al.

We investigate properties of neural networks that use both ReLU and x^2 as activation functions. Building on previous results, we show that both analytic functions and functions in Sobolev spaces can be approximated by such networks of constant depth to arbitrary accuracy, at rates of optimal order among all nonlinear approximators, including standard ReLU networks. We then show how to exploit low local dimensionality in some contexts to overcome the curse of dimensionality, obtaining approximation rates that are optimal for unknown lower-dimensional subspaces.
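
To make the kind of architecture discussed in the abstract concrete, below is a minimal PyTorch sketch of a constant-depth network whose hidden units are split between ReLU and x^2 activations. The class names (ReluSquareBlock, ConstantDepthNet), the even split of units between the two activations, the layer widths, and the regression-style training loop are all illustrative assumptions made for this sketch; it does not reproduce the paper's construction, only the idea of combining the two activation functions at fixed depth.

```python
import torch
import torch.nn as nn

class ReluSquareBlock(nn.Module):
    """One hidden layer whose units are split between ReLU and x^2 activations.

    Illustrative sketch only: the even split of units and the layer width are
    assumptions, not the construction from the paper.
    """
    def __init__(self, in_dim: int, width: int):
        super().__init__()
        self.linear = nn.Linear(in_dim, width)
        self.split = width // 2  # first half of the units use ReLU, second half x^2

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        z = self.linear(x)
        relu_part = torch.relu(z[..., :self.split])
        square_part = z[..., self.split:] ** 2
        return torch.cat([relu_part, square_part], dim=-1)

class ConstantDepthNet(nn.Module):
    """A fixed (constant) number of ReLU/x^2 blocks followed by a linear readout."""
    def __init__(self, in_dim: int, width: int, depth: int = 3, out_dim: int = 1):
        super().__init__()
        blocks = [ReluSquareBlock(in_dim, width)]
        blocks += [ReluSquareBlock(width, width) for _ in range(depth - 1)]
        self.blocks = nn.ModuleList(blocks)
        self.readout = nn.Linear(width, out_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        for block in self.blocks:
            x = block(x)
        return self.readout(x)

if __name__ == "__main__":
    # Toy usage: fit a smooth target f(x) = sin(||x||^2) on [0, 1]^d by least squares.
    d, n = 8, 4096
    x = torch.rand(n, d)
    y = torch.sin((x ** 2).sum(dim=1, keepdim=True))
    model = ConstantDepthNet(in_dim=d, width=128)
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    for step in range(2000):
        opt.zero_grad()
        loss = torch.mean((model(x) - y) ** 2)
        loss.backward()
        opt.step()
    print(f"final MSE: {loss.item():.3e}")
```

Note that the approximation results in the paper concern what such networks can represent; the gradient-based fitting above is only a convenient way to exercise the architecture, not part of the theory.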

Related research

12/09/2019
Efficient approximation of high-dimensional functions with deep neural networks
In this paper, we develop an approximation theory for deep neural networ...

12/14/2020
High-Order Approximation Rates for Neural Networks with ReLU^k Activation Functions
We study the approximation properties of shallow neural networks (NN) wi...

08/06/2020
ReLU nets adapt to intrinsic dimensionality beyond the target domain
We study the approximation of two-layer compositions f(x) = g(ϕ(x)) via ...

05/15/2021
Universality and Optimality of Structured Deep Kernel Networks
Kernel based methods yield approximation models that are flexible, effic...

01/29/2021
Optimal Approximation Rates and Metric Entropy of ReLU^k and Cosine Networks
This article addresses several fundamental issues associated with the ap...

02/26/2019
Nonlinear Approximation via Compositions
We study the approximation efficiency of function compositions in nonlin...

06/22/2019
The phase diagram of approximation rates for deep neural networks
We explore the phase diagram of approximation rates for deep neural netw...
