Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations

08/09/2017
by Boris Hanin, et al.

This article concerns the expressive power of depth in neural nets with ReLU activations and bounded width. We are particularly interested in the following questions: what is the minimal width w_min(d) so that ReLU nets of width w_min(d) (and arbitrary depth) can approximate any continuous function on the unit cube [0,1]^d arbitrarily well? For ReLU nets near this minimal width, what can one say about the depth necessary to approximate a given function? We obtain an essentially complete answer to these questions for convex functions. Our approach is based on the observation that, due to the convexity of the ReLU activation, ReLU nets are particularly well-suited for representing convex functions. In particular, we prove that ReLU nets with width d+1 can approximate any continuous convex function of d variables arbitrarily well. Moreover, when approximating convex, piecewise affine functions by such nets, we obtain matching upper and lower bounds on the required depth, proving that our construction is essentially optimal. These results then give quantitative depth estimates for the rate of approximation of any continuous scalar function on the d-dimensional cube [0,1]^d by ReLU nets with width d+3.
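The intuition behind the width-(d+1) result can be illustrated with a small sketch (an illustration of the general idea, not the paper's exact construction): a convex, piecewise affine function f(x) = max_i (w_i · x + b_i) can be computed by a deep, narrow network that carries the d input coordinates through every layer and updates a single running maximum using the ReLU identity max(u, v) = u + relu(v − u). On the cube [0,1]^d the inputs are nonnegative, so relu passes them through unchanged, and the hidden state needs width only about d + 1.

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

def narrow_relu_max(x, W, b):
    """Evaluate f(x) = max_i (W[i] . x + b[i]) depth-wise, one piece per layer.

    Each "layer" keeps the input x (nonnegative on [0,1]^d, so ReLU is the
    identity on it) plus one running-maximum unit: width ~ d + 1.
    Illustrative sketch only; the paper's construction differs in detail.
    """
    running = W[0] @ x + b[0]                      # first affine piece
    for i in range(1, len(b)):
        piece = W[i] @ x + b[i]                    # next affine piece
        running = running + relu(piece - running)  # max(running, piece)
    return running

# Three affine pieces of a convex, piecewise affine function on [0,1]^2
W = np.array([[1.0, 0.0],
              [0.0, 1.0],
              [-1.0, -1.0]])
b = np.array([0.0, 0.1, 1.0])

x = np.array([0.3, 0.7])
print(narrow_relu_max(x, W, b))  # equals max(0.3, 0.8, 0.0) = 0.8
```

The depth of this network grows linearly with the number of affine pieces, which is exactly the regime where the paper's matching upper and lower depth bounds apply.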


Related research

10/31/2017 · Approximating Continuous Functions by ReLU Nets of Minimal Width
02/28/2021 · Optimal Approximation Rate of ReLU Networks in terms of Width and Depth
06/13/2019 · Deep Network Approximation Characterized by Number of Neurons
09/23/2021 · Arbitrary-Depth Universal Approximation Theorems for Operator Neural Networks
09/26/2018 · Deep Neural Networks for Estimation and Inference: Application to Causal Effects and Other Semiparametric Estimands
08/06/2020 · ReLU nets adapt to intrinsic dimensionality beyond the target domain
02/28/2017 · Deep Semi-Random Features for Nonlinear Function Approximation
