ReLU nets adapt to intrinsic dimensionality beyond the target domain

08/06/2020
by Alexander Cloninger, et al.

We study the approximation of two-layer compositions f(x) = g(ϕ(x)) via deep ReLU networks, where ϕ is a nonlinear, geometrically intuitive, and dimensionality-reducing feature map. We focus on two complementary choices of ϕ that are intuitive and appear frequently in the statistical literature. The resulting approximation rates are near optimal and adapt to intrinsic notions of complexity, significantly extending a series of recent works on approximating targets over low-dimensional manifolds. Specifically, we show that ReLU nets can express functions that depend on the input only through its orthogonal projection onto a low-dimensional manifold, with the same efficiency as if the target domain were the manifold itself. This implies that approximation via ReLU nets is faithful to an intrinsic dimensionality governed by the target f itself, rather than by the dimensionality of the approximation domain. As an application of our approximation bounds, we study empirical risk minimization over a space of sparsely constrained ReLU nets, under the assumption that the conditional expectation satisfies one of the proposed models. We obtain near-optimal estimation guarantees in regression and classification problems for which, to the best of our knowledge, no efficient estimator had previously been developed.
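As a rough illustration of the setting (not the paper's construction), the following Python sketch builds a target f(x) = g(ϕ(x)) in which ϕ orthogonally projects R^D onto a random d-dimensional subspace, a simple special case of the low-dimensional-manifold setting, and fits a small deep ReLU network by empirical risk minimization with squared loss. The architecture, optimizer, and sample size are illustrative assumptions, and the sparsity constraint used in the paper's ERM analysis is omitted.

```python
# Minimal sketch of the composition model f(x) = g(phi(x)), where phi is an
# orthogonal projection onto a low-dimensional subspace (a simple special
# case of the manifold setting). All widths, depths, and hyperparameters
# below are illustrative assumptions, not the paper's construction.
import numpy as np
import torch
import torch.nn as nn

D, d, n = 50, 3, 2000                 # ambient dim, intrinsic dim, sample size
rng = np.random.default_rng(0)

# phi: orthogonal projection onto a random d-dimensional subspace of R^D,
# represented by coordinates in an orthonormal basis Q (D x d).
Q, _ = np.linalg.qr(rng.standard_normal((D, d)))

def phi(X):
    return X @ Q                      # n x d projected coordinates

def g(Z):
    return np.sin(Z).sum(axis=1)      # smooth link acting on d coordinates only

X = rng.standard_normal((n, D)).astype(np.float32)
y = g(phi(X)).astype(np.float32)      # f depends on x only through phi(x)

# Empirical risk minimization with a small deep ReLU network; the paper's
# sparsity constraint on the network weights is omitted here for brevity.
net = nn.Sequential(
    nn.Linear(D, 64), nn.ReLU(),
    nn.Linear(64, 64), nn.ReLU(),
    nn.Linear(64, 1),
)
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
Xt = torch.from_numpy(X)
yt = torch.from_numpy(y).unsqueeze(1)
for step in range(2000):
    opt.zero_grad()
    loss = nn.functional.mse_loss(net(Xt), yt)
    loss.backward()
    opt.step()
print(f"final training MSE: {loss.item():.4f}")
```

Since the target is constant along the D − d directions orthogonal to the subspace, the abstract's claim is that the network's approximation and estimation rates are governed by the intrinsic dimension d rather than the ambient dimension D.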

research · 08/02/2019
Deep ReLU network approximation of functions on a manifold
Whereas recovery of the manifold from data is a well-studied topic, appr...

research · 01/30/2023
Optimal Approximation Complexity of High-Dimensional Functions with Neural Networks
We investigate properties of neural networks that use both ReLU and x^2 ...

research · 07/30/2020
Approximation of Smoothness Classes by Deep ReLU Networks
We consider approximation rates of sparsely connected deep rectified lin...

research · 08/09/2017
Universal Function Approximation by Deep Neural Nets with Bounded Width and ReLU Activations
This article concerns the expressive power of depth in neural nets with ...

research · 04/03/2019
Deep Neural Networks for Rotation-Invariance Approximation and Learning
Based on the tree architecture, the objective of this paper is to design...

research · 10/26/2018
Size-Noise Tradeoffs in Generative Networks
This paper investigates the ability of generative networks to convert th...
