On the Approximation Power of Two-Layer Networks of Random ReLUs

02/03/2021
by Daniel Hsu, et al.

This paper considers the following question: how well can depth-two ReLU networks with randomly initialized bottom-level weights represent smooth functions? We give near-matching upper and lower bounds for L_2-approximation in terms of the Lipschitz constant, the desired accuracy, and the dimension of the problem, as well as similar results in terms of Sobolev norms. Our positive results employ tools from harmonic analysis and ridgelet representation theory, while our lower bounds are based on (robust versions of) dimensionality arguments.
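The setting studied in the abstract can be made concrete with a small numerical sketch: fix random bottom-level ReLU weights, fit only the top-level linear coefficients, and measure the empirical L_2 error against a Lipschitz target. The sketch below is illustrative only; the input dimension, hidden width, sample sizes, and the particular target function are assumptions for the example, and the least-squares fit is not the paper's ridgelet-based construction.

import numpy as np

# Minimal sketch (not the paper's construction): approximate a Lipschitz
# target on the unit ball in R^d with a depth-two ReLU network whose
# bottom-level weights are random and fixed; only the top-level linear
# coefficients are fit, here by least squares. The reported quantity is an
# empirical estimate of the L_2 approximation error.

rng = np.random.default_rng(0)
d, k, n = 5, 2000, 20000  # input dimension, hidden width, sample size (illustrative)

def target(X):
    # a smooth, Lipschitz test function (illustrative choice)
    return np.sin(X @ np.ones(d) / np.sqrt(d))

def sample_unit_ball(m):
    # uniform samples from the unit ball: uniform direction, radius ~ U^(1/d)
    Z = rng.standard_normal((m, d))
    U = rng.uniform(size=(m, 1)) ** (1.0 / d)
    return Z / np.linalg.norm(Z, axis=1, keepdims=True) * U

# random bottom-level weights and biases, drawn once and never trained
W = rng.standard_normal((k, d)) / np.sqrt(d)
b = rng.uniform(-1.0, 1.0, size=k)

def features(X):
    # hidden-layer activations: ReLU(Wx + b)
    return np.maximum(X @ W.T + b, 0.0)

# fit only the top-level coefficients by least squares
X_train = sample_unit_ball(n)
coef, *_ = np.linalg.lstsq(features(X_train), target(X_train), rcond=None)

# empirical L_2 error on fresh samples
X_test = sample_unit_ball(n)
err = np.sqrt(np.mean((features(X_test) @ coef - target(X_test)) ** 2))
print(f"empirical L_2 approximation error: {err:.4f}")

Increasing the hidden width k should drive the empirical error down for a fixed Lipschitz target, which is the qualitative behavior the paper's upper and lower bounds quantify.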


Related research

Linear approximability of two-layer neural networks: A comprehensive analysis based on spectral decay (08/10/2021)
Efficient uniform approximation using Random Vector Functional Link networks (06/30/2023)
On Sketching the q to p norms (06/17/2018)
Sharp Representation Theorems for ReLU Networks with Precise Dependence on Depth (06/07/2020)
Trajectory growth lower bounds for random sparse deep ReLU networks (11/25/2019)
Lower Bounds on the Depth of Integral ReLU Neural Networks via Lattice Polytopes (02/24/2023)
Comments on the Du-Kakade-Wang-Yang Lower Bounds (11/18/2019)
