Trajectory growth lower bounds for random sparse deep ReLU networks

11/25/2019
by Ilan Price, et al.

This paper considers the growth in length of one-dimensional trajectories as they pass through deep ReLU neural networks, which, among other things, is one measure of the expressivity of deep networks. We generalise existing results, providing an alternative, simpler method for lower bounding expected trajectory growth through random networks, valid for a more general class of weight distributions, including those of sparsely connected networks. We illustrate this approach by deriving bounds for sparse-Gaussian, sparse-uniform, and sparse-discrete-valued random nets. We prove that trajectory growth can remain exponential in depth under these new distributions, including their sparse variants, with the sparsity parameter appearing in the base of the exponent.
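To make the measured quantity concrete, here is a minimal numerical sketch of trajectory length growth through a random sparse ReLU network. It is an illustration, not the paper's code: the `sparse_gaussian` initialiser, the Bernoulli sparsity mask, the `gain` parameter, and the choice of a circle as the input trajectory are all assumptions made for this sketch.

```python
import numpy as np

def sparse_gaussian(shape, sparsity, gain, rng):
    """Illustrative sparse-Gaussian weights: i.i.d. N(0, gain^2/fan_in)
    entries, each zeroed independently with probability `sparsity`.
    (A hypothetical model for this sketch; the paper's exact scaling
    and sparsity model may differ.)"""
    fan_in = shape[1]
    w = rng.normal(0.0, gain / np.sqrt(fan_in), size=shape)
    return w * (rng.random(shape) >= sparsity)

def trajectory_length(points):
    """Arc length of a piecewise-linear trajectory given as an
    (n_points, dim) array of consecutive samples."""
    return np.linalg.norm(np.diff(points, axis=0), axis=1).sum()

def layerwise_lengths(width=200, depth=12, sparsity=0.5, gain=2.0,
                      n_points=2000, seed=0):
    """Push a discretised circle through a random sparse ReLU net and
    record the trajectory's length after every layer."""
    rng = np.random.default_rng(seed)
    t = np.linspace(0.0, 2.0 * np.pi, n_points)
    x = np.zeros((n_points, width))          # circle embedded in R^width
    x[:, 0], x[:, 1] = np.cos(t), np.sin(t)
    lengths = [trajectory_length(x)]
    for _ in range(depth):
        w = sparse_gaussian((width, width), sparsity, gain, rng)
        x = np.maximum(x @ w.T, 0.0)         # ReLU layer, biases omitted
        lengths.append(trajectory_length(x))
    return lengths

if __name__ == "__main__":
    for d, length in enumerate(layerwise_lengths()):
        print(f"layer {d:2d}: length = {length:.3g}")
```

With `gain` large enough, the recorded lengths grow roughly geometrically with depth, and increasing `sparsity` lowers the per-layer growth ratio, which is consistent with the abstract's statement that the sparsity parameter enters the base of the exponent.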

Related research

11/08/2017 · Lower bounds over Boolean inputs for deep neural networks with ReLU gates
Motivated by the resurgence of neural networks in being able to solve co...

02/03/2021 · On the Approximation Power of Two-Layer Networks of Random ReLUs
This paper considers the following question: how well can depth-two ReLU...

06/16/2016 · On the Expressive Power of Deep Neural Networks
We propose a new approach to the problem of neural network expressivity,...

07/17/2020 · Sparse-grid sampling recovery and deep ReLU neural networks in high-dimensional approximation
We investigate approximations of functions from the Hölder-Zygmund space...

09/26/2018 · Deep Neural Networks for Estimation and Inference: Application to Causal Effects and Other Semiparametric Estimands
We study deep neural networks and their use in semiparametric inference....

02/21/2021 · Deep ReLU Networks Preserve Expected Length
Assessing the complexity of functions computed by a neural network helps...

10/26/2018 · Size-Noise Tradeoffs in Generative Networks
This paper investigates the ability of generative networks to convert th...