Benefits of depth in neural networks

02/14/2016
by Matus Telgarsky

For any positive integer k, there exist neural networks with Θ(k^3) layers, Θ(1) nodes per layer, and Θ(1) distinct parameters which cannot be approximated by networks with O(k) layers unless those networks are exponentially large: they must possess Ω(2^k) nodes. This result is proved here for a class of nodes termed "semi-algebraic gates", which includes the common choices of ReLU, maximum, indicator, and piecewise polynomial functions. It therefore establishes benefits of depth not just for standard networks with ReLU gates, but also for convolutional networks with ReLU and maximization gates, sum-product networks, and boosted decision trees (in this last case with a stronger separation: Ω(2^(k^3)) total tree nodes are required).
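
The separation is driven by a concrete construction: composing a simple two-ReLU "triangle" map k times gives a network with Θ(k) layers and Θ(1) nodes per layer whose output crosses the level 1/2 roughly 2^k times, whereas any function computed by a small O(k)-layer semi-algebraic network has far fewer linear pieces. Below is a minimal NumPy sketch of the deep side of that construction; the code and its function names are illustrative, written for this summary rather than taken from the paper.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def hat(x):
    # Two-ReLU "triangle" map on [0, 1]: rises 0 -> 1 on [0, 1/2],
    # falls 1 -> 0 on [1/2, 1].
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def deep_net(x, k):
    # Compose the triangle map k times: a ReLU network with Theta(k)
    # layers, Theta(1) nodes per layer, and Theta(1) distinct parameters.
    for _ in range(k):
        x = hat(x)
    return x

def crossings(y, level=0.5):
    # Count sign changes of y - level; each crossing certifies a distinct
    # linear piece, so this lower-bounds the number of pieces.
    d = y - level
    return int(np.count_nonzero(d[:-1] * d[1:] < 0))

xs = np.linspace(0.0, 1.0, 1 << 20)
for k in (1, 2, 4, 8):
    print(k, crossings(deep_net(xs, k)))  # prints 2, 4, 16, 256: growth like 2^k
```

Each additional composed layer doubles the number of crossings, and this exponential growth in oscillation is exactly what a shallow network of sub-exponential size cannot match.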


Related research

09/27/2015 · Representation Benefits of Deep Feedforward Networks
This note provides a family of classification problems, indexed by a pos...

11/08/2017 · Lower bounds over Boolean inputs for deep neural networks with ReLU gates
Motivated by the resurgence of neural networks in being able to solve co...

10/10/2018 · Random ReLU Features: Universality, Approximation, and Composition
We propose random ReLU features models in this work. Its motivation is r...

11/04/2016 · Understanding Deep Neural Networks with Rectified Linear Units
In this paper we investigate the family of functions representable by de...

02/28/2019 · A lattice-based approach to the expressivity of deep ReLU neural networks
We present new families of continuous piecewise linear (CPWL) functions ...

05/16/2023 · Unwrapping All ReLU Networks
Deep ReLU Networks can be decomposed into a collection of linear models,...

05/18/2018 · Two geometric input transformation methods for fast online reinforcement learning with neural nets
We apply neural nets with ReLU gates in online reinforcement learning. O...
