The Role of Depth, Width, and Activation Complexity in the Number of Linear Regions of Neural Networks

06/17/2022
by Alexis Goujon, et al.

Many feedforward neural networks generate continuous and piecewise-linear (CPWL) mappings. Specifically, they partition the input domain into regions on which the mapping is affine. The number of these so-called linear regions offers a natural metric to characterize the expressiveness of CPWL mappings. Although the precise determination of this quantity is often out of reach, bounds have been proposed for specific architectures, including the well-known ReLU and Maxout networks. In this work, we propose a more general perspective and provide precise bounds on the maximal number of linear regions of CPWL networks based on three sources of expressiveness: depth, width, and activation complexity. Our estimates rely on the combinatorial structure of convex partitions and highlight the distinctive role of depth which, on its own, can increase the number of regions exponentially. We then introduce a complementary stochastic framework to estimate the average number of linear regions produced by a CPWL network architecture. Under reasonable assumptions, the expected density of linear regions along any 1D path is bounded by the product of depth, width, and a measure of activation complexity (up to a scaling factor). In this regime, the three sources of expressiveness thus play identical roles, and the exponential growth with depth is no longer observed.
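As a hands-on illustration of the quantity studied here, the Python sketch below counts the linear regions of a random ReLU network (a standard CPWL mapping) along a 1D segment by tracking changes in the activation pattern between densely sampled points; each pattern change marks the crossing of a region boundary. The layer widths, weight distribution, and sampling density are illustrative assumptions, not values from the paper, and dense sampling can miss very short regions, so the count is a lower bound.

import numpy as np

rng = np.random.default_rng(0)

def random_relu_net(widths):
    # One (weight, bias) pair per hidden layer; 1/sqrt(fan-in) scaling keeps
    # pre-activations at a comparable magnitude across layers.
    return [(rng.standard_normal((m, n)) / np.sqrt(n), rng.standard_normal(m))
            for n, m in zip(widths[:-1], widths[1:])]

def activation_pattern(hidden_layers, x):
    # The sign pattern of all hidden pre-activations identifies the linear
    # region containing x. A final linear output layer is omitted, since it
    # creates no new regions.
    pattern = []
    for W, b in hidden_layers:
        z = W @ x + b
        pattern.append(z > 0)
        x = np.maximum(z, 0)
    return np.concatenate(pattern)

widths = [8, 16, 16, 16]               # illustrative: input dim 8, three hidden layers
layers = random_relu_net(widths)
a, b = rng.standard_normal(8), rng.standard_normal(8)
ts = np.linspace(0.0, 1.0, 20000)      # dense sampling of the segment from a to b

patterns = [activation_pattern(layers, (1 - t) * a + t * b) for t in ts]
crossings = sum(not np.array_equal(p, q) for p, q in zip(patterns, patterns[1:]))
print(f"at least {crossings + 1} linear regions along the path")

Dividing the printed count by the length of the segment gives an empirical density of regions along a 1D path, the quantity whose expectation the stochastic framework above bounds by the product of depth, width, and activation complexity.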


