Complexity of Linear Regions in Deep Networks

01/25/2019
by Boris Hanin, et al.

It is well-known that the expressivity of a neural network depends on its architecture, with deeper networks expressing more complex functions. For networks that compute piecewise linear functions, such as those with ReLU activations, the number of distinct linear regions is a natural measure of expressivity. It is possible to construct networks whose number of linear regions grows exponentially with depth, as well as networks with merely a single region; it is not clear where within this range most networks fall in practice, either before or after training. In this paper, we provide a mathematical framework to count the number of linear regions of a piecewise linear network and to measure the volume of the boundaries between these regions. In particular, we prove that for networks at initialization, the average number of regions along any one-dimensional subspace grows linearly in the total number of neurons, far below the exponential upper bound. We also find that the average distance to the nearest region boundary at initialization scales like the inverse of the number of neurons. Our theory suggests that, even after training, the number of linear regions remains far below exponential, an intuition that matches our empirical observations. We conclude that the practical expressivity of neural networks is likely far below the theoretical maximum, and that this gap can be quantified.
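
The counting experiment the abstract describes can be sketched in a few lines of NumPy. The following is a minimal illustration, not the authors' code: it draws a small ReLU network at random (He-style initialization is an assumption here) and estimates the number of linear regions crossed by a line segment by detecting changes in the neuron activation pattern between closely spaced sample points. The function name count_regions_along_line and all widths and hyperparameters are invented for this sketch.

```python
import numpy as np

def count_regions_along_line(widths, x0, x1, n_samples=10_000, seed=0):
    """Estimate how many linear regions of a random ReLU net the segment
    from x0 to x1 crosses, by tracking changes in the activation pattern
    at densely sampled points (a pattern change marks a region boundary)."""
    rng = np.random.default_rng(seed)

    # He-style random initialization (an assumption; the paper analyzes
    # networks at standard random initialization).
    params, fan_in = [], x0.shape[0]
    for width in widths:
        W = rng.normal(0.0, np.sqrt(2.0 / fan_in), size=(width, fan_in))
        params.append((W, np.zeros(width)))
        fan_in = width

    def pattern(x):
        # Activation pattern: which neurons are "on" at input x.
        bits, h = [], x
        for W, b in params:
            pre = W @ h + b
            bits.append(tuple(pre > 0))
            h = np.maximum(pre, 0.0)
        return tuple(bits)

    ts = np.linspace(0.0, 1.0, n_samples)
    pats = [pattern(x0 + t * (x1 - x0)) for t in ts]
    # Regions along the segment = pattern changes between neighbors + 1.
    return 1 + sum(p != q for p, q in zip(pats, pats[1:]))

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    x0, x1 = rng.normal(size=10), rng.normal(size=10)
    # Per the paper's result, expect a count that scales linearly with
    # the total number of neurons (here 40), not exponentially in depth.
    print(count_regions_along_line([20, 20], x0, x1))
```

Because sampling can step over regions narrower than the sample spacing, this count is a lower bound on the true number of regions along the segment; denser sampling tightens it.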

Related research

06/03/2019
Deep ReLU Networks Have Surprisingly Few Activation Patterns
The success of deep networks has been attributed in part to their expres...

05/16/2017
The power of deeper networks for expressing natural functions
It is well-known that neural networks are universal approximators, but t...

10/13/2022
Improved Bounds on Neural Complexity for Representing Piecewise Linear Functions
A deep neural network using rectified linear units represents a continuo...

07/14/2020
Bounding The Number of Linear Regions in Local Area for Neural Networks with ReLU Activations
The number of linear regions is one of the distinct properties of the ne...

07/01/2021
On the Expected Complexity of Maxout Networks
Learning with neural networks relies on the complexity of the representa...

06/17/2022
The Role of Depth, Width, and Activation Complexity in the Number of Linear Regions of Neural Networks
Many feedforward neural networks generate continuous and piecewise-linea...

05/27/2019
Expression of Fractals Through Neural Network Functions
To help understand the underlying mechanisms of neural networks (NNs), s...
