Improved Bounds on Neural Complexity for Representing Piecewise Linear Functions

10/13/2022
by Kuan-Lin Chen, et al.

A deep neural network using rectified linear units represents a continuous piecewise linear (CPWL) function and vice versa. Recent results in the literature estimated that the number of neurons needed to exactly represent any CPWL function grows exponentially with the number of pieces, or exponentially in the factorial of the number of distinct linear components. Moreover, such growth is further amplified linearly by the input dimension. These existing results suggest that exactly representing a CPWL function is expensive. In this paper, we propose much tighter bounds and establish a polynomial-time algorithm that finds a network satisfying these bounds for any given CPWL function. We prove that the number of hidden neurons required to exactly represent any CPWL function is at most quadratic in the number of pieces. In contrast to all previous results, this upper bound is invariant to the input dimension. Besides the number of pieces, we also study the number of distinct linear components in CPWL functions. When this number is also given, we prove that the quadratic complexity reduces to a bilinear one, which implies a lower neural complexity because the number of distinct linear components is never greater than the minimum number of pieces of a CPWL function. When the number of pieces is unknown, we prove that, in terms of the number of distinct linear components, the neural complexity of any CPWL function grows at most polynomially for low-dimensional inputs and factorially in the worst case, both of which significantly improve on existing results in the literature.
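To make the ReLU-CPWL correspondence concrete, here is a minimal sketch (all variable names are illustrative; this uses the standard identity max(a, b) = a + relu(b - a), not the construction from the paper) showing that the maximum of two affine pieces, a CPWL function with at most two pieces, is exactly representable by a network with a single hidden ReLU neuron:

```python
import numpy as np

def relu(z):
    return np.maximum(z, 0.0)

# Two affine pieces f1(x) = w1 @ x + b1 and f2(x) = w2 @ x + b2.
# Their pointwise maximum is a CPWL function with (at most) two pieces,
# exactly representable with one hidden ReLU neuron via the identity
# max(f1, f2) = f1 + relu(f2 - f1).
rng = np.random.default_rng(0)
w1, b1 = rng.standard_normal(3), rng.standard_normal()
w2, b2 = rng.standard_normal(3), rng.standard_normal()

def cpwl(x):
    return max(w1 @ x + b1, w2 @ x + b2)

def relu_net(x):
    # Affine skip term for the first piece, plus a single hidden
    # neuron computing relu((w2 - w1) @ x + (b2 - b1)).
    return (w1 @ x + b1) + relu((w2 - w1) @ x + (b2 - b1))

for _ in range(5):
    x = rng.standard_normal(3)
    assert np.isclose(cpwl(x), relu_net(x))
print("max of two affine pieces == ReLU network with one hidden neuron")
```

Nesting this identity, e.g. max(f1, f2, f3) = max(max(f1, f2), f3), yields exact ReLU representations of maxima of more affine pieces; the number of hidden neurons such constructions consume is exactly the kind of neural complexity whose bounds the paper tightens.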
