On minimal representations of shallow ReLU networks

08/12/2021
by   S. Dereich, et al.

The realization function of a shallow ReLU network is a continuous and piecewise affine function f:ℝ^d→ℝ, where the domain ℝ^d is partitioned by a set of n hyperplanes into cells on which f is affine. We show that the minimal representation of f uses either n, n+1 or n+2 neurons, and we characterize each of the three cases. In the particular case where the input dimension is one, minimal representations always use at most n+1 neurons, but in all higher-dimensional settings there are functions for which n+2 neurons are needed. We then show that the set of minimal networks representing f forms a C^∞-submanifold M, and we derive the dimension and the number of connected components of M. Additionally, we give a criterion on the hyperplanes that guarantees that all continuous, piecewise affine functions are realization functions of appropriate ReLU networks.
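To make the setting concrete, the following is a minimal sketch (not code from the paper) of the realization function of a shallow ReLU network in the one-dimensional case, f(x) = c + Σ_i w_i · max(a_i·x + b_i, 0). Each neuron with a_i ≠ 0 contributes one potential breakpoint at x = −b_i/a_i, so n neurons realize a continuous piecewise affine function with at most n kinks; all names and the example parameters below are illustrative.

```python
def relu(z: float) -> float:
    """Rectified linear unit: max(z, 0)."""
    return max(z, 0.0)

def shallow_relu(x, inner_weights, inner_biases, outer_weights, outer_bias):
    """Evaluate a shallow (one hidden layer) ReLU network at scalar input x.

    Realizes f(x) = outer_bias + sum_i outer_weights[i] * relu(a_i*x + b_i),
    a continuous piecewise affine function of x.
    """
    return outer_bias + sum(
        w * relu(a * x + b)
        for a, b, w in zip(inner_weights, inner_biases, outer_weights)
    )

# Example: |x| = relu(x) + relu(-x), a piecewise affine function with a
# single breakpoint (n = 1 hyperplane) represented here by 2 neurons.
abs_net = lambda x: shallow_relu(x, [1.0, -1.0], [0.0, 0.0], [1.0, 1.0], 0.0)
print(abs_net(-1.0))  # 1.0
print(abs_net(2.0))   # 2.0
```

The |x| example illustrates why minimality is subtle: the function has only one breakpoint, yet in this architecture (without a skip connection) it cannot be written with a single neuron, since one neuron yields a slope of zero on one side of its kink.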
