On minimal representations of shallow ReLU networks

08/12/2021
by S. Dereich, et al.

The realization function of a shallow ReLU network is a continuous, piecewise affine function f:ℝ^d→ℝ whose domain ℝ^d is partitioned by a set of n hyperplanes into cells on which f is affine. We show that the minimal representation of f uses either n, n+1, or n+2 neurons, and we characterize each of the three cases. In the particular case where the input dimension is one, minimal representations always use at most n+1 neurons, but in all higher-dimensional settings there are functions for which n+2 neurons are needed. We then show that the set of minimal networks representing f forms a C^∞-submanifold M, and we derive the dimension and the number of connected components of M. Additionally, we give a criterion on the hyperplanes that guarantees that all continuous, piecewise affine functions are realization functions of appropriate ReLU networks.
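To make the abstract's objects concrete, here is a minimal Python sketch (not code from the paper; the helper names relu and realization and the particular weights W, b, a, c are our own illustration). It evaluates the realization function f(x) = Σ_i a_i ReLU(w_i·x + b_i) + c of a shallow ReLU network and checks the one-dimensional claim on f(x) = |x|: that function has a single breakpoint (n = 1), yet needs n + 1 = 2 neurons, because a single-neuron network a·ReLU(wx + b) + c is constant on one side of its breakpoint.

```python
import numpy as np

def relu(z):
    # ReLU nonlinearity, applied componentwise.
    return np.maximum(z, 0.0)

def realization(x, W, b, a, c):
    # Realization function of a shallow ReLU network:
    # f(x) = a . relu(W x + b) + c, a continuous, piecewise affine map R^d -> R.
    return a @ relu(W @ x + b) + c

# Illustrative 1-d example: f(x) = |x| has one breakpoint (n = 1) but requires
# n + 1 = 2 neurons, since relu(x) + relu(-x) = |x|, while any single-neuron
# network has slope zero on one side of its breakpoint.
W = np.array([[1.0], [-1.0]])  # two neurons, input dimension d = 1
b = np.array([0.0, 0.0])
a = np.array([1.0, 1.0])
c = 0.0

for x in (-2.0, -0.5, 0.0, 1.5):
    assert np.isclose(realization(np.array([x]), W, b, a, c), abs(x))
print("relu(x) + relu(-x) realizes |x| with n + 1 = 2 neurons")
```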

Related research

03/19/2021
Landscape analysis for shallow ReLU neural networks: complete classification of critical points for affine target functions
In this paper, we analyze the landscape of the true loss of a ReLU neura...

07/20/2021
An Embedding of ReLU Networks and an Analysis of their Identifiability
Neural networks with the Rectified Linear Unit (ReLU) nonlinearity are d...

07/25/2023
Piecewise Linear Functions Representable with Infinite Width Shallow ReLU Neural Networks
This paper analyzes representations of continuous piecewise linear funct...

06/12/2023
Polyhedral Complex Extraction from ReLU Networks using Edge Subdivision
A neural network consisting of piecewise affine building blocks, such as...

03/21/2023
Sampling from a Gaussian distribution conditioned on the level set of a piecewise affine, continuous function
We consider how to use Hamiltonian Monte Carlo to sample from a distribu...

06/16/2019
A General Interpretation of Deep Learning by Affine Transform and Region Dividing without Mutual Interference
This paper mainly deals with the "black-box" problem of deep learning co...

10/13/2022
Improved Bounds on Neural Complexity for Representing Piecewise Linear Functions
A deep neural network using rectified linear units represents a continuo...
