Functional dimension of feedforward ReLU neural networks

09/08/2022
by J. Elisenda Grigsby, et al.

It is well-known that the parameterized family of functions representable by fully-connected feedforward neural networks with ReLU activation function is precisely the class of piecewise linear functions with finitely many pieces. It is less well-known that for every fixed architecture of ReLU neural network, the parameter space admits positive-dimensional spaces of symmetries, and hence the local functional dimension near any given parameter is lower than the parametric dimension. In this work we carefully define the notion of functional dimension, show that it is inhomogeneous across the parameter space of ReLU neural network functions, and continue an investigation, initiated in [14] and [5], into when the functional dimension achieves its theoretical maximum. We also study the quotient space and fibers of the realization map from parameter space to function space, supplying examples of fibers that are disconnected, fibers upon which functional dimension is non-constant, and fibers upon which the symmetry group acts non-transitively.
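The positive-dimensional symmetries mentioned above can be made concrete with a standard example (a sketch of the well-known positive-scaling symmetry, not code from the paper): since relu(c·x) = c·relu(x) for any c > 0, scaling a hidden neuron's incoming weights and bias by c while dividing its outgoing weights by c leaves the realized function unchanged, so distinct parameters yield the same function and the functional dimension drops below the parametric dimension.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def forward(params, x):
    """A small fully-connected two-layer ReLU network R^2 -> R (illustrative)."""
    W1, b1, W2, b2 = params
    return W2 @ relu(W1 @ x + b1) + b2

rng = np.random.default_rng(0)
W1 = rng.normal(size=(3, 2)); b1 = rng.normal(size=3)
W2 = rng.normal(size=(1, 3)); b2 = rng.normal(size=1)

# Rescale hidden neuron 0: incoming weights and bias by c, outgoing by 1/c.
c = 2.5
W1s, b1s, W2s = W1.copy(), b1.copy(), W2.copy()
W1s[0] *= c
b1s[0] *= c
W2s[:, 0] /= c

# The two (different) parameter settings realize the same function,
# exhibiting a one-parameter family of symmetries for each hidden neuron.
xs = rng.normal(size=(100, 2))
same = all(np.allclose(forward((W1, b1, W2, b2), x),
                       forward((W1s, b1s, W2s, b2), x)) for x in xs)
print(same)
```

Each hidden neuron contributes such a one-parameter scaling family, which (together with permutations of neurons) is one source of the symmetry spaces the abstract refers to.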


