On transversality of bent hyperplane arrangements and the topological expressiveness of ReLU neural networks

08/20/2020
by   J. Elisenda Grigsby, et al.

Let F:R^n -> R be a feedforward ReLU neural network. It is well-known that for any choice of parameters, F is continuous and piecewise (affine) linear. We lay some foundations for a systematic investigation of how the architecture of F impacts the geometry and topology of its possible decision regions for binary classification tasks. Following the classical progression for smooth functions in differential topology, we first define the notion of a generic, transversal ReLU neural network and show that almost all ReLU networks are generic and transversal. We then define a partially-oriented linear 1-complex in the domain of F and identify properties of this complex that yield an obstruction to the existence of bounded connected components of a decision region. We use this obstruction to prove that a decision region of a generic, transversal ReLU network F: R^n -> R with a single hidden layer of dimension (n + 1) can have no more than one bounded connected component.
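To make the abstract's starting point concrete, here is a minimal sketch (with hypothetical random weights, not taken from the paper) of a one-hidden-layer ReLU network F: R^2 -> R with hidden dimension 3 = n + 1, the architecture covered by the paper's main theorem. It illustrates that F is continuous and piecewise affine: on each cell of the bent hyperplane arrangement, F agrees with the affine map determined by which ReLUs are active.

```python
import numpy as np

# Hypothetical parameters for a network R^2 -> R with one hidden
# layer of dimension n + 1 = 3 (the case treated in the paper).
rng = np.random.default_rng(0)
W1 = rng.standard_normal((3, 2))   # hidden-layer weights
b1 = rng.standard_normal(3)        # hidden-layer biases
w2 = rng.standard_normal(3)        # output-layer weights
b2 = rng.standard_normal()         # output-layer bias

def F(x):
    """Evaluate the network at a point x in R^2."""
    return float(w2 @ np.maximum(W1 @ x + b1, 0.0) + b2)

def activation_pattern(x):
    """Which ReLUs are active at x; constant on each linear region."""
    return tuple(W1 @ x + b1 > 0)

# Restrict F to a line segment in the domain: F is piecewise linear
# there, with slope changes only where the activation pattern flips.
xs = [np.array([t, 0.5]) for t in np.linspace(-3.0, 3.0, 601)]
patterns = {activation_pattern(x) for x in xs}
print(len(patterns))  # bounded above by 2**3 = 8 possible patterns
```

The decision regions studied in the paper are the super- and sub-level sets {F > 0} and {F < 0}; the theorem says that for generic, transversal parameters of this architecture, each such region has at most one bounded connected component.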


Related research

- Functional dimension of feedforward ReLU neural networks (09/08/2022)
- Local and global topological complexity measures of ReLU neural network functions (04/12/2022)
- When Can Neural Networks Learn Connected Decision Regions? (01/25/2019)
- Tropical Geometry of Deep Neural Networks (05/18/2018)
- Neural Networks Should Be Wide Enough to Learn Disconnected Decision Regions (02/28/2018)
- Algorithmic Determination of the Combinatorial Structure of the Linear Regions of ReLU Neural Networks (07/15/2022)
- Numerical influence of ReLU'(0) on backpropagation (06/23/2021)
