Local and global topological complexity measures of ReLU neural network functions

04/12/2022
by J. Elisenda Grigsby, et al.

We apply a generalized piecewise-linear (PL) version of Morse theory due to Grunert-Kühnel-Rote to define and study new local and global notions of topological complexity for fully-connected feedforward ReLU neural network functions, F: R^n -> R. Along the way, we show how to construct, for each such F, a canonical polytopal complex K(F) and a deformation retract of the domain onto K(F), yielding a convenient compact model for performing calculations. We also give a combinatorial description of local complexity for depth 2 networks, and a construction showing that local complexity can be arbitrarily high.
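The PL structure the abstract refers to can be made concrete on a toy example: a depth-2 ReLU network is affine on each region where its activation pattern (the set of "on" neurons) is constant, and counting distinct patterns over sample points gives a lower bound on the number of such linear regions. The sketch below uses hypothetical weights chosen for readability; it is an illustration of the piecewise-linear viewpoint, not the paper's construction of K(F).

```python
import itertools

# A tiny fully-connected depth-2 ReLU network F: R^2 -> R.
# Weights below are hypothetical; any choice yields a piecewise-linear F.
W = [(1.0, 0.0), (0.0, 1.0)]   # hidden-layer weights (2 neurons)
b = (0.0, 0.0)                 # hidden-layer biases
c = (1.0, 1.0)                 # output-layer weights

def preactivations(x, y):
    return [W[i][0] * x + W[i][1] * y + b[i] for i in range(len(W))]

def F(x, y):
    return sum(ci * max(z, 0.0) for ci, z in zip(c, preactivations(x, y)))

def pattern(x, y):
    # The ReLU activation pattern: which neurons are "on" at (x, y).
    return tuple(z > 0 for z in preactivations(x, y))

# Sample a grid that avoids the bent hyperplanes {z_i = 0}; the number of
# distinct patterns lower-bounds the number of linear regions of F.
grid = [-0.5, 0.5]
patterns = {pattern(x, y) for x, y in itertools.product(grid, grid)}
print(len(patterns))  # 4 regions: the four quadrants
print(F(0.5, 0.5))    # 1.0, since F(x, y) = x + y on the all-active region
```

With both neurons active, F restricts to the affine map (x, y) -> x + y; on the region where neither is active, F is identically zero. The bent hyperplane arrangement separating these regions is exactly the non-smooth locus that PL Morse theory works with.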


Related research

On transversality of bent hyperplane arrangements and the topological expressiveness of ReLU neural networks (08/20/2020)
Let F: R^n -> R be a feedforward ReLU neural network. It is well-known th...

Functional dimension of feedforward ReLU neural networks (09/08/2022)
It is well-known that the parameterized family of functions representabl...

Finding metastable skyrmionic structures via a metaheuristic perturbation-driven neural network (03/06/2023)
Topological magnetic textures observed in experiments can, in principle,...

Topological obstructions in neural networks learning (12/31/2020)
We apply methods of topological data analysis to loss functions to gain ...

Local Identifiability of Deep ReLU Neural Networks: the Theory (06/15/2022)
Is a sample rich enough to determine, at least locally, the parameters o...

SPINE: Soft Piecewise Interpretable Neural Equations (11/20/2021)
ReLU fully-connected networks are ubiquitous but uninterpretable because...

Polyhedral Complex Extraction from ReLU Networks using Edge Subdivision (06/12/2023)
A neural network consisting of piecewise affine building blocks, such as...
