Algorithmic Determination of the Combinatorial Structure of the Linear Regions of ReLU Neural Networks

07/15/2022
by Marissa Masden et al.

We algorithmically determine the regions and facets of all dimensions of the canonical polyhedral complex, the universal object into which a ReLU network decomposes its input space. We show that the locations of the vertices of the canonical polyhedral complex, together with their signs with respect to the layer maps, determine the full facet structure across all dimensions. We present an algorithm which calculates this full combinatorial structure, making use of our theorems that the dual complex to the canonical polyhedral complex is cubical and that it possesses a multiplication compatible with its facet structure. The resulting algorithm is numerically stable, runs in time polynomial in the number of intermediate neurons, and obtains accurate information across all dimensions. This permits us to obtain, for example, the true topology of the decision boundaries of networks with low-dimensional inputs. We run experiments on such networks at initialization, finding that width alone does not increase observed topology, but width in the presence of depth does. Source code for our algorithms is accessible online at https://github.com/mmasden/canonicalpoly.
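The sign vectors mentioned in the abstract can be illustrated with a naive sampling sketch. This is not the paper's algorithm (which works combinatorially on the vertices of the canonical polyhedral complex); it simply evaluates a small, randomly initialized one-hidden-layer ReLU network on a grid and counts distinct activation sign patterns, which lower-bounds the number of linear regions. All weights and dimensions here are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny ReLU network: 2 inputs -> 5 hidden neurons (weights are
# arbitrary random values, purely for illustration).
W1 = rng.standard_normal((5, 2))
b1 = rng.standard_normal(5)

def sign_vector(x):
    """Sign pattern of the first-layer preactivations at x.

    Points in the same linear region share one such pattern, so the
    number of distinct patterns seen over a sample is a lower bound
    on the number of linear regions.
    """
    return tuple(W1 @ x + b1 > 0)

# Sample the square [-1, 1]^2 on a grid and collect the patterns.
grid = np.linspace(-1.0, 1.0, 200)
patterns = {sign_vector(np.array([u, v])) for u in grid for v in grid}

print(len(patterns))  # lower bound on the number of linear regions
```

For 5 hyperplanes in the plane, the region count is at most 1 + 5 + C(5, 2) = 16, so the sampled pattern count can never exceed that; the paper's vertex-based approach recovers the exact facet structure in all dimensions rather than this sampled lower bound.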

