Locally Linear Attributes of ReLU Neural Networks

11/30/2020
by Ben Sattelberg et al.

A ReLU neural network is a continuous piecewise linear map from its input space to its output space. The network's weights determine a decomposition of the input space into convex polytopes, and on each of these polytopes the network is described by a single affine map. The structure of the decomposition, together with the affine map attached to each polytope, can be analyzed to investigate the behavior of the associated neural network.
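The decomposition described in the abstract can be sketched numerically. The snippet below (a minimal sketch using hypothetical random weights for a two-layer network, not the authors' setup) fixes the ReLU on/off pattern at a point x and recovers the single affine map A x + b that the network equals on the polytope containing x.

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(5, 3)), rng.normal(size=5)   # hidden layer
W2, b2 = rng.normal(size=(2, 5)), rng.normal(size=2)   # output layer

def net(x):
    """Two-layer ReLU network: W2 relu(W1 x + b1) + b2."""
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

x = rng.normal(size=3)

# Activation pattern at x: a diagonal 0/1 matrix recording which
# hidden units are "on". This pattern is constant on the convex
# polytope of the decomposition that contains x.
D = np.diag((W1 @ x + b1 > 0).astype(float))

# On that polytope the network reduces to one affine map:
A = W2 @ D @ W1
b = W2 @ D @ b1 + b2

assert np.allclose(net(x), A @ x + b)
```

Each distinct activation pattern corresponds to one polytope, so enumerating patterns enumerates the affine pieces of the network.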



08/20/2020

On transversality of bent hyperplane arrangements and the topological expressiveness of ReLU neural networks

Let F:R^n -> R be a feedforward ReLU neural network. It is well-known th...
10/14/2021

Sound and Complete Neural Network Repair with Minimality and Locality Guarantees

We present a novel methodology for repairing neural networks that use Re...
09/29/2021

Double framed moduli spaces of quiver representations

Motivated by problems in the neural networks setting, we study moduli sp...
03/29/2019

Deep Representation with ReLU Neural Networks

We consider deep feedforward neural networks with rectified linear units...
10/02/2019

Identifying Weights and Architectures of Unknown ReLU Networks

The output of a neural network depends on its parameters in a highly non...
08/17/2019

Computing Linear Restrictions of Neural Networks

A linear restriction of a function is the same function with its domain ...
11/17/2021

Traversing the Local Polytopes of ReLU Neural Networks: A Unified Approach for Network Verification

Although neural networks (NNs) with ReLU activation functions have found...