Locally Linear Attributes of ReLU Neural Networks

11/30/2020
by Ben Sattelberg, et al.

A ReLU neural network is a continuous piecewise linear map from an input space to an output space. The weights of the network determine a decomposition of the input space into convex polytopes, and on each of these polytopes the network is described by a single affine map. The structure of this decomposition, together with the affine map attached to each polytope, can be analyzed to investigate the behavior of the associated neural network.
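The affine map on a given polytope can be recovered from the activation pattern at a point inside it: fixing which hidden units are active turns each ReLU layer into a linear one, and composing the layers yields the local affine map. The following sketch illustrates this for a small two-layer network with random placeholder weights (not taken from the paper):

```python
import numpy as np

rng = np.random.default_rng(0)

# A small ReLU network f(x) = W2 @ relu(W1 @ x + b1) + b2
# (weights here are random placeholders for illustration).
W1, b1 = rng.standard_normal((5, 3)), rng.standard_normal(5)
W2, b2 = rng.standard_normal((2, 5)), rng.standard_normal(2)

def f(x):
    return W2 @ np.maximum(W1 @ x + b1, 0.0) + b2

def local_affine_map(x):
    """Return (A, c) such that f(y) = A @ y + c on the polytope containing x.

    The activation pattern (which hidden units are positive at x) fixes
    which rows of W1 are 'on'; composing the resulting linear pieces
    gives the single affine map valid on that polytope.
    """
    pattern = (W1 @ x + b1 > 0).astype(float)  # 0/1 mask per hidden unit
    D = np.diag(pattern)                       # zeroes out inactive units
    A = W2 @ D @ W1
    c = W2 @ D @ b1 + b2
    return A, c

x = rng.standard_normal(3)
A, c = local_affine_map(x)
assert np.allclose(f(x), A @ x + c)

# Nearby points in the same polytope share the same affine map:
x_near = x + 1e-6 * rng.standard_normal(3)
assert np.allclose(f(x_near), A @ x_near + c)
```

The boundaries of the polytopes are exactly the inputs where some hidden unit's pre-activation is zero; crossing such a boundary changes the activation pattern and hence the affine map.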


