ReLU Neural Networks, Polyhedral Decompositions, and Persistent Homology

06/30/2023
by Yajing Liu et al.

A ReLU neural network leads to a finite polyhedral decomposition of input space and a corresponding finite dual graph. We show that while this dual graph is a coarse quantization of input space, it is sufficiently robust that it can be combined with persistent homology to detect homological signals of manifolds in the input space from samples. This property holds for a variety of networks trained for a wide range of purposes that have nothing to do with this topological application. We found this feature to be surprising and interesting; we hope it will also be useful.
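The polyhedral decomposition in the abstract comes from the activation patterns of the network: all inputs that switch the same set of ReLU units on lie in the same polyhedral cell, and cells sharing a facet become adjacent nodes in the dual graph. A minimal sketch of this idea, using a small random ReLU network (a hypothetical stand-in for the trained networks studied in the paper; the persistent-homology step on the dual graph is omitted):

```python
import numpy as np

rng = np.random.default_rng(0)

# A tiny two-layer ReLU network with random weights (illustrative only;
# the paper applies this to networks trained for unrelated tasks).
W1, b1 = rng.normal(size=(8, 2)), rng.normal(size=8)
W2, b2 = rng.normal(size=(4, 8)), rng.normal(size=4)

def activation_pattern(x):
    """Return the on/off pattern of every ReLU unit at input x.

    Points sharing a pattern lie in the same polyhedral cell of the
    decomposition of input space induced by the network.
    """
    h1 = W1 @ x + b1
    h2 = W2 @ np.maximum(h1, 0) + b2
    return tuple((h1 > 0).astype(int)) + tuple((h2 > 0).astype(int))

# Sample a circle (a manifold in input space) and record each point's cell.
theta = np.linspace(0, 2 * np.pi, 200, endpoint=False)
samples = np.stack([np.cos(theta), np.sin(theta)], axis=1)
cells = [activation_pattern(x) for x in samples]

# Each distinct pattern is a node of the finite dual graph; consecutive
# samples whose patterns differ witness an edge between adjacent cells.
nodes = set(cells)
edges = {frozenset((a, b))
         for a, b in zip(cells, cells[1:] + cells[:1]) if a != b}
print(len(nodes), "cells visited,", len(edges), "dual-graph edges among them")
```

The dual graph built this way is a coarse quantization of the circle, but, as the paper argues, coarse enough structure survives for persistent homology on such graphs to recover the circle's homological signal.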


Related research

11/23/2022 · Dual Graphs of Polyhedral Decompositions for the Detection of Adversarial Attacks
Previous work has shown that a neural network with the rectified linear ...

11/30/2020 · Locally Linear Attributes of ReLU Neural Networks
A ReLU neural network determines/is a continuous piecewise linear map fr...

05/10/2020 · Duality in Persistent Homology of Images
We derive the relationship between the persistent homology barcodes of t...

05/16/2023 · Unwrapping All ReLU Networks
Deep ReLU Networks can be decomposed into a collection of linear models,...

10/12/2021 · Localized Persistent Homologies for more Effective Deep Learning
Persistent Homologies have been successfully used to increase the perfor...

11/17/2021 · Traversing the Local Polytopes of ReLU Neural Networks: A Unified Approach for Network Verification
Although neural networks (NNs) with ReLU activation functions have found...

11/28/2017 · Adversary Detection in Neural Networks via Persistent Homology
We outline a detection method for adversarial inputs to deep neural netw...
