ReLU Code Space: A Basis for Rating Network Quality Besides Accuracy

05/20/2020
by Natalia Shepeleva, et al.

We propose a new metric space of ReLU activation codes equipped with a truncated Hamming distance. This space establishes an isometry between its elements and polyhedral bodies in the input space, which have recently been shown to be strongly related to safety, robustness, and confidence. The isometry allows efficient computation of adjacency relations between the polyhedral bodies. Experiments on MNIST and CIFAR-10 indicate that information besides accuracy might be stored in the code space.
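To make the notion of a ReLU activation code concrete, the following is a minimal sketch for a toy fully connected network: the code of an input is the binary on/off pattern of all its ReLU units, and codes are compared with a Hamming distance. The network weights, layer sizes, and the use of the plain (untruncated) Hamming distance are illustrative assumptions, not the paper's construction.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy fully connected ReLU network with two hidden layers
# (sizes and weights are arbitrary, for illustration only).
W1, b1 = rng.normal(size=(8, 4)), rng.normal(size=8)
W2, b2 = rng.normal(size=(6, 8)), rng.normal(size=6)

def relu_code(x):
    """Binary activation pattern (ReLU code) of input x:
    one bit per hidden unit, 1 if the unit is active."""
    h1 = W1 @ x + b1
    a1 = np.maximum(h1, 0.0)
    h2 = W2 @ a1 + b2
    return np.concatenate([(h1 > 0), (h2 > 0)]).astype(int)

def hamming(c1, c2):
    """Plain Hamming distance between two ReLU codes
    (stand-in for the truncated variant in the paper)."""
    return int(np.sum(c1 != c2))

x, y = rng.normal(size=4), rng.normal(size=4)
cx, cy = relu_code(x), relu_code(y)
# Distance 0 means x and y share an activation pattern, i.e. they
# lie in the same polyhedral (linear) region of the input space.
print(hamming(cx, cy))
print(hamming(cx, cx))  # identical inputs always give distance 0
```

Under this encoding, each code corresponds to one polyhedral region on which the network is affine, which is the correspondence the abstract's isometry exploits.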


Related research

- Rotate the ReLU to implicitly sparsify deep networks (06/01/2022): In the era of Deep Neural Network based solutions for a variety of real-...
- Traversing the Local Polytopes of ReLU Neural Networks: A Unified Approach for Network Verification (11/17/2021): Although neural networks (NNs) with ReLU activation functions have found...
- The Lattice Structure of Linear Subspace Codes (11/02/2019): The projective space P_q(n), i.e. the set of all subspaces of the vector...
- Dynamic ReLU (03/22/2020): Rectified linear units (ReLU) are commonly used in deep neural networks....
- Divisible Codes (12/22/2021): A linear code over 𝔽_q with the Hamming metric is called Δ-divisible if ...
- Dual Graphs of Polyhedral Decompositions for the Detection of Adversarial Attacks (11/23/2022): Previous work has shown that a neural network with the rectified linear ...
