Prior Activation Distribution (PAD): A Versatile Representation to Utilize DNN Hidden Units

07/05/2019
by   Lakmal Meegahapola, et al.

In this paper, we introduce the concept of Prior Activation Distribution (PAD) as a versatile, general technique for capturing the typical activation patterns of the hidden-layer units of a Deep Neural Network used for classification tasks. We show that the combined neural activations of such a hidden layer have class-specific distributional properties, and then define multiple statistical measures that quantify how far a test sample's activations deviate from those distributions. Using a variety of benchmark datasets (including MNIST, CIFAR10, Fashion-MNIST, and notMNIST), we show how such PAD-based measures can be used, independently of any training technique, to (a) derive fine-grained uncertainty estimates for inferences, (b) provide inference accuracy competitive with alternatives that require execution of the full pipeline, and (c) reliably isolate out-of-distribution test samples.
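The core idea above — per-class distributions over hidden-unit activations, plus a deviation score for test samples — can be sketched in a few lines. This is a minimal illustration, not the paper's exact formulation: the function names, the Gaussian per-unit summary (mean and standard deviation), and the mean absolute z-score as the deviation measure are all assumptions made here for concreteness.

```python
import numpy as np

rng = np.random.default_rng(0)

def fit_pad(activations, labels, n_classes):
    """Per-class mean/std of hidden-unit activations (an illustrative 'PAD')."""
    stats = {}
    for c in range(n_classes):
        a = activations[labels == c]
        # Small epsilon keeps units with near-constant activation from
        # producing divide-by-zero scores.
        stats[c] = (a.mean(axis=0), a.std(axis=0) + 1e-8)
    return stats

def deviation(x, stats, c):
    """Mean absolute z-score of activation vector x against class c's PAD."""
    mu, sd = stats[c]
    return float(np.mean(np.abs((x - mu) / sd)))

# Toy stand-in for hidden-layer activations of two classes over 5 units.
acts = np.vstack([rng.normal(0.0, 1.0, (100, 5)),
                  rng.normal(3.0, 1.0, (100, 5))])
labels = np.array([0] * 100 + [1] * 100)
stats = fit_pad(acts, labels, 2)

in_dist = rng.normal(0.0, 1.0, 5)    # resembles class 0's activations
out_dist = rng.normal(10.0, 1.0, 5)  # far from both class distributions
print(deviation(in_dist, stats, 0), deviation(out_dist, stats, 0))
```

A low deviation score suggests the sample's activations match a class's prior distribution (usable as a confidence proxy), while a high score against every class flags a likely out-of-distribution sample.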

