Dropout Prediction Variation Estimation Using Neuron Activation Strength

10/13/2021
by Haichao Yu, et al.

It is well known that DNNs can generate different prediction results even given the same model configuration and training dataset. As a result, it becomes increasingly important to study prediction variation, i.e., the variation of the predictions on a given input example, in neural network models. Dropout has been commonly used in various applications to quantify prediction variation. However, using dropout in practice can be expensive, as it requires running dropout inference many times to estimate prediction variation. In this paper, we study how to estimate dropout prediction variation in a resource-efficient manner. In particular, we demonstrate that we can use neuron activation strength to estimate dropout prediction variation under different dropout settings and on a variety of tasks using three large datasets: MovieLens, Criteo, and EMNIST. Our approach provides an inference-once alternative that estimates dropout prediction variation as an auxiliary task while the main prediction model is served. Moreover, we show that using activation strength features from a subset of neural network layers can be sufficient to achieve variation estimation performance similar to using activation features from all layers, providing further resource reduction for variation estimation.
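The contrast between the two approaches can be sketched as follows. The expensive baseline runs many stochastic dropout passes and takes the standard deviation of the predictions; the inference-once alternative extracts activation-strength features from a single deterministic pass, on which an auxiliary model would be trained to predict that variation. This is a minimal illustration on a single linear layer with inverted dropout; the feature definition (per-unit absolute activation) and all function names are assumptions for illustration, not the paper's exact formulation.

```python
import random
import statistics

def dropout_forward(x, w, p, rng):
    # One stochastic forward pass of a single linear unit with
    # inverted dropout: each input is kept with probability (1 - p)
    # and scaled by 1 / (1 - p) so the expected output is unchanged.
    keep = 1.0 - p
    return sum(
        (wi * xi / keep) if rng.random() < keep else 0.0
        for wi, xi in zip(w, x)
    )

def mc_dropout_variation(x, w, p, num_passes, seed=0):
    # Expensive baseline: run dropout inference num_passes times and
    # report the standard deviation of the predictions.
    rng = random.Random(seed)
    preds = [dropout_forward(x, w, p, rng) for _ in range(num_passes)]
    return statistics.pstdev(preds)

def activation_strength_features(x, w):
    # Inference-once alternative (sketch): per-unit activation
    # strengths from a single deterministic pass. An auxiliary model
    # trained on such features would estimate the variation directly.
    return [abs(wi * xi) for wi, xi in zip(w, x)]
```

With dropout rate 0 every pass is identical, so the estimated variation is exactly zero, while any positive rate yields a positive spread; the activation-strength features, by contrast, come from one pass regardless of the dropout setting.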


Related research

12/10/2018 - Guided Dropout
Dropout is often used in deep neural networks to prevent over-fitting. C...

04/25/2019 - Survey of Dropout Methods for Deep Neural Networks
Dropout methods are a family of stochastic techniques used in neural net...

05/08/2018 - Image Ordinal Classification and Understanding: Grid Dropout with Masking Label
Image ordinal classification refers to predicting a discrete target valu...

12/08/2020 - Efficient Estimation of Influence of a Training Instance
Understanding the influence of a training instance on a neural network m...

07/15/2021 - Randomized ReLU Activation for Uncertainty Estimation of Deep Neural Networks
Deep neural networks (DNNs) have successfully learned useful data repres...

08/11/2018 - Dropout during inference as a model for neurological degeneration in an image captioning network
We replicate a variation of the image captioning architecture by Vinyals...

05/28/2018 - Adaptive Network Sparsification via Dependent Variational Beta-Bernoulli Dropout
While variational dropout approaches have been shown to be effective for...
