Variational Dropout Sparsifies Deep Neural Networks

01/19/2017
by Dmitry Molchanov, et al.

We explore a recently proposed Variational Dropout technique that provides an elegant Bayesian interpretation of Gaussian Dropout. We extend Variational Dropout to the case where dropout rates are unbounded, propose a way to reduce the variance of the gradient estimator, and report the first experimental results with individual dropout rates per weight. Interestingly, this leads to extremely sparse solutions in both fully-connected and convolutional layers. The effect is similar to the automatic relevance determination effect in empirical Bayes, but has a number of advantages. We reduce the number of parameters by up to 280 times on LeNet architectures and up to 68 times on VGG-like networks with a negligible decrease in accuracy.
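Below is a minimal PyTorch sketch of the method the abstract describes: a fully-connected layer with an individual dropout rate per weight, trained with the additive noise reparameterization and the approximate KL term from the paper. The class and parameter names (SparseVDLinear, threshold, and so on) are illustrative, not taken from the authors' code.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SparseVDLinear(nn.Module):
    """Sketch of a linear layer with sparse variational dropout.

    Each weight w_ij = theta_ij + sigma_ij * eps_ij gets its own dropout
    rate alpha_ij = sigma_ij^2 / theta_ij^2, learned through log_sigma2.
    """
    def __init__(self, in_features, out_features, threshold=3.0):
        super().__init__()
        self.theta = nn.Parameter(torch.empty(out_features, in_features))
        self.log_sigma2 = nn.Parameter(
            torch.full((out_features, in_features), -10.0))
        self.bias = nn.Parameter(torch.zeros(out_features))
        self.threshold = threshold  # prune weights with log alpha above this
        nn.init.kaiming_uniform_(self.theta)

    @property
    def log_alpha(self):
        # log alpha = log sigma^2 - log theta^2, clipped for stability
        la = self.log_sigma2 - torch.log(self.theta ** 2 + 1e-8)
        return la.clamp(-10.0, 10.0)

    def forward(self, x):
        if self.training:
            # Local reparameterization: sample pre-activations, not weights,
            # which reduces the variance of the gradient estimator
            mean = F.linear(x, self.theta, self.bias)
            var = F.linear(x ** 2, self.log_sigma2.exp()) + 1e-8
            return mean + var.sqrt() * torch.randn_like(mean)
        # At test time, zero out weights with high dropout rates
        mask = (self.log_alpha < self.threshold).float()
        return F.linear(x, self.theta * mask, self.bias)

    def kl(self):
        # Approximation of -KL(q || p) for the log-uniform prior,
        # with the constants fitted in Molchanov et al. (2017)
        k1, k2, k3 = 0.63576, 1.87320, 1.48695
        la = self.log_alpha
        neg_kl = k1 * torch.sigmoid(k2 + k3 * la) - 0.5 * F.softplus(-la) - k1
        return -neg_kl.sum()
```

During training, the sum of kl() over all layers is added to the data loss to form the negative ELBO; after training, weights with log alpha above 3 (a dropout rate of roughly 0.95 or more) are dropped, which is where the reported 68x to 280x reductions in parameter count come from.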

Related research

11/01/2018  Variational Dropout via Empirical Bayes
We study the Automatic Relevance Determination procedure applied to deep...

02/09/2015  Efficient batchwise dropout training using submatrices
Dropout is a popular technique for regularizing artificial neural networ...

06/08/2015  Variational Dropout and the Local Reparameterization Trick
We investigate a local reparameterization technique for greatly reducing ...

02/12/2020  Learnable Bernoulli Dropout for Bayesian Deep Learning
In this work, we propose learnable Bernoulli dropout (LBD), a new model-...

12/21/2013  An empirical analysis of dropout in piecewise linear networks
The recently introduced dropout training criterion for neural networks h...

06/14/2021  The Flip Side of the Reweighted Coin: Duality of Adaptive Dropout and Regularization
Among the most successful methods for sparsifying deep (neural) networks...

11/02/2018  Analysing Dropout and Compounding Errors in Neural Language Models
This paper carries out an empirical analysis of various dropout techniqu...
