Compacting Neural Network Classifiers via Dropout Training

11/18/2016
by Yotaro Kubo, et al.

We introduce dropout compaction, a novel method for training feed-forward neural networks that realizes the performance gains of training a large model with dropout regularization, yet extracts a compact neural network for run-time efficiency. In the proposed method, we introduce a sparsity-inducing prior on the per-unit dropout retention probability so that the optimizer can effectively prune hidden units during training. By changing the prior hyperparameters, we can control the size of the resulting network. We performed a systematic comparison of dropout compaction and competing methods on several real-world speech recognition tasks and found that dropout compaction achieved comparable accuracy with fewer than 50% of the hidden units, corresponding to a 2.5x speedup in run-time.
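As a rough illustration of the idea, the sketch below (written in PyTorch, which the abstract does not specify; CompactingLayer, sparsity_penalty, and all hyperparameter values are hypothetical, not the authors' implementation) gives each hidden unit a trainable retention probability, adds a penalty that stands in for the sparsity-inducing prior, and prunes units whose retention collapses toward zero after training.

import torch
import torch.nn as nn

class CompactingLayer(nn.Module):
    # Hypothetical layer: per-unit dropout retention probabilities are learned
    # alongside the weights, so low-retention units can be pruned later.
    def __init__(self, in_dim, hidden_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, hidden_dim)
        # Unconstrained logits; sigmoid maps them to retention probabilities.
        self.retain_logit = nn.Parameter(torch.full((hidden_dim,), 2.0))

    def retention(self):
        return torch.sigmoid(self.retain_logit)

    def forward(self, x):
        h = torch.relu(self.linear(x))
        p = self.retention()
        if self.training:
            # Sample a Bernoulli dropout mask with per-unit retention p; the
            # straight-through term lets gradients reach the retention logits.
            mask = (torch.rand_like(h) < p).float()
            mask = mask + p - p.detach()
            return h * mask / p.clamp(min=1e-6)  # inverted-dropout scaling
        return h * p  # use the expected activation at test time

def sparsity_penalty(layer, weight=1e-3):
    # Stands in for the sparsity-inducing prior: retention close to zero is
    # cheap, so the optimizer is encouraged to switch units off entirely.
    return weight * layer.retention().sum()

# Toy usage: train on random data, then prune units with collapsed retention.
layer, readout = CompactingLayer(20, 64), nn.Linear(64, 2)
opt = torch.optim.Adam(list(layer.parameters()) + list(readout.parameters()), lr=1e-2)
x, y = torch.randn(256, 20), torch.randint(0, 2, (256,))
for _ in range(200):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(readout(layer(x)), y) + sparsity_penalty(layer)
    loss.backward()
    opt.step()
keep = layer.retention() > 0.05
print(f"kept {int(keep.sum())} of 64 hidden units")

In the paper itself, the prior hyperparameters, rather than a fixed penalty weight like the one above, control how many hidden units survive.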

Related research

08/29/2018 - Dropout with Tabu Strategy for Regularizing Deep Neural Networks
Dropout has proven to be an effective technique for regularization and p...

10/14/2018 - On the relationship between Dropout and Equiangular Tight Frames
Dropout is a popular regularization technique in neural networks. Yet, t...

05/28/2018 - Adaptive Network Sparsification via Dependent Variational Beta-Bernoulli Dropout
While variational dropout approaches have been shown to be effective for...

10/27/2022 - Adapting Neural Models with Sequential Monte Carlo Dropout
The ability to adapt to changing environments and settings is essential ...

06/20/2017 - Analysis of dropout learning regarded as ensemble learning
Deep learning is the state-of-the-art in fields such as visual object re...

02/21/2017 - Delving Deeper into MOOC Student Dropout Prediction
In order to obtain reliable accuracy estimates for automatic MOOC dropou...

01/30/2018 - Fast Power system security analysis with Guided Dropout
We propose a new method to efficiently compute load-flows (the steady-st...
