Learning Compact Convolutional Neural Networks with Nested Dropout

12/22/2014
by Chelsea Finn, et al.

Recently, nested dropout was proposed as a method for ordering the representation units of an autoencoder by their information content, without increasing the reconstruction cost. However, it has so far been applied only to training fully-connected autoencoders in an unsupervised setting. We explore the impact of nested dropout on the convolutional layers of a CNN trained by backpropagation, investigating whether nested dropout can provide a simple and systematic way to determine the optimal representation size with respect to the desired accuracy and the complexity of the task and data.
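
To make the mechanism concrete, the following is a minimal PyTorch sketch of nested dropout applied channel-wise to a convolutional feature map. It is an illustration under stated assumptions, not the paper's implementation: the module name NestedDropout2d and the geometric parameter rho are invented here for exposition, and a single truncation index is drawn per batch for simplicity.

    import torch
    import torch.nn as nn

    class NestedDropout2d(nn.Module):
        """Channel-wise nested dropout (illustrative sketch).

        Samples a truncation index b and zeroes every feature-map channel
        after b, so earlier channels are pressured to carry more
        information than later ones."""

        def __init__(self, rho: float = 0.05):
            super().__init__()
            self.rho = rho  # success probability of the geometric prior over b

        def forward(self, x: torch.Tensor) -> torch.Tensor:
            if not self.training:
                return x  # keep all channels at test time
            n_channels = x.size(1)
            # b ~ Geometric(rho) counts failures before the first success,
            # taking values 0, 1, 2, ...; clip it to the channel range.
            b = int(torch.distributions.Geometric(torch.tensor(self.rho)).sample())
            b = min(b, n_channels - 1)
            # Keep channels 0..b and zero the rest (one draw for the batch).
            mask = torch.zeros(n_channels, device=x.device, dtype=x.dtype)
            mask[: b + 1] = 1.0
            return x * mask.view(1, -1, 1, 1)

Dropped in after any convolution, e.g. nn.Sequential(nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), NestedDropout2d(rho=0.05)), such a layer induces the ordering that makes truncating the representation to its first few channels a cheap way to trade accuracy for size.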

Related research

- Efficient batchwise dropout training using submatrices (02/09/2015): Dropout is a popular technique for regularizing artificial neural networ...
- Bayesian Nested Neural Networks for Uncertainty Calibration and Adaptive Compression (01/27/2021): Nested networks or slimmable networks are neural networks whose architec...
- Learning Ordered Representations with Nested Dropout (02/05/2014): In this paper, we study ordered representations of data in which differe...
- Structural Dropout for Model Width Compression (05/13/2022): Existing ML models are known to be highly over-parametrized, and use sig...
- Regularized linear autoencoders recover the principal components, eventually (07/13/2020): Our understanding of learning input-output relationships with neural net...
- DropCluster: A structured dropout for convolutional networks (02/07/2020): Dropout as a regularizer in deep neural networks has been less effective...
- R-Block: Regularized Block of Dropout for convolutional networks (07/27/2023): Dropout as a regularization technique is widely used in fully connected ...
