Learning Ordered Representations with Nested Dropout

02/05/2014
by   Oren Rippel, et al.

In this paper, we study ordered representations of data in which different dimensions have different degrees of importance. To learn these representations we introduce nested dropout, a procedure for stochastically removing coherent nested sets of hidden units in a neural network. We first present a sequence of theoretical results in the simple case of a semi-linear autoencoder. We rigorously show that the application of nested dropout enforces identifiability of the units, which leads to an exact equivalence with PCA. We then extend the algorithm to deep models and demonstrate the relevance of ordered representations to a number of applications. Specifically, we use the ordered property of the learned codes to construct hash-based data structures that permit very fast retrieval, achieving retrieval in time logarithmic in the database size and independent of the dimensionality of the representation. This allows codes that are hundreds of times longer than currently feasible for retrieval. We therefore avoid the diminished quality associated with short codes, while still performing retrieval that is competitive in speed with existing methods. We also show that ordered representations are a promising way to learn adaptive compression for efficient online data reconstruction.
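As a sketch of the core procedure described above: nested dropout samples an index b from a (truncated) geometric distribution and keeps only the first b hidden units, dropping the entire tail after it as one coherent set. The NumPy helper below is an illustrative sketch, not the authors' code; the function name and the default geometric parameter p are assumptions for the example.

```python
import numpy as np

def nested_dropout_mask(num_units, p=0.1, rng=None):
    """Sample a nested dropout mask: keep units 1..b, drop units b+1..num_units.

    b is drawn from a geometric distribution truncated at num_units, so
    low-index units are retained far more often than high-index ones.
    """
    rng = rng or np.random.default_rng()
    b = min(rng.geometric(p), num_units)  # 1-based index of the last kept unit
    mask = np.zeros(num_units)
    mask[:b] = 1.0  # contiguous prefix of ones, contiguous suffix of zeros
    return mask
```

Because the low-index units survive almost every sampled mask, they are pressured to capture the most information, which induces the ordering of dimensions that the paper exploits for fast retrieval and adaptive compression.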


Related research

- Ordered and Binary Speaker Embedding (05/25/2023): Modern speaker recognition systems represent utterances by embedding vec...
- Learning Compact Convolutional Neural Networks with Nested Dropout (12/22/2014): Recently, nested dropout was proposed as a method for ordering represent...
- Boosting Co-teaching with Compression Regularization for Label Noise (04/28/2021): In this paper, we study the problem of learning image classification mod...
- Ordering Dimensions with Nested Dropout Normalizing Flows (06/15/2020): The latent space of normalizing flows must be of the same dimensionality...
- FjORD: Fair and Accurate Federated Learning under heterogeneous targets with Ordered Dropout (02/26/2021): Federated Learning (FL) has been gaining significant traction across dif...
- Compressing Features for Learning with Noisy Labels (06/27/2022): Supervised learning can be viewed as distilling relevant information fro...
