
Hub-VAE: Unsupervised Hub-based Regularization of Variational Autoencoders

by Priya Mani, et al.

Exemplar-based methods rely on informative data points, or prototypes, to guide the optimization of learning algorithms. Such data points facilitate interpretable model design and prediction. Of particular interest is the utility of exemplars in learning unsupervised deep representations. In this paper, we leverage hubs, points that emerge as frequent nearest neighbors in high-dimensional spaces, as exemplars to regularize a variational autoencoder and to learn a discriminative embedding for unsupervised downstream tasks. We propose an unsupervised, data-driven regularization of the latent space with a mixture of hub-based priors and a hub-based contrastive loss. Experimental evaluation shows that our algorithm achieves superior cluster separability in the embedding space, as well as accurate data reconstruction and generation, compared to baselines and state-of-the-art techniques.
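The hubs the abstract refers to are points with a high k-occurrence: they appear unusually often in the k-nearest-neighbor lists of other points, a phenomenon that intensifies in high-dimensional spaces. As a minimal sketch (this is an illustrative k-occurrence computation with numpy, not the paper's exact hub-selection or regularization procedure), hubs can be identified like this:

```python
import numpy as np

def hub_scores(X, k=5):
    """Return each point's k-occurrence N_k: how many times it
    appears among the k nearest neighbors of the other points."""
    # Pairwise squared Euclidean distances (fine for small n).
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    np.fill_diagonal(d2, np.inf)  # a point is not its own neighbor
    # Indices of each point's k nearest neighbors.
    nn = np.argsort(d2, axis=1)[:, :k]
    # Count how often each index occurs across all neighbor lists.
    return np.bincount(nn.ravel(), minlength=len(X))

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 50))        # hypothetical high-dimensional data
scores = hub_scores(X, k=5)
hubs = np.argsort(scores)[::-1][:10]  # top-10 hubs, usable as exemplars
```

Points with the largest `N_k` would then serve as the exemplars that anchor the mixture of hub-based priors; the exact number of hubs retained and how they enter the contrastive loss are design choices described in the paper itself.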
