The LORACs prior for VAEs: Letting the Trees Speak for the Data

10/16/2018
by Sharad Vikram, et al.

In variational autoencoders, the prior on the latent codes z is often treated as an afterthought, but the prior shapes the kind of latent representation that the model learns. If the goal is to learn a representation that is interpretable and useful, then the prior should reflect the ways in which the high-level factors that describe the data vary. The "default" prior is an isotropic normal, but if the natural factors of variation in the dataset exhibit discrete structure or are not independent, then the isotropic-normal prior will actually encourage learning representations that mask this structure. To alleviate this problem, we propose using a flexible Bayesian nonparametric hierarchical clustering prior based on the time-marginalized coalescent (TMC). To scale learning to large datasets, we develop a new inducing-point approximation and inference algorithm. We then apply the method without supervision to several datasets and examine the interpretability and practical performance of the inferred hierarchies and learned latent space.
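The contrast between the "default" isotropic-normal prior and a clustering-structured prior can be made concrete with a toy VAE. The sketch below is not the paper's TMC prior or its inducing-point inference algorithm; it uses a learnable mixture-of-Gaussians prior as a simple stand-in for a structured p(z), and all names (ToyVAE, elbo, use_mixture, ...) are illustrative assumptions.

```python
# Minimal sketch: a toy VAE ELBO in PyTorch, contrasting the standard
# isotropic-normal prior with a learnable mixture-of-Gaussians prior used
# here as a stand-in for a clustering-structured prior on z.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torch.distributions import (Normal, Categorical, Independent,
                                 MixtureSameFamily)

class ToyVAE(nn.Module):
    def __init__(self, x_dim=784, z_dim=2, n_clusters=5):
        super().__init__()
        self.enc = nn.Sequential(nn.Linear(x_dim, 128), nn.ReLU(),
                                 nn.Linear(128, 2 * z_dim))
        self.dec = nn.Sequential(nn.Linear(z_dim, 128), nn.ReLU(),
                                 nn.Linear(128, x_dim))
        # Learnable mixture-prior parameters (illustrative, not the TMC).
        self.prior_means = nn.Parameter(torch.randn(n_clusters, z_dim))
        self.prior_logits = nn.Parameter(torch.zeros(n_clusters))

    def isotropic_prior(self, z_dim):
        # "Default" prior: p(z) = N(0, I).
        return Independent(Normal(torch.zeros(z_dim), torch.ones(z_dim)), 1)

    def mixture_prior(self):
        # Clustering-style prior: p(z) = sum_k pi_k N(z | mu_k, I).
        comps = Independent(
            Normal(self.prior_means, torch.ones_like(self.prior_means)), 1)
        return MixtureSameFamily(Categorical(logits=self.prior_logits), comps)

    def elbo(self, x, use_mixture=False):
        mu, log_sigma = self.enc(x).chunk(2, dim=-1)
        q_z = Independent(Normal(mu, log_sigma.exp()), 1)
        z = q_z.rsample()                       # reparameterized sample
        logits = self.dec(z)
        log_px = -F.binary_cross_entropy_with_logits(
            logits, x, reduction='none').sum(-1)
        prior = (self.mixture_prior() if use_mixture
                 else self.isotropic_prior(z.shape[-1]))
        # Single-sample Monte Carlo estimate of log p(x|z) + log p(z) - log q(z|x).
        return (log_px + prior.log_prob(z) - q_z.log_prob(z)).mean()

x = torch.rand(32, 784)                         # toy batch of "images" in [0, 1]
model = ToyVAE()
loss = -model.elbo(x, use_mixture=True)         # maximize ELBO = minimize -ELBO
loss.backward()
```

With use_mixture=True, the prior term log p(z) rewards codes that fall near one of the learned cluster centers, whereas the isotropic prior pulls every code toward the origin. The TMC prior described in the paper plays an analogous structuring role, but places a Bayesian nonparametric hierarchy over the clusters rather than a flat, fixed-size mixture.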


