DVAE#: Discrete Variational Autoencoders with Relaxed Boltzmann Priors

05/18/2018
by Arash Vahdat, et al.

Boltzmann machines are powerful distributions that have been shown to be an effective prior over binary latent variables in variational autoencoders (VAEs). However, previous methods for training discrete VAEs have used the evidence lower bound and not the tighter importance-weighted bound. We propose two approaches for relaxing Boltzmann machines to continuous distributions that permit training with importance-weighted bounds. These relaxations are based on generalized overlapping transformations and the Gaussian integral trick. Experiments on the MNIST and OMNIGLOT datasets show that these relaxations outperform previous discrete VAEs with Boltzmann priors.
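For context, the importance-weighted bound mentioned above is the k-sample bound of Burda et al. (2016), which tightens the evidence lower bound by averaging importance weights inside the logarithm. Below is a minimal NumPy sketch of how that bound is estimated from log importance weights; the function name and signature are illustrative assumptions, not code from the paper.

import numpy as np

def importance_weighted_bound(log_p_xz, log_q_zx):
    # log_p_xz[i] = log p(x, z_i) and log_q_zx[i] = log q(z_i | x)
    # for k samples z_i drawn from the approximate posterior q(z | x).
    # Returns an estimate of log (1/k) * sum_i p(x, z_i) / q(z_i | x),
    # computed with the log-sum-exp trick for numerical stability.
    log_w = np.asarray(log_p_xz) - np.asarray(log_q_zx)  # log importance weights
    m = np.max(log_w)
    return m + np.log(np.mean(np.exp(log_w - m)))

With k = 1 this reduces to the usual evidence lower bound; larger k gives a tighter bound but is most useful when the latent variables can be reparameterized, which is the motivation for relaxing the Boltzmann prior to a continuous distribution.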


Related research

DVAE++: Discrete Variational Autoencoders with Overlapping Transformations (02/14/2018)
Training of discrete latent variable models remains challenging because ...

GumBolt: Extending Gumbel trick to Boltzmann priors (05/18/2018)
Boltzmann machines (BMs) are appealing candidates for powerful priors in...

Reinterpreting Importance-Weighted Autoencoders (04/10/2017)
The standard interpretation of importance-weighted autoencoders is that ...

Thurstonian Boltzmann Machines: Learning from Multiple Inequalities (08/01/2014)
We introduce Thurstonian Boltzmann Machines (TBM), a unified architectur...

Joint Training Deep Boltzmann Machines for Classification (01/16/2013)
We introduce a new method for training deep Boltzmann machines jointly. ...

A Metaheuristic-Driven Approach to Fine-Tune Deep Boltzmann Machines (01/14/2021)
Deep learning techniques, such as Deep Boltzmann Machines (DBMs), have r...

Intracluster Moves for Constrained Discrete-Space MCMC (03/15/2012)
This paper addresses the problem of sampling from binary distributions w...
