Direct Optimization through arg max for Discrete Variational Auto-Encoder

06/07/2018
by Guy Lorberbom, et al.

Reparameterization of variational auto-encoders with continuous latent spaces is an effective method for reducing the variance of their gradient estimates. However, using the same approach when latent variables are discrete is problematic, due to the resulting non-differentiable objective. In this work, we present a direct optimization method that propagates gradients through a non-differentiable prediction operation. We apply this method to discrete variational auto-encoders by modeling a discrete random variable with the arg max function of the Gumbel-Max perturbation model.
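As background for the abstract above, the Gumbel-Max perturbation model it refers to represents a draw from a categorical distribution as an arg max over perturbed logits: z = argmax_i (log pi_i + g_i), where each g_i is an independent Gumbel(0, 1) noise sample. The sketch below illustrates only this sampling identity, not the paper's gradient estimator; the function name and the use of NumPy are illustrative choices, not from the source.

```python
import numpy as np

def gumbel_max_sample(logits, rng):
    """Sample a category index via the Gumbel-Max perturbation model:
    z = argmax_i (logits_i + g_i), with g_i ~ Gumbel(0, 1) i.i.d.

    This is distributionally equivalent to sampling from
    softmax(logits), but expresses the draw as an arg max,
    which is the non-differentiable operation the abstract's
    direct optimization method propagates gradients through.
    """
    # Inverse-CDF trick: if u ~ Uniform(0, 1), then
    # -log(-log(u)) ~ Gumbel(0, 1).
    u = rng.uniform(low=1e-12, high=1.0, size=np.shape(logits))
    gumbels = -np.log(-np.log(u))
    return int(np.argmax(np.asarray(logits) + gumbels))
```

Repeated calls with fixed logits produce samples whose empirical frequencies approach softmax(logits), which is how such a sampler is typically sanity-checked.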


Related research

- Learning Discrete Structured Variational Auto-Encoder using Natural Evolution Strategies (05/03/2022)
- Spatial Variational Auto-Encoding via Matrix-Variate Normal Distributions (05/18/2017)
- Variational Optimization (12/18/2012)
- Adaptive Perturbation-Based Gradient Estimation for Discrete Latent Variable Models (09/11/2022)
- Sparse Communication via Mixed Distributions (08/05/2021)
- Direct Evolutionary Optimization of Variational Autoencoders With Binary Latents (11/27/2020)
- Reparameterization Gradient for Non-differentiable Models (06/01/2018)
