Training Discrete Deep Generative Models via Gapped Straight-Through Estimator

06/15/2022
by   Ting-Han Fan, et al.
While deep generative models have succeeded in image processing, natural language processing, and reinforcement learning, training models that involve discrete random variables remains challenging due to the high variance of the gradient estimation process. Monte Carlo sampling is the common remedy in most variance-reduction approaches, but it requires time-consuming resampling and multiple function evaluations. We propose a Gapped Straight-Through (GST) estimator that reduces the variance without incurring resampling overhead. This estimator is inspired by the essential properties of Straight-Through Gumbel-Softmax; we identify these properties and show via an ablation study that each is necessary. Experiments demonstrate that the proposed GST estimator outperforms strong baselines on two discrete deep generative modeling tasks, MNIST-VAE and ListOps.
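To make the baseline concrete, the Straight-Through Gumbel-Softmax estimator that GST builds on can be sketched as follows. This is a minimal NumPy illustration of the standard technique, not the paper's GST estimator; the function name `gumbel_softmax_st` and its signature are our own, and the straight-through gradient trick itself is only indicated in a comment since plain NumPy has no autodiff.

```python
import numpy as np

def gumbel_softmax_st(logits, tau=1.0, rng=None):
    """One Straight-Through Gumbel-Softmax sample (forward pass only).

    Returns (y_hard, y_soft): a discrete one-hot sample and its
    continuous relaxation at temperature tau.
    """
    rng = np.random.default_rng() if rng is None else rng
    # Gumbel(0, 1) noise via the inverse-CDF trick: g = -log(-log(U))
    u = rng.uniform(1e-10, 1.0, size=logits.shape)
    g = -np.log(-np.log(u))
    # Soft (relaxed) sample: softmax of the perturbed logits at temperature tau
    z = (logits + g) / tau
    z = z - z.max()  # numerical stability before exponentiation
    y_soft = np.exp(z) / np.exp(z).sum()
    # Hard one-hot sample: argmax of the same perturbed logits
    y_hard = np.zeros_like(y_soft)
    y_hard[np.argmax(y_soft)] = 1.0
    # In an autodiff framework the straight-through trick would return
    #   y = y_hard + (y_soft - stop_gradient(y_soft))
    # so the forward pass is discrete while gradients flow through y_soft.
    return y_hard, y_soft

logits = np.array([1.0, 2.0, 0.5])
y_hard, y_soft = gumbel_softmax_st(logits, tau=0.5, rng=np.random.default_rng(0))
```

Because the hard sample is produced from a single draw of Gumbel noise, no resampling or extra function evaluations are needed, which is the property GST preserves while targeting the estimator's variance.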
