ReCAB-VAE: Gumbel-Softmax Variational Inference Based on Analytic Divergence

05/09/2022
by Sangshin Oh, et al.

The Gumbel-softmax distribution, or Concrete distribution, is often used to relax the discrete characteristics of a categorical distribution and enable back-propagation through differentiable reparameterization. Although it reliably yields low variance gradients, it still relies on a stochastic sampling process for optimization. In this work, we present a relaxed categorical analytic bound (ReCAB), a novel divergence-like metric which corresponds to the upper bound of the Kullback-Leibler divergence (KLD) of a relaxed categorical distribution. The proposed metric is easy to implement because it has a closed form solution, and empirical results show that it is close to the actual KLD. Along with this new metric, we propose a relaxed categorical analytic bound variational autoencoder (ReCAB-VAE) that successfully models both continuous and relaxed discrete latent representations. We implement an emotional text-to-speech synthesis system based on the proposed framework, and show that the proposed system flexibly and stably controls emotion expressions with better speech quality compared to baselines that use stochastic estimation or categorical distribution approximation.
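The relaxation the abstract builds on can be sketched in a few lines: draw Gumbel(0, 1) noise, add it to the category logits, and push the result through a temperature-scaled softmax so the sample lies on the probability simplex and gradients flow through it. This is a minimal illustrative sketch of the standard Gumbel-softmax trick, not the paper's ReCAB bound; the function name and arguments are our own.

```python
import numpy as np

def gumbel_softmax_sample(logits, tau=1.0, rng=None):
    """Draw a relaxed one-hot sample from a categorical distribution via the
    Gumbel-softmax (Concrete) reparameterization:
        y_i = softmax((logits_i + g_i) / tau),  g_i ~ Gumbel(0, 1).
    Lower tau pushes the sample closer to a discrete one-hot vector."""
    rng = np.random.default_rng() if rng is None else rng
    u = rng.uniform(low=1e-12, high=1.0, size=np.shape(logits))  # U(0, 1) noise
    g = -np.log(-np.log(u))                  # Gumbel(0, 1) samples
    z = (np.asarray(logits, dtype=float) + g) / tau
    z = z - z.max()                          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()                       # point on the probability simplex

# Example: a relaxed sample from a 3-way categorical with probs (0.7, 0.2, 0.1)
sample = gumbel_softmax_sample(np.log([0.7, 0.2, 0.1]), tau=0.5)
```

Because the sample is a deterministic, differentiable function of the logits given the noise, back-propagation reaches the logits with low-variance gradients; the KLD term of the resulting relaxed distribution, however, still has no closed form, which is the gap the ReCAB bound addresses.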


Related research:

- Categorical Reparameterization with Gumbel-Softmax (11/03/2016)
  Categorical variables are a natural choice for representing discrete str...

- Gradient Estimation with Stochastic Softmax Tricks (06/15/2020)
  The Gumbel-Max trick is the basis of many relaxed gradient estimators. T...

- Robust Variational Autoencoder for Tabular Data with Beta Divergence (06/15/2020)
  We propose a robust variational autoencoder with β divergence for tabula...

- Learning Disentangled Discrete Representations (07/26/2023)
  Recent successes in image generation, model-based reinforcement learning...

- Expressive Speech Synthesis via Modeling Expressions with Variational Autoencoder (04/06/2018)
  Recent advances in neural autoregressive models have improved the perform...

- Augment and Reduce: Stochastic Inference for Large Categorical Distributions (02/12/2018)
  Categorical distributions are ubiquitous in machine learning, e.g., in c...

- It's LeVAsa not LevioSA! Latent Encodings for Valence-Arousal Structure Alignment (07/20/2020)
  In recent years, great strides have been made in the field of affective ...
