Unscented Autoencoder

06/08/2023
by Faris Janjoš, et al.

The Variational Autoencoder (VAE) is a seminal approach in deep generative modeling with latent variables. Interpreting its reconstruction process as a nonlinear transformation of samples from the latent posterior distribution, we apply the Unscented Transform (UT), a well-known distribution approximation used in the Unscented Kalman Filter (UKF) from the field of filtering. A finite set of statistics called sigma points, sampled deterministically, provides a more informative and lower-variance posterior representation than the ubiquitous noise-scaling of the reparameterization trick, while ensuring higher-quality reconstruction. We further boost performance by replacing the Kullback-Leibler (KL) divergence with the Wasserstein distribution metric, which allows for a sharper posterior. Combining these two components, we derive a novel, deterministic-sampling flavor of the VAE, the Unscented Autoencoder (UAE), trained purely with regularization-like terms on the per-sample posterior. We empirically show competitive performance in Fréchet Inception Distance (FID) scores over closely related models, in addition to a lower training variance than the VAE.
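To make the two ingredients concrete, the sketch below shows (a) classic Unscented Transform sigma points and weights for a diagonal Gaussian posterior N(mu, diag(sigma^2)), as used in the UKF, and (b) the closed-form squared 2-Wasserstein distance between diagonal Gaussians, which can replace the KL term. This is a minimal illustration of the standard formulas, not the paper's exact parameterization; the function names and the `kappa` scaling choice are assumptions.

```python
import numpy as np

def sigma_points(mu, sigma, kappa=1.0):
    """Deterministic sigma points for a diagonal Gaussian N(mu, diag(sigma^2)).

    Returns the 2d+1 points and their weights from the classic Unscented
    Transform (a sketch; the UAE's exact scaling may differ).
    """
    d = mu.shape[0]
    lam = kappa                               # simple scaling parameter
    scale = np.sqrt(d + lam) * sigma          # sqrt of (d+lam) * diag(sigma^2)
    points = [mu]
    for i in range(d):
        offset = np.zeros(d)
        offset[i] = scale[i]
        points.append(mu + offset)            # point along +i-th axis
        points.append(mu - offset)            # point along -i-th axis
    w0 = lam / (d + lam)                      # weight of the central point
    wi = 1.0 / (2.0 * (d + lam))              # weight of each offset point
    weights = np.array([w0] + [wi] * (2 * d))
    return np.stack(points), weights

def w2_sq_diag_gaussians(mu1, sigma1, mu2, sigma2):
    """Squared 2-Wasserstein distance between diagonal Gaussians (closed form)."""
    return np.sum((mu1 - mu2) ** 2) + np.sum((sigma1 - sigma2) ** 2)

# The weighted mean and covariance of the sigma points recover the
# posterior's moments exactly, with no sampling noise:
mu, sigma = np.zeros(3), np.ones(3)
pts, w = sigma_points(mu, sigma)
mean = (w[:, None] * pts).sum(0)
cov = sum(wi * np.outer(p - mean, p - mean) for wi, p in zip(w, pts))
```

In a UAE-style training step, each sigma point would be decoded and the reconstruction losses combined with the weights, replacing the single noisy sample drawn by the reparameterization trick.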


Related research

01/30/2019 · Enhanced Variational Inference with Dyadic Transformation
Variational autoencoder is a powerful deep generative model with variati...

06/29/2020 · VAE-KRnet and its applications to variational Bayes
In this work, we have proposed a generative model for density estimation...

07/30/2020 · Quantitative Understanding of VAE by Interpreting ELBO as Rate Distortion Cost of Transform Coding
VAE (Variational autoencoder) estimates the posterior parameters (mean a...

12/23/2019 · The Usual Suspects? Reassessing Blame for VAE Posterior Collapse
In narrow asymptotic settings Gaussian VAE models of continuous data hav...

04/30/2020 · Preventing Posterior Collapse with Levenshtein Variational Autoencoder
Variational autoencoders (VAEs) are a standard framework for inducing la...

02/17/2021 · Preventing Posterior Collapse Induced by Oversmoothing in Gaussian VAE
Variational autoencoders (VAEs) often suffer from posterior collapse, wh...

01/31/2020 · CosmoVAE: Variational Autoencoder for CMB Image Inpainting
Cosmic microwave background radiation (CMB) is critical to the understan...
