The Neglected Sibling: Isotropic Gaussian Posterior for VAE

10/14/2021
by Lan Zhang, et al.

Deep generative models have been widely used in several areas of NLP, and various techniques have been proposed to augment them or address their training challenges. In this paper, we propose a simple modification to Variational Autoencoders (VAEs) by using an Isotropic Gaussian Posterior (IGP) that allows for better utilisation of their latent representation space. This model avoids the sub-optimal behaviour of VAEs related to inactive dimensions in the representation space. We provide both theoretical analysis and empirical evidence on various datasets and tasks showing that IGP leads to consistent improvements on several quantitative and qualitative grounds, from downstream task performance and sample efficiency to robustness. Additionally, we give insights into the representational properties encouraged by IGP and show that its gains generalise to the image domain as well.
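To make the modification concrete, the following is a minimal PyTorch sketch (not the authors' released code) of an encoder with an isotropic Gaussian posterior, where a single scalar variance is shared across all latent dimensions instead of the usual per-dimension diagonal variance. The module names, layer sizes, and sampling details are illustrative assumptions; the closed-form KL term is the standard Gaussian KL specialised to the isotropic case.

```python
# Minimal sketch of an isotropic-Gaussian-posterior (IGP) encoder.
# Assumptions: q(z|x) = N(mu(x), sigma(x)^2 * I) with a scalar sigma per input.
import torch
import torch.nn as nn

class IGPEncoder(nn.Module):
    """Encoder predicting a mean vector and a single shared log-variance."""

    def __init__(self, input_dim: int, hidden_dim: int, latent_dim: int):
        super().__init__()
        self.backbone = nn.Sequential(nn.Linear(input_dim, hidden_dim), nn.ReLU())
        self.mu = nn.Linear(hidden_dim, latent_dim)
        self.log_var = nn.Linear(hidden_dim, 1)  # scalar variance -> isotropic posterior

    def forward(self, x: torch.Tensor):
        h = self.backbone(x)
        mu = self.mu(h)
        log_var = self.log_var(h)                # shape (batch, 1)
        std = torch.exp(0.5 * log_var)
        z = mu + std * torch.randn_like(mu)      # reparameterisation trick
        # KL(N(mu, sigma^2 I) || N(0, I)) = 0.5 * (||mu||^2 + d*sigma^2 - d - d*log sigma^2)
        d = mu.size(-1)
        kl = 0.5 * (mu.pow(2).sum(-1) + d * (log_var.exp() - log_var - 1).squeeze(-1))
        return z, kl

# Usage: encode a batch and inspect the per-example KL term.
enc = IGPEncoder(input_dim=784, hidden_dim=256, latent_dim=32)
z, kl = enc(torch.randn(8, 784))
print(z.shape, kl.shape)  # torch.Size([8, 32]) torch.Size([8])
```

Tying the variance to a single scalar per input is the only change relative to a standard diagonal-Gaussian encoder; the decoder and the rest of the VAE objective are unchanged.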

Related research

05/24/2018: Cross Domain Image Generation through Latent Space Exploration with Adversarial Loss
Conditional domain generation is a good way to interactively control sam...

02/20/2023: Analyzing the Posterior Collapse in Hierarchical Variational Autoencoders
Hierarchical Variational Autoencoders (VAEs) are among the most popular ...

09/22/2021: LDC-VAE: A Latent Distribution Consistency Approach to Variational AutoEncoders
Variational autoencoders (VAEs), as an important aspect of generative mo...

03/18/2022: Defending Variational Autoencoders from Adversarial Attacks with MCMC
Variational autoencoders (VAEs) are deep generative models used in vario...

05/19/2017: VAE with a VampPrior
Many different methods to train deep generative models have been introdu...

02/09/2020: Out-of-Distribution Detection with Distance Guarantee in Deep Generative Models
Recent research has shown that it is challenging to detect out-of-distri...

02/17/2021: Preventing Posterior Collapse Induced by Oversmoothing in Gaussian VAE
Variational autoencoders (VAEs) often suffer from posterior collapse, wh...
