Balancing Reconstruction Quality and Regularisation in ELBO for VAEs

09/09/2019
by Shuyu Lin, et al.

A trade-off exists between reconstruction quality and prior regularisation in the Evidence Lower Bound (ELBO) loss that Variational Autoencoder (VAE) models use for learning. Few satisfactory approaches exist for balancing the prior and reconstruction objectives; most methods handle the problem through heuristics. In this paper, we show that the noise variance (often set to a fixed value) in the Gaussian likelihood p(x|z) for real-valued data can naturally provide such a balance. By learning this noise variance so as to maximise the ELBO, we automatically obtain an optimal trade-off between the reconstruction error and the prior constraint on the posteriors. This variance can be interpreted intuitively as the noise level required for the current model to be the best explanation of the observed dataset. Further, by allowing the variance inference to be more flexible, it can conveniently serve as an uncertainty estimator for reconstructed or generated samples. We demonstrate that optimising the noise variance is a crucial component of VAE learning, and showcase the performance on the MNIST, Fashion MNIST and CelebA datasets. We find our approach can significantly improve the quality of generated samples whilst maintaining a smooth latent-space manifold to represent the data. The method also offers an indication of uncertainty in the final generative model.
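To illustrate the core idea, here is a minimal sketch (not from the paper; all names are illustrative) of why a learnable noise variance balances the two ELBO terms. For a Gaussian likelihood, the per-dimension reconstruction term is 0.5·log(2πσ²) + (x − x̂)²/(2σ²), and setting its derivative with respect to σ² to zero gives the closed-form optimum σ²* = mean squared reconstruction error. The learned variance thus automatically rescales the reconstruction term relative to the (unweighted) KL prior term:

```python
import numpy as np

def gaussian_nll(x, x_hat, sigma2):
    """Mean per-dimension negative log-likelihood of x under N(x_hat, sigma2)."""
    return 0.5 * (np.log(2 * np.pi * sigma2) + (x - x_hat) ** 2 / sigma2).mean()

rng = np.random.default_rng(0)
x = rng.normal(size=1000)                       # toy "data"
x_hat = x + rng.normal(scale=0.3, size=1000)    # imperfect "reconstruction"

# Closed-form maximiser of the Gaussian likelihood over sigma^2:
sigma2_opt = np.mean((x - x_hat) ** 2)

# The optimal variance yields a lower NLL than any fixed guess,
# e.g. the common default sigma^2 = 1 or an over-confident 0.01.
assert gaussian_nll(x, x_hat, sigma2_opt) <= gaussian_nll(x, x_hat, 1.0)
assert gaussian_nll(x, x_hat, sigma2_opt) <= gaussian_nll(x, x_hat, 0.01)
```

In a full VAE, σ² would be optimised jointly with the encoder and decoder by gradient descent on the ELBO rather than in closed form, but the same mechanism applies: a large reconstruction error inflates σ², which down-weights the reconstruction term relative to the KL term, and vice versa.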


