Diffusion Priors In Variational Autoencoders

06/29/2021
by Antoine Wehenkel et al.

Among likelihood-based approaches to deep generative modelling, variational autoencoders (VAEs) offer scalable amortized posterior inference and fast sampling. However, VAEs are increasingly outperformed by competing models such as normalizing flows (NFs), deep energy-based models, and the recent denoising diffusion probabilistic models (DDPMs). In this preliminary work, we improve VAEs by demonstrating how DDPMs can be used to model the prior distribution of the latent variables. The diffusion prior improves upon the Gaussian priors of classical VAEs and is competitive with NF-based priors. Finally, we hypothesize that hierarchical VAEs could similarly benefit from the enhanced capacity of diffusion priors.
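The idea sketched in the abstract, replacing the VAE's Gaussian prior with a DDPM trained on the latent codes, can be illustrated with a small PyTorch example. This is a minimal, hedged sketch, not the authors' implementation: the network sizes, the linear noise schedule, the crude scalar time embedding, and the combined training objective are all illustrative assumptions.

```python
import torch
import torch.nn as nn

# Illustrative sketch (not the paper's code): a VAE whose latent prior p(z)
# is modelled by a denoising diffusion model trained on the latents.

LATENT_DIM, T = 8, 100

# Linear noise schedule for the diffusion process over latents (assumed).
betas = torch.linspace(1e-4, 0.02, T)
alphas_bar = torch.cumprod(1.0 - betas, dim=0)

encoder = nn.Linear(32, 2 * LATENT_DIM)          # outputs (mu, log_var)
decoder = nn.Linear(LATENT_DIM, 32)
denoiser = nn.Sequential(                        # predicts the added noise
    nn.Linear(LATENT_DIM + 1, 64), nn.ReLU(), nn.Linear(64, LATENT_DIM)
)

def diffusion_prior_loss(z):
    """Standard DDPM epsilon-prediction loss, applied to VAE latents z."""
    t = torch.randint(0, T, (z.shape[0],))
    eps = torch.randn_like(z)
    a_bar = alphas_bar[t].unsqueeze(-1)
    z_t = a_bar.sqrt() * z + (1.0 - a_bar).sqrt() * eps  # noised latent
    t_emb = t.float().unsqueeze(-1) / T                  # crude time embedding
    eps_hat = denoiser(torch.cat([z_t, t_emb], dim=-1))
    return ((eps - eps_hat) ** 2).mean()

x = torch.randn(16, 32)                          # dummy data batch
mu, log_var = encoder(x).chunk(2, dim=-1)
z = mu + torch.randn_like(mu) * (0.5 * log_var).exp()    # reparameterisation
recon = decoder(z)

# Reconstruction term plus the diffusion prior term on the latents.
loss = ((recon - x) ** 2).mean() + diffusion_prior_loss(z)
```

In a real training loop both terms would be backpropagated jointly (or the prior fitted in a second stage); the point here is only that the standard DDPM noise-prediction loss applies unchanged when its "data" are the VAE's latent codes.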

Related research

05/24/2023 · Dior-CVAE: Diffusion Priors in Variational Dialog Generation
Conditional variational autoencoders (CVAEs) have been used recently for...

11/14/2022 · Denoising Diffusion Models for Out-of-Distribution Detection
Out-of-distribution detection is crucial to the safe deployment of machi...

09/10/2019 · Learning Priors for Adversarial Autoencoders
Most deep latent factor models choose simple priors for simplicity, trac...

01/12/2023 · Thompson Sampling with Diffusion Generative Prior
In this work, we initiate the idea of using denoising diffusion models t...

10/26/2018 · Resampled Priors for Variational Autoencoders
We propose Learned Accept/Reject Sampling (LARS), a method for construct...

09/05/2023 · Efficient Bayesian Computational Imaging with a Surrogate Score-Based Prior
We propose a surrogate function for efficient use of score-based priors ...

05/25/2023 · Revisiting Structured Variational Autoencoders
Structured variational autoencoders (SVAEs) combine probabilistic graphi...
