
Variational Diffusion Models

by Diederik P. Kingma et al.

Diffusion-based generative models have demonstrated a capacity for perceptually impressive synthesis, but can they also be great likelihood-based models? We answer this in the affirmative, and introduce a family of diffusion-based generative models that obtain state-of-the-art likelihoods on standard image density estimation benchmarks. Unlike other diffusion-based models, our method allows for efficient optimization of the noise schedule jointly with the rest of the model. We show that the variational lower bound (VLB) simplifies to a remarkably short expression in terms of the signal-to-noise ratio of the diffused data, thereby improving our theoretical understanding of this model class. Using this insight, we prove an equivalence between several models proposed in the literature. In addition, we show that the continuous-time VLB is invariant to the noise schedule, except for the signal-to-noise ratio at its endpoints. This enables us to learn a noise schedule that minimizes the variance of the resulting VLB estimator, leading to faster optimization. Combining these advances with architectural improvements, we obtain state-of-the-art likelihoods on image density estimation benchmarks, outperforming autoregressive models that have dominated these benchmarks for many years, with often significantly faster optimization. In addition, we show how to turn the model into a bits-back compression scheme, and demonstrate lossless compression rates close to the theoretical optimum.
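The signal-to-noise-ratio parametrization central to the abstract can be illustrated with a short sketch. This is a minimal, hypothetical example (not the authors' code), assuming the variance-preserving formulation in which the noise schedule is written as gamma(t) = -log SNR(t), so that alpha_t^2 = sigmoid(-gamma(t)) and sigma_t^2 = sigmoid(gamma(t)); the function name `diffuse` is invented for illustration.

```python
import numpy as np

def diffuse(x, gamma_t, rng):
    """Sample z_t ~ q(z_t | x) under a variance-preserving noise schedule.

    With gamma_t = -log SNR(t):
      alpha_t^2 = sigmoid(-gamma_t), sigma_t^2 = sigmoid(gamma_t),
    so alpha_t^2 + sigma_t^2 = 1 and SNR(t) = alpha_t^2 / sigma_t^2
    = exp(-gamma_t).
    """
    alpha = np.sqrt(1.0 / (1.0 + np.exp(gamma_t)))   # sqrt(sigmoid(-gamma_t))
    sigma = np.sqrt(1.0 / (1.0 + np.exp(-gamma_t)))  # sqrt(sigmoid(gamma_t))
    eps = rng.standard_normal(np.shape(x))           # standard Gaussian noise
    return alpha * x + sigma * eps, eps
```

Because the continuous-time VLB depends on the schedule only through the SNR at its endpoints, the shape of gamma(t) between those endpoints is free to be learned, e.g. to reduce the variance of the VLB estimator as the abstract describes.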


Related Papers


Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design

Flow-based generative models are powerful exact likelihood models with e...

MaCow: Masked Convolutional Generative Flow

Flow-based generative models, conceptually attractive due to tractabilit...

Diffusion Models: A Comprehensive Survey of Methods and Applications

Diffusion models are a class of deep generative models that have shown i...

Optimal Variance Control of the Score Function Gradient Estimator for Importance Weighted Bounds

This paper introduces novel results for the score function gradient esti...

One-Line-of-Code Data Mollification Improves Optimization of Likelihood-based Generative Models

Generative Models (GMs) have attracted considerable attention due to the...

Learning Energy-Based Models by Cooperative Diffusion Recovery Likelihood

Training energy-based models (EBMs) with maximum likelihood estimation o...

Analyzing Diffusion as Serial Reproduction

Diffusion models are a class of generative models that learn to synthesi...

Code Repositories


DiffWave with variational diffusion models
