Learning Energy-Based Models by Cooperative Diffusion Recovery Likelihood

09/10/2023
by   Yaxuan Zhu, et al.

Training energy-based models (EBMs) with maximum likelihood estimation on high-dimensional data can be both challenging and time-consuming. As a result, there is a noticeable gap in sample quality between EBMs and other generative frameworks like GANs and diffusion models. To close this gap, inspired by the recent efforts of learning EBMs by maximizing diffusion recovery likelihood (DRL), we propose cooperative diffusion recovery likelihood (CDRL), an effective approach to tractably learn and sample from a series of EBMs defined on increasingly noisy versions of a dataset, paired with an initializer model for each EBM. At each noise level, the initializer model learns to amortize the sampling process of the EBM, and the two models are jointly estimated within a cooperative training framework. Samples from the initializer serve as starting points that are refined by a few sampling steps from the EBM. With the refined samples, the EBM is optimized by maximizing recovery likelihood, while the initializer is optimized by learning from the difference between the refined samples and the initial samples. We develop a new noise schedule and a variance reduction technique to further improve the sample quality. Combining these advances, we significantly boost the FID scores compared to existing EBM methods on CIFAR-10 and ImageNet 32x32, with a 2x speedup over DRL. In addition, we extend our method to compositional generation and image inpainting tasks, and showcase the compatibility of CDRL with classifier-free guidance for conditional generation, achieving similar trade-offs between sample quality and sample diversity as in diffusion models.
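The cooperative loop described above can be sketched in a toy 1-D setting. This is an illustrative assumption-laden sketch, not the paper's method: it uses a single noise level, a two-parameter exponential-family energy f(x) = -0.5·theta·x² + b·x in place of a neural EBM, and a learned scalar shift in place of a neural initializer network. It does show the four stages named in the abstract: the initializer proposes a sample from the noisy input, a few Langevin steps under the EBM refine it, the EBM is updated by the recovery-likelihood gradient (statistics on data minus statistics on refined samples), and the initializer is updated from the difference between refined and initial samples.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (assumptions, not from the paper): clean data ~ N(0, 1);
# EBM energy f(x) = -0.5 * theta * x**2 + b * x; the initializer is a
# learned shift x0 = y + w standing in for an initializer network.
sigma = 0.5                    # single noise level
theta, b = 0.2, 0.0            # EBM parameters
w = 0.0                        # initializer parameter
lr_ebm, lr_init = 0.05, 0.1
K, step = 10, 0.05             # Langevin refinement steps and step size

def langevin_refine(x, y):
    """K Langevin steps on p(x | y) ~ exp(f(x) - (x - y)^2 / (2 sigma^2))."""
    for _ in range(K):
        grad = (-theta * x + b) - (x - y) / sigma**2
        x = x + step * grad + np.sqrt(2 * step) * rng.standard_normal(x.shape)
    return x

for _ in range(3000):
    x_data = rng.standard_normal(64)              # clean batch
    y = x_data + sigma * rng.standard_normal(64)  # noisy version
    x0 = y + w                                    # initializer proposal
    x_ref = langevin_refine(x0, y)                # refinement by the EBM

    # Recovery-likelihood gradient: sufficient statistics on observed
    # data minus on refined (synthesized) samples.
    theta += lr_ebm * np.mean(-0.5 * x_data**2 + 0.5 * x_ref**2)
    b += lr_ebm * np.mean(x_data - x_ref)

    # Initializer learns from the difference between the refined and
    # the initial samples (here a simple moment-matching update).
    w += lr_init * np.mean(x_ref - x0)
```

In this toy case the fixed point of the EBM update recovers the data precision (theta near 1), and the initializer shift settles near zero once its proposals no longer need correcting; in the paper both models are deep networks and the scheme runs at every noise level of the diffusion schedule.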

