Towards Faster Non-Asymptotic Convergence for Diffusion-Based Generative Models

06/15/2023
by Gen Li, et al.

Diffusion models, which convert noise into new data instances by learning to reverse a Markov diffusion process, have become a cornerstone of contemporary generative modeling. While their practical power is now widely recognized, their theoretical underpinnings remain far from mature. In this work, we develop a suite of non-asymptotic theory for understanding the data generation process of diffusion models in discrete time, assuming access to reliable estimates of the (Stein) score functions. For a popular deterministic sampler (based on the probability flow ODE), we establish a convergence rate proportional to 1/T (with T the total number of steps), improving upon past results; for another mainstream stochastic sampler (a variant of the denoising diffusion probabilistic model, DDPM), we derive a convergence rate proportional to 1/√T, matching state-of-the-art theory. Our theory imposes only minimal assumptions on the target data distribution (e.g., no smoothness assumption is required), and is developed via an elementary yet versatile non-asymptotic approach that does not resort to SDE or ODE toolboxes. Further, we design two accelerated variants, improving the convergence to 1/T^2 for the ODE-based sampler and to 1/T for the DDPM-type sampler, which might be of independent theoretical and empirical interest.
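To make the two sampler families in the abstract concrete, here is a minimal, self-contained sketch (Python/NumPy; not code from the paper): a stochastic DDPM-type reverse update and a forward-Euler discretization of the probability flow ODE, both driven by the same score function. For self-containment, the score is computed in closed form for a hypothetical 1-D Gaussian target rather than estimated by a network; the noise schedule, step rules, and names such as `score_fn` and `alpha_bar` are illustrative choices, not the paper's exact algorithms or its accelerated variants.

```python
import numpy as np

num_steps = 1000                               # T: total number of discretization steps
betas = np.linspace(1e-4, 0.02, num_steps)     # forward noise schedule beta_t (illustrative)
alphas = 1.0 - betas
alpha_bar = np.cumprod(alphas)                 # \bar{alpha}_t = prod_{s <= t} (1 - beta_s)

# Hypothetical 1-D target: X_0 ~ N(mu0, sigma0^2).  The forward-perturbed marginal is
# N(sqrt(abar_t) * mu0, abar_t * sigma0^2 + 1 - abar_t), so its score is known exactly.
mu0, sigma0 = 2.0, 0.5

def score_fn(x, t):
    """Exact Stein score of the perturbed marginal at step t (stand-in for a learned score)."""
    mean = np.sqrt(alpha_bar[t]) * mu0
    var = alpha_bar[t] * sigma0 ** 2 + (1.0 - alpha_bar[t])
    return -(x - mean) / var

def sample_ddpm(n, rng):
    """Stochastic DDPM-type sampler: each reverse step re-injects fresh Gaussian noise."""
    x = rng.standard_normal(n)                 # initialize at X_T ~ N(0, 1)
    for t in range(num_steps - 1, 0, -1):
        x = (x + betas[t] * score_fn(x, t)) / np.sqrt(alphas[t]) \
            + np.sqrt(betas[t]) * rng.standard_normal(n)
    return x

def sample_prob_flow_ode(n, rng):
    """Deterministic sampler: forward-Euler discretization of the probability flow ODE."""
    x = rng.standard_normal(n)                 # initialize at X_T ~ N(0, 1)
    for t in range(num_steps - 1, 0, -1):
        # reverse-time Euler step of dx = -0.5 * beta_t * (x + score) dt, with dt = -1
        x = x + 0.5 * betas[t] * (x + score_fn(x, t))
    return x

rng = np.random.default_rng(0)
ddpm_samples = sample_ddpm(10_000, rng)
ode_samples = sample_prob_flow_ode(10_000, rng)
print(f"target      : mean {mu0:.3f}, std {sigma0:.3f}")
print(f"DDPM sampler: mean {ddpm_samples.mean():.3f}, std {ddpm_samples.std():.3f}")
print(f"ODE sampler : mean {ode_samples.mean():.3f}, std {ode_samples.std():.3f}")
```

The only difference between the two loops is the injected noise term: the ODE-based sampler is deterministic given its initialization, the regime for which the abstract's 1/T (and accelerated 1/T^2) rates apply, while the noisy DDPM-type update corresponds to the 1/√T (and accelerated 1/T) rates.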

Related research

09/22/2022 · Sampling is as easy as learning the score: theory for diffusion models with minimal data assumptions
06/08/2023 · Interpreting and Improving Diffusion Models Using the Euclidean Distance Function
03/06/2023 · Restoration-Degradation Beyond Linear Diffusions: A Non-Asymptotic Analysis For DDIM-Type Samplers
08/23/2023 · Boosting Diffusion Models with an Adaptive Momentum Sampler
11/30/2021 · Convergence Rate of Multiple-try Metropolis Independent sampler
05/19/2023 · The probability flow ODE is provably fast
12/26/2021 · Itô-Taylor Sampling Scheme for Denoising Diffusion Probabilistic Models using Ideal Derivatives
