Maximum Likelihood Training of Implicit Nonlinear Diffusion Models

05/27/2022
by   Dongjun Kim, et al.

Although diverse variants of diffusion models exist, only a few works have investigated extending the linear diffusion into a nonlinear diffusion process. The effect of nonlinearity remains poorly understood, but intuitively there should be diffusion patterns better suited to optimally training the generative distribution toward the data distribution. This paper introduces such a data-adaptive, nonlinear diffusion process for score-based diffusion models. The proposed Implicit Nonlinear Diffusion Model (INDM) learns the nonlinear diffusion process by combining a normalizing flow with a diffusion process. Specifically, INDM implicitly constructs a nonlinear diffusion on the data space by leveraging a linear diffusion on the latent space through a flow network. This flow network is the key to forming the nonlinear diffusion, as the nonlinearity depends entirely on the flow network. This flexible nonlinearity improves the learning curve of INDM to nearly MLE training, compared with the non-MLE training of DDPM++, which turns out to be a special case of INDM with the identity flow. Moreover, training the nonlinear diffusion empirically yields a sampling-friendly latent diffusion whose sample trajectory is closer to an optimal transport than the trajectories of previous models. In experiments, INDM achieves state-of-the-art FID on CelebA.
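The construction described above can be illustrated with a minimal, hypothetical PyTorch sketch (not the authors' released code): a normalizing flow maps data to a latent space, a standard linear VP-type diffusion perturbs the latent, and the denoising score matching loss is coupled with the flow's log-determinant. The module names, noise-schedule constants, and the simplified weighting are assumptions for illustration; the paper's exact variational objective is more involved.

```python
import torch
import torch.nn as nn

# Minimal sketch of the INDM idea, assuming flattened latents of shape (B, D).
# `flow` and `score_net` are hypothetical user-supplied modules:
#   flow(x)           -> (z, logdet)   invertible normalizing flow
#   score_net(z_t, t) -> score         score model in latent space
# The nonlinearity of the data-space diffusion lives entirely in `flow`;
# the latent-space diffusion is the standard linear VP-SDE.

class ImplicitNonlinearDiffusion(nn.Module):
    def __init__(self, flow: nn.Module, score_net: nn.Module,
                 beta_min: float = 0.1, beta_max: float = 20.0):
        super().__init__()
        self.flow = flow
        self.score_net = score_net
        self.beta_min = beta_min
        self.beta_max = beta_max

    def marginal(self, t):
        # Mean coefficient and std of the VP-SDE perturbation kernel at time t.
        log_mean_coeff = -0.25 * t ** 2 * (self.beta_max - self.beta_min) \
                         - 0.5 * t * self.beta_min
        mean_coeff = torch.exp(log_mean_coeff)
        std = torch.sqrt(1.0 - torch.exp(2.0 * log_mean_coeff))
        return mean_coeff, std

    def loss(self, x):
        # 1) Push data through the flow to obtain latents and the log-det term.
        z, logdet = self.flow(x)
        # 2) Perturb the latent with the *linear* diffusion at a random time t.
        t = torch.rand(z.shape[0], device=z.device) * (1.0 - 1e-5) + 1e-5
        mean_coeff, std = self.marginal(t)
        noise = torch.randn_like(z)
        z_t = mean_coeff[:, None] * z + std[:, None] * noise
        # 3) Denoising score matching in latent space, coupled with the flow's
        #    log-determinant; the paper's likelihood weighting is simplified here.
        score = self.score_net(z_t, t)
        dsm = ((score * std[:, None] + noise) ** 2).sum(dim=-1)
        return (dsm - logdet).mean()
```

In this sketch, setting `flow` to the identity map (with zero log-determinant) reduces the objective to ordinary latent-free denoising score matching, which mirrors the abstract's observation that DDPM++ is a special case of INDM with the identity flow.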
