Diff-Instruct: A Universal Approach for Transferring Knowledge From Pre-trained Diffusion Models

05/29/2023
by   Weijian Luo, et al.

Due to the ease of training, ability to scale, and high sample quality, diffusion models (DMs) have become the preferred option for generative modeling, with numerous pre-trained models available for a wide variety of datasets. Containing intricate information about data distributions, pre-trained DMs are valuable assets for downstream applications. In this work, we consider learning from pre-trained DMs and transferring their knowledge to other generative models in a data-free fashion. Specifically, we propose a general framework called Diff-Instruct to instruct the training of arbitrary generative models as long as the generated samples are differentiable with respect to the model parameters. Our proposed Diff-Instruct is built on a rigorous mathematical foundation where the instruction process directly corresponds to minimizing a novel divergence we call the Integral Kullback-Leibler (IKL) divergence. IKL is tailored for DMs by calculating the integral of the KL divergence along a diffusion process, which we show to be more robust for comparing distributions with misaligned supports. We also reveal non-trivial connections of our method to existing works such as DreamFusion and generative adversarial training. To demonstrate the effectiveness and universality of Diff-Instruct, we consider two scenarios: distilling pre-trained diffusion models and refining existing GAN models. The experiments on distilling pre-trained diffusion models show that Diff-Instruct results in state-of-the-art single-step diffusion-based models. The experiments on refining GAN models show that Diff-Instruct can consistently improve the pre-trained generators of GAN models across various settings.
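The abstract describes the IKL divergence as the integral of the KL divergence along a diffusion process, i.e. roughly IKL(p, q) = ∫ w(t) · KL(p_t || q_t) dt, where p_t and q_t are the marginals of p and q after diffusing for time t. The following is a minimal numerical sketch of why this is more robust than plain KL for distributions with misaligned supports. It assumes, purely for illustration, a 1-D variance-preserving diffusion with schedule a_t = exp(-t), Gaussian input distributions (so the diffused marginals stay Gaussian and KL has a closed form), and a constant weighting w(t) = 1; the paper's actual schedule and weighting may differ.

```python
import math

def kl_gauss(m1, v1, m2, v2):
    # Closed-form KL( N(m1, v1) || N(m2, v2) ) between 1-D Gaussians.
    return 0.5 * (math.log(v2 / v1) + (v1 + (m1 - m2) ** 2) / v2 - 1.0)

def diffused(m, v, t):
    # Variance-preserving forward diffusion of a Gaussian N(m, v):
    # x_t = sqrt(a_t) * x_0 + sqrt(1 - a_t) * eps,  a_t = exp(-t)  (illustrative schedule).
    a = math.exp(-t)
    return math.sqrt(a) * m, a * v + (1.0 - a)

def ikl(p, q, T=5.0, steps=500):
    # Riemann-sum approximation of IKL(p, q) = int_0^T KL(p_t || q_t) dt, with w(t) = 1.
    dt = T / steps
    total = 0.0
    for i in range(steps):
        t = (i + 0.5) * dt
        m1, v1 = diffused(p[0], p[1], t)
        m2, v2 = diffused(q[0], q[1], t)
        total += kl_gauss(m1, v1, m2, v2) * dt
    return total

# Two narrow Gaussians with nearly disjoint supports: the KL at t = 0 is
# enormous, but diffusion smooths both marginals, so the integrand decays
# with t and the integral stays finite and informative.
print(ikl((0.0, 1e-4), (3.0, 1e-4)))
```

The key point of the sketch is that the diffused marginals overlap for every t > 0, so the integrand is finite everywhere, and the integral still shrinks as the two distributions move closer together.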


