Accelerating Diffusion Sampling with Classifier-based Feature Distillation

11/22/2022
by Wujie Sun, et al.

Although diffusion models have shown great potential for generating higher-quality images than GANs, their slow sampling speed hinders wide application in practice. Progressive distillation has therefore been proposed for fast sampling: the output images of an N-step teacher sampler are progressively aligned with those of an N/2-step student sampler. In this paper, we argue that this distillation-based acceleration can be further improved, especially for few-step samplers, with our proposed Classifier-based Feature Distillation (CFD). Instead of aligning output images, we distill the teacher's sharpened feature distribution into the student through a dataset-independent classifier, making the student focus on the important features and thereby improving performance. We also introduce a dataset-oriented loss to further optimize the model. Experiments on CIFAR-10 show the superiority of our method in achieving high-quality and fast sampling. Code will be released soon.
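The abstract only names the key ingredient of CFD: matching the teacher's sharpened feature distribution through a dataset-independent classifier rather than matching output pixels. Below is a minimal PyTorch sketch of what such a loss could look like; the function name cfd_loss, the temperature value, and the use of the classifier's final logits are illustrative assumptions rather than the paper's exact formulation.

```python
import torch
import torch.nn.functional as F

def cfd_loss(classifier, teacher_img, student_img, temperature=0.5):
    """Sketch of a classifier-based feature-distillation loss.

    Both the teacher's and the student's sampled images are passed through
    a frozen, dataset-independent classifier, and the student is trained to
    match the teacher's sharpened feature distribution instead of the
    teacher's raw pixels. A temperature below 1 sharpens the teacher's
    distribution; the exact value used in the paper is not assumed here.
    """
    with torch.no_grad():
        t_logits = classifier(teacher_img)   # teacher features, no gradient
    s_logits = classifier(student_img)       # gradients flow back into the student sampler

    t_prob = F.softmax(t_logits / temperature, dim=-1)
    s_log_prob = F.log_softmax(s_logits / temperature, dim=-1)
    # KL divergence between the sharpened teacher and student distributions.
    return F.kl_div(s_log_prob, t_prob, reduction="batchmean")
```

In the progressive setup described in the abstract, this term would be computed between the N-step teacher's and the N/2-step student's outputs and combined with the dataset-oriented loss; the weighting between the two terms and the choice of classifier layer are left open here.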

Related research

- Progressive Distillation for Fast Sampling of Diffusion Models (02/01/2022): Diffusion models have recently shown great promise for generative modeli...
- Catch-Up Distillation: You Only Need to Train Once for Accelerating Sampling (05/18/2023): Diffusion Probability Models (DPMs) have made impressive advancements in...
- Improved Feature Distillation via Projector Ensemble (10/27/2022): In knowledge distillation, previous feature distillation methods mainly ...
- InstaFlow: One Step is Enough for High-Quality Diffusion-Based Text-to-Image Generation (09/12/2023): Diffusion models have revolutionized text-to-image generation with its e...
- Inductive CaloFlow (05/19/2023): Simulating particle detector response is the single most expensive step ...
- CaloClouds II: Ultra-Fast Geometry-Independent Highly-Granular Calorimeter Simulation (09/11/2023): Fast simulation of the energy depositions in high-granular detectors is ...
- New Perspective on Progressive GANs Distillation for One-class Novelty Detection (09/15/2021): One-class novelty detection is conducted to identify anomalous instances...
