Towards Accurate Data-free Quantization for Diffusion Models

05/30/2023
by   Changyuan Wang, et al.

In this paper, we propose an accurate data-free post-training quantization framework for diffusion models (ADP-DM) for efficient image generation. Conventional data-free quantization methods learn shared quantization functions for tensor discretization regardless of the generation timestep, even though the activation distribution differs significantly across timesteps. Moreover, their calibration images are acquired at random timesteps, which fail to provide sufficient information for learning a generalizable quantization function. Both issues cause sizable quantization errors and noticeable degradation of image generation quality. In contrast, we design group-wise quantization functions for activation discretization at different timesteps and sample the optimal timestep for informative calibration image generation, so that our quantized diffusion model reduces discretization errors with negligible computational overhead. Specifically, we partition the timesteps according to the importance weights of the quantization functions in each group, which are optimized by a differentiable search algorithm. We also select the optimal timestep for calibration image generation following the structural risk minimization principle, in order to enhance the generalization ability of the deployed quantized diffusion model. Extensive experimental results show that our method outperforms state-of-the-art post-training quantization of diffusion models by a sizable margin at similar computational cost.
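The group-wise quantization idea above can be sketched in a few lines. The snippet below is a minimal illustration, not the paper's implementation: it assumes a uniform symmetric quantizer, one learnable step size per timestep group, and a softmax over hypothetical per-timestep importance logits as the differentiable group assignment. All class and variable names (`GroupwiseActQuantizer`, `logits`, `scales`) are invented for illustration.

```python
import numpy as np

def uniform_quantize(x, scale, num_bits=8):
    """Uniform symmetric quantization: round to the grid, clip, rescale."""
    qmax = 2 ** (num_bits - 1) - 1
    q = np.clip(np.round(x / scale), -qmax, qmax)
    return q * scale

class GroupwiseActQuantizer:
    """Hypothetical sketch of group-wise activation quantization:
    each timestep group owns one step size, and timesteps are assigned
    to groups via softmax over importance logits, so the assignment is
    differentiable during search."""

    def __init__(self, num_timesteps, num_groups, init_scales):
        self.num_groups = num_groups
        # importance logits: one entry per (timestep, group) pair
        self.logits = np.zeros((num_timesteps, num_groups))
        self.scales = np.array(init_scales, dtype=float)  # one per group

    def group_probs(self, t):
        # numerically stable softmax over group logits for timestep t
        z = self.logits[t] - self.logits[t].max()
        e = np.exp(z)
        return e / e.sum()

    def quantize(self, x, t):
        # soft mixture of group quantizers during the search phase;
        # at deployment, only the argmax group's quantizer would run
        p = self.group_probs(t)
        out = np.zeros_like(x, dtype=float)
        for g in range(self.num_groups):
            out += p[g] * uniform_quantize(x, self.scales[g])
        return out
```

After the search converges, each timestep would use only its highest-weight group, so inference adds essentially no overhead relative to a single shared quantizer.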


Related research:

06/04/2023 · Temporal Dynamic Quantization for Diffusion Models
The diffusion model has gained popularity in vision applications due to ...

08/25/2022 · Efficient Activation Quantization via Adaptive Rounding Border for Post-Training Quantization
Post-training quantization (PTQ) attracts increasing attention due to it...

05/10/2023 · Post-training Model Quantization Using GANs for Synthetic Data Generation
Quantization is a widely adopted technique for deep neural networks to r...

04/05/2020 · Feature Quantization Improves GAN Training
The instability in GAN training has been a long-standing problem despite...

02/08/2023 · Q-Diffusion: Quantizing Diffusion Models
Diffusion models have achieved great success in synthesizing diverse and...

11/28/2022 · Post-training Quantization on Diffusion Models
Denoising diffusion (score-based) generative models have recently achiev...

03/13/2023 · Adaptive Data-Free Quantization
Data-free quantization (DFQ) recovers the performance of quantized netwo...
