Structural Pruning for Diffusion Models

05/18/2023
by Gongfan Fang, et al.

Generative modeling has recently undergone remarkable advancements, primarily propelled by the transformative implications of Diffusion Probabilistic Models (DPMs). The impressive capability of these models, however, often entails significant computational overhead during both training and inference. To tackle this challenge, we present Diff-Pruning, an efficient compression method tailored for learning lightweight diffusion models from pre-existing ones, without the need for extensive re-training. The essence of Diff-Pruning is encapsulated in a Taylor expansion over pruned timesteps, a process that disregards non-contributory diffusion steps and ensembles informative gradients to identify important weights. Our empirical assessment, undertaken across four diverse datasets, highlights two primary benefits of our proposed method: 1) Efficiency: it enables approximately a 50% reduction in FLOPs at a mere 10% to 20% of the original training expenditure; 2) Consistency: the pruned diffusion models inherently preserve generative behavior congruent with their pre-trained progenitors. Code is available at <https://github.com/VainF/Diff-Pruning>.
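The scoring idea the abstract describes, a first-order Taylor expansion of the diffusion loss accumulated only over timesteps that actually contribute, can be sketched roughly as follows. This is a minimal illustration under stated assumptions, not the repository's implementation: it assumes a standard DDPM forward process with an ε-prediction denoiser, and the names `taylor_importance`, `denoiser`, and `alphas_cumprod` are hypothetical.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

def taylor_importance(denoiser: nn.Module,
                      x0: torch.Tensor,
                      timesteps: torch.Tensor,
                      alphas_cumprod: torch.Tensor) -> dict:
    """Accumulate first-order Taylor scores |w * dL_t/dw| over the kept timesteps.

    Steps whose loss contributes little (e.g. heavily noised late steps) are
    simply left out of `timesteps`, mirroring the idea of disregarding
    non-contributory diffusion steps.
    """
    scores = {name: torch.zeros_like(p) for name, p in denoiser.named_parameters()}
    for t in timesteps:
        noise = torch.randn_like(x0)
        a_bar = alphas_cumprod[t]
        # Standard DDPM forward process: x_t = sqrt(a_bar) * x0 + sqrt(1 - a_bar) * eps
        x_t = a_bar.sqrt() * x0 + (1.0 - a_bar).sqrt() * noise
        denoiser.zero_grad()
        pred_noise = denoiser(x_t, t.expand(x0.shape[0]))  # eps-prediction denoiser
        loss = F.mse_loss(pred_noise, noise)
        loss.backward()
        # First-order Taylor term: removing weight w changes the loss by
        # roughly |w * g|, so gradients from all kept steps are ensembled
        # into a single score per weight.
        for name, p in denoiser.named_parameters():
            if p.grad is not None:
                scores[name] += (p * p.grad).abs()
    return scores
```

In practice the per-weight scores would be aggregated channel-wise and handed to a structural pruner (the public repository builds on the authors' Torch-Pruning library), after which the pruned model is briefly fine-tuned rather than re-trained from scratch.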
