Token Merging for Fast Stable Diffusion

03/30/2023
by Daniel Bolya, et al.

The landscape of image generation has been forever changed by open vocabulary diffusion models. However, at their core these models use transformers, which makes generation slow. Better implementations to increase the throughput of these transformers have emerged, but they still evaluate the entire model. In this paper, we instead speed up diffusion models by exploiting the natural redundancy in generated images and merging redundant tokens. After making some diffusion-specific improvements to Token Merging (ToMe), our ToMe for Stable Diffusion can reduce the number of tokens in an existing Stable Diffusion model by up to 60% without any extra training. In the process, we speed up image generation by up to 2x and reduce memory consumption by up to 5.6x. Furthermore, this speed-up stacks with efficient implementations such as xFormers, minimally impacting quality while being up to 5.4x faster for large images. Code is available at https://github.com/dbolya/tomesd.
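The core idea of merging redundant tokens can be illustrated with a minimal sketch of bipartite soft matching, the matching scheme ToMe is built on: split the tokens into two alternating sets, find the r most similar cross-set pairs by cosine similarity, and average each matched pair. This is a simplified NumPy illustration, not the official tomesd implementation (which is PyTorch and includes the paper's diffusion-specific improvements); the function name and shapes are chosen here for exposition.

```python
import numpy as np

def merge_tokens(tokens: np.ndarray, r: int) -> np.ndarray:
    """Sketch of bipartite soft matching: merge the r most similar
    token pairs, reducing (N, C) tokens to (N - r, C)."""
    # Split tokens into two alternating sets A and B.
    a, b = tokens[0::2], tokens[1::2]
    # Cosine similarity between every token in A and every token in B.
    an = a / np.linalg.norm(a, axis=1, keepdims=True)
    bn = b / np.linalg.norm(b, axis=1, keepdims=True)
    scores = an @ bn.T  # shape (|A|, |B|)
    # Each A token's single best match in B (this is the "soft" part:
    # no full bipartite matching is solved, just one edge per A token).
    best = scores.argmax(axis=1)
    best_score = scores.max(axis=1)
    # Keep the r highest-scoring edges; those A tokens get merged into B.
    merged_a = np.argsort(-best_score)[:r]
    keep_a = np.setdiff1d(np.arange(a.shape[0]), merged_a)
    b = b.copy()
    counts = np.ones(b.shape[0])
    for i in merged_a:
        j = best[i]
        # Running average so several A tokens can merge into one B token.
        b[j] = (b[j] * counts[j] + a[i]) / (counts[j] + 1)
        counts[j] += 1
    return np.concatenate([a[keep_a], b], axis=0)
```

Because merging only averages features, the operation needs no training; in the paper's setting it is applied inside the transformer blocks of an existing Stable Diffusion model.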


Related research

09/25/2022 · Personalizing Text-to-Image Generation via Aesthetic Gradients
This work proposes aesthetic gradients, a method to personalize a CLIP-c...

06/01/2023 · The Hidden Language of Diffusion Models
Text-to-image diffusion models have demonstrated an unparalleled ability...

04/14/2023 · M2T: Masking Transformers Twice for Faster Decoding
We show how bidirectional transformers trained for masked token predicti...

11/29/2022 · Wavelet Diffusion Models are fast and scalable Image Generators
Diffusion models are rising as a powerful solution for high-fidelity ima...

03/25/2023 · Masked Diffusion Transformer is a Strong Image Synthesizer
Despite its success in image synthesis, we observe that diffusion probab...

05/31/2023 · Tree-Ring Watermarks: Fingerprints for Diffusion Images that are Invisible and Robust
Watermarking the outputs of generative models is a crucial technique for...

10/17/2022 · Token Merging: Your ViT But Faster
We introduce Token Merging (ToMe), a simple method to increase the throu...
