Addressing Negative Transfer in Diffusion Models

06/01/2023
by Hyojun Go, et al.

Diffusion-based generative models have achieved remarkable success in various domains. They train a model on denoising tasks that span different noise levels simultaneously, which is a form of multi-task learning (MTL). However, analyzing and improving diffusion models from an MTL perspective remains under-explored. In particular, MTL can lead to the well-known phenomenon of negative transfer, in which conflicts between tasks degrade the performance of some of them. In this paper, we analyze diffusion training from an MTL standpoint and present two key observations: (O1) the task affinity between denoising tasks diminishes as the gap between their noise levels widens, and (O2) negative transfer can arise even in diffusion training. Building on these observations, we aim to enhance diffusion training by mitigating negative transfer. To achieve this, we propose leveraging existing MTL methods, but the huge number of denoising tasks makes computing the required per-task loss or gradient expensive. To address this challenge, we cluster the denoising tasks into a small number of task clusters and apply MTL methods to the clusters. Specifically, based on (O2), we employ interval clustering to enforce temporal proximity among the denoising tasks within each cluster. We show that interval clustering can be solved with dynamic programming, using signal-to-noise ratio, timestep, or task affinity as the clustering objective. By enabling efficient computation of MTL methods, our approach addresses the issue of negative transfer in diffusion models. We validate the proposed clustering and its integration with MTL methods through various experiments, demonstrating improved sample quality of diffusion models.
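The interval-clustering idea sketched in the abstract can be illustrated with a small dynamic program. The sketch below is an illustrative assumption, not the authors' implementation: `interval_cluster` partitions a 1-D sequence of per-timestep statistics (e.g., a log-SNR schedule) into `k` contiguous intervals that minimize the total within-interval variance; the paper also considers timestep- and task-affinity-based objectives.

```python
def interval_cluster(values, k):
    """Partition values[0:n] into k contiguous intervals minimizing the
    sum of within-interval squared deviations, via dynamic programming.

    Returns (list of (start, end) half-open intervals, total cost).
    Illustrative sketch; the clustering objective is an assumption.
    """
    n = len(values)
    # Prefix sums of values and squared values for O(1) interval cost.
    ps, ps2 = [0.0], [0.0]
    for v in values:
        ps.append(ps[-1] + v)
        ps2.append(ps2[-1] + v * v)

    def cost(i, j):
        # Sum of squared deviations from the mean over values[i:j].
        s = ps[j] - ps[i]
        s2 = ps2[j] - ps2[i]
        return s2 - s * s / (j - i)

    INF = float("inf")
    # dp[c][j]: best cost of splitting values[:j] into c intervals.
    dp = [[INF] * (n + 1) for _ in range(k + 1)]
    cut = [[0] * (n + 1) for _ in range(k + 1)]
    dp[0][0] = 0.0
    for c in range(1, k + 1):
        for j in range(c, n + 1):
            for i in range(c - 1, j):
                cand = dp[c - 1][i] + cost(i, j)
                if cand < dp[c][j]:
                    dp[c][j] = cand
                    cut[c][j] = i

    # Recover the interval boundaries by walking the cut table backwards.
    bounds, j = [], n
    for c in range(k, 0, -1):
        i = cut[c][j]
        bounds.append((i, j))
        j = i
    return bounds[::-1], dp[k][n]
```

For example, `interval_cluster([1.0, 1.0, 1.0, 10.0, 10.0, 10.0], 2)` splits the sequence at the obvious break, returning the intervals `(0, 3)` and `(3, 6)` with zero residual cost. In the paper's setting, each interval would become one task cluster over which a per-cluster loss or gradient is computed for the MTL method.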

Related research

- Denoising Diffusion Gamma Models (10/10/2021)
- On Analyzing Generative and Denoising Capabilities of Diffusion-based Deep Generative Models (05/31/2022)
- PriorGrad: Improving Conditional Denoising Diffusion Models with Data-Driven Adaptive Prior (06/11/2021)
- PTQD: Accurate Post-Training Quantization for Diffusion Models (05/18/2023)
- Exploring Continual Learning of Diffusion Models (03/27/2023)
- Efficient Diffusion Training via Min-SNR Weighting Strategy (03/16/2023)
- Probabilistic Constellation Shaping With Denoising Diffusion Probabilistic Models: A Novel Approach (09/15/2023)
