Beyond In-Place Corruption: Insertion and Deletion In Denoising Probabilistic Models

07/16/2021
by Daniel D. Johnson, et al.

Denoising diffusion probabilistic models (DDPMs) have shown impressive results on sequence generation by iteratively corrupting each example and then learning to map corrupted versions back to the original. However, previous work has largely focused on in-place corruption, adding noise to each pixel or token individually while keeping their locations the same. In this work, we consider a broader class of corruption processes and denoising models over sequence data that can insert and delete elements, while still being efficient to train and sample from. We demonstrate that these models outperform standard in-place models on an arithmetic sequence task, and that when trained on the text8 dataset they can be used to fix spelling errors without any fine-tuning.
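To make the contrast concrete, the sketch below illustrates one forward corruption step that can substitute tokens in place (as in standard DDPMs) but also delete and insert them, changing the sequence length. This is a hypothetical illustration only: the function name, the per-token probabilities, and the uniform noise distribution are assumptions for exposition, not the paper's actual noise schedule or corruption kernel.

```python
import random

def corrupt_step(tokens, vocab, p_sub=0.1, p_del=0.05, p_ins=0.05):
    """One illustrative forward corruption step over a token sequence.

    Unlike in-place corruption, this step may delete tokens or insert
    new ones, so the output length can differ from the input length.
    Probabilities and the uniform choice over `vocab` are assumptions,
    not the schedule used in the paper.
    """
    out = []
    for tok in tokens:
        r = random.random()
        if r < p_del:
            continue  # delete this token entirely
        elif r < p_del + p_sub:
            out.append(random.choice(vocab))  # in-place substitution
        else:
            out.append(tok)  # keep the token unchanged
        if random.random() < p_ins:
            out.append(random.choice(vocab))  # insert a fresh random token
    return out
```

Repeating such a step drives the sequence toward pure noise; the denoising model is then trained to invert each step, which requires it to predict insertions and deletions as well as substitutions.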

