Discrete Denoising Flows

07/24/2021
by Alexandra Lindt, et al.

Discrete flow-based models are a recently proposed class of generative models that learn invertible transformations for discrete random variables. Because they do not require data dequantization and maximize an exact likelihood objective, they can be used in a straightforward manner for lossless compression. In this paper, we introduce a new discrete flow-based model for categorical random variables: Discrete Denoising Flows (DDFs). In contrast to other discrete flow-based models, our model can be trained locally without introducing gradient bias. We show that DDFs outperform Discrete Flows, measured in log-likelihood, on modeling a toy example, binary MNIST, and Cityscapes segmentation maps.
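The property underlying the exact likelihood objective and lossless compression is exact invertibility on discrete values: no dequantization noise is added, so no bits are lost in either direction. As a minimal illustration (not the DDF architecture itself, which the abstract does not detail), the sketch below implements a modular-addition coupling layer on categorical variables in the style of earlier discrete flows; the `shift` function stands in for an arbitrary learned network and is a hypothetical placeholder:

```python
import numpy as np

K = 4  # number of categories


def coupling_forward(x, shift_fn):
    # Split the vector in two; shift the second half by a
    # data-dependent amount, modulo K. Invertible by construction,
    # since the first half passes through unchanged.
    x1, x2 = x[: len(x) // 2], x[len(x) // 2:]
    y2 = (x2 + shift_fn(x1)) % K
    return np.concatenate([x1, y2])


def coupling_inverse(y, shift_fn):
    # Recompute the same shift from the unchanged half and subtract it.
    y1, y2 = y[: len(y) // 2], y[len(y) // 2:]
    x2 = (y2 - shift_fn(y1)) % K
    return np.concatenate([y1, x2])


# Hypothetical "network": any deterministic map of the first half works.
shift = lambda h: (h * 3 + 1) % K

x = np.array([0, 3, 1, 2])
y = coupling_forward(x, shift)
assert np.array_equal(coupling_inverse(y, shift), x)  # exact round trip
```

Because the transformation is a bijection on the discrete space, the model's likelihood is exact rather than a lower bound, which is what makes such flows directly usable with entropy coders for lossless compression.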

