Latent Normalizing Flows for Discrete Sequences

01/29/2019
by Zachary M. Ziegler et al.

Normalizing flows have been shown to be a powerful class of generative models for continuous random variables, giving both strong performance and the potential for non-autoregressive generation. These benefits are also desired when modeling discrete random variables such as text, but directly applying normalizing flows to discrete sequences poses significant additional challenges. We propose a generative model which jointly learns a normalizing flow-based distribution in the latent space and a stochastic mapping to an observed discrete space. In this setting, we find that it is crucial for the flow-based distribution to be highly multimodal. To capture this property, we propose several normalizing flow architectures to maximize model flexibility. Experiments consider common discrete sequence tasks of character-level language modeling and polyphonic music generation. Our results indicate that an autoregressive flow-based model can match the performance of a comparable autoregressive baseline, and a non-autoregressive flow-based model can improve generation speed with a penalty to performance.
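The core idea of the paper can be illustrated with a minimal numpy sketch: a continuous latent is modeled by a normalizing flow (here a single elementwise affine transform, with the change-of-variables log-determinant), and a stochastic softmax mapping emits discrete tokens from the latent. This is an illustrative toy, not the paper's architecture — the actual model uses far more flexible, multimodal flow layers over sequences; all function names and shapes below are assumptions.

```python
import numpy as np

def affine_flow_forward(z, log_scale, shift):
    # Elementwise affine flow: x = z * exp(log_scale) + shift.
    # log|det dx/dz| = sum(log_scale) for a diagonal transform.
    x = z * np.exp(log_scale) + shift
    log_det = np.sum(log_scale)
    return x, log_det

def base_log_prob(z):
    # Standard Gaussian base density over the last axis.
    return -0.5 * np.sum(z ** 2 + np.log(2 * np.pi), axis=-1)

def flow_log_prob(x, log_scale, shift):
    # Invert the flow, then apply the change-of-variables formula:
    # log p(x) = log p_base(z) - log|det dx/dz|.
    z = (x - shift) * np.exp(-log_scale)
    return base_log_prob(z) - np.sum(log_scale)

def emission_log_prob(tokens, latents, W, b):
    # Stochastic mapping from latent vectors to a categorical
    # distribution over a discrete vocabulary (softmax emission).
    logits = latents @ W + b
    logits = logits - logits.max(axis=-1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=-1, keepdims=True))
    return log_probs[np.arange(len(tokens)), tokens]
```

Sampling is then non-autoregressive in principle: draw z from the base, push it through the flow, and emit each token independently given the latent — which is the source of the generation-speed benefit the abstract mentions.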


Related research

- Discrete Flows: Invertible Generative Models of Discrete Data (05/24/2019)
- RBM-Flow and D-Flow: Invertible Flows with Discrete Energy Base Spaces (12/24/2020)
- Discrete Denoising Flows (07/24/2021)
- Closing the Dequantization Gap: PixelCNN as a Single-Layer Flow (02/06/2020)
- Towards Recurrent Autoregressive Flow Models (06/17/2020)
- Deep State Space Models for Unconditional Word Generation (06/12/2018)
- IDF++: Analyzing and Improving Integer Discrete Flows for Lossless Compression (06/22/2020)
