Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design

02/01/2019
by Jonathan Ho, et al.

Flow-based generative models are powerful exact likelihood models with efficient sampling and inference. Despite their computational efficiency, flow-based models generally have much worse density modeling performance compared to state-of-the-art autoregressive models. In this paper, we investigate and improve upon three limiting design choices employed by flow-based models in prior work: the use of uniform noise for dequantization, the use of inexpressive affine flows, and the use of purely convolutional conditioning networks in coupling layers. Based on our findings, we propose Flow++, a new flow-based model that is now the state-of-the-art non-autoregressive model for unconditional density estimation on standard image benchmarks. Our work has begun to close the significant performance gap that has so far existed between autoregressive models and flow-based models. Our implementation is available at https://github.com/aravind0706/flowpp.
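The first of the three improvements, variational dequantization, is straightforward to illustrate. The sketch below is a minimal PyTorch example, not the authors' implementation: the names (`uniform_dequant_bound`, `variational_dequant_bound`, `cond_net`) and the stand-in log-density are hypothetical, and the dequantization flow is reduced to a toy conditional sigmoid transform where Flow++ uses a deep conditional flow. It contrasts the standard uniform-noise bound with the variational bound log P(x) ≥ E_{u~q(u|x)}[log p(x+u) − log q(u|x)].

```python
import math
import torch

def uniform_dequant_bound(x, log_density):
    # Uniform dequantization: add u ~ U[0,1)^D and evaluate the continuous model.
    # Since q(u | x) = 1 on the unit cube, the bound is just E_u[ log p(x + u) ].
    u = torch.rand_like(x)
    return log_density(x + u)

def variational_dequant_bound(x, log_density, cond_net):
    # Variational dequantization: draw u from a learned q(u | x) and use
    #   log P(x) >= E_{u ~ q(.|x)} [ log p(x + u) - log q(u | x) ].
    # Toy conditional flow for q(u | x): u = sigmoid(eps + mu(x)), eps ~ N(0, I).
    eps = torch.randn_like(x)
    u = torch.sigmoid(eps + cond_net(x))
    # Change of variables: log q(u | x) = log N(eps; 0, I) - log |du/deps|,
    # with du/deps = u * (1 - u) for the sigmoid.
    log_q = (-0.5 * eps.pow(2) - 0.5 * math.log(2 * math.pi)
             - torch.log(u * (1 - u) + 1e-12)).sum(dim=-1)
    return log_density(x + u) - log_q

# Toy usage: flattened 8-bit images, an unnormalized stand-in log-density,
# and a single linear layer standing in for the paper's conditioning network.
x = torch.randint(0, 256, (4, 3 * 32 * 32)).float()
log_density = lambda y: (-0.5 * (y / 128.0 - 1.0).pow(2)).sum(dim=-1)
cond_net = torch.nn.Linear(3 * 32 * 32, 3 * 32 * 32)
print(uniform_dequant_bound(x, log_density).shape)        # torch.Size([4])
print(variational_dequant_bound(x, log_density, cond_net).shape)
```

Because q(u|x) can place noise where the continuous model assigns high density (rather than uniformly over the unit hypercube), the variational bound is at least as tight as the uniform one, which is the intuition behind the dequantization improvement described in the abstract.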


Related Research

05/08/2019 · Generative Model with Dynamic Linear Flow
Flow-based generative models are a family of exact log-likelihood models...

04/08/2020 · Normalizing Flows with Multi-Scale Autoregressive Priors
Flow-based generative models are an important class of exact inference m...

09/16/2018 · f-VAEs: Improve VAEs with Conditional Flows
In this paper, we integrate VAEs and flow-based generative models succes...

07/01/2021 · Variational Diffusion Models
Diffusion-based generative models have demonstrated a capacity for perce...

08/17/2023 · Fast Inference and Update of Probabilistic Density Estimation on Trajectory Prediction
Safety-critical applications such as autonomous vehicles and social robo...

06/08/2021 · Densely Connected Normalizing Flows
Normalizing flows are bijective mappings between inputs and latent repre...

05/21/2019 · Compression with Flows via Local Bits-Back Coding
Likelihood-based generative models are the backbones of lossless compres...

Code Repositories

flowpp: Code for reproducing Flow++ experiments
localbitsback: Compression with Flows via Local Bits-Back Coding
AudioSourceSep: Statistics MSc Project (2020): Audio Source Separation
variational-conv-dequantization: PyTorch implementation for variational dequantization using convolutions
