
Flow++: Improving Flow-Based Generative Models with Variational Dequantization and Architecture Design
Flow-based generative models are powerful exact likelihood models with e...
02/01/2019 ∙ by Jonathan Ho, et al.

Generative Model with Dynamic Linear Flow
Flow-based generative models are a family of exact log-likelihood models...
05/08/2019 ∙ by Huadong Liao, et al.

Integer Discrete Flows and Lossless Compression
Lossless compression methods shorten the expected representation size of...
05/17/2019 ∙ by Emiel Hoogeboom, et al.

Deep Diffeomorphic Normalizing Flows
The Normalizing Flow (NF) models a general probability density by estima...
10/08/2018 ∙ by Hadi Salman, et al.

Understanding the (un)interpretability of natural image distributions using generative models
Probability density estimation is a classical and well studied problem, ...
01/06/2019 ∙ by Ryen Krusinga, et al.

Unconstrained Monotonic Neural Networks
Monotonic neural networks have recently been proposed as a way to define...
08/14/2019 ∙ by Antoine Wehenkel, et al.

Neural Spline Flows
A normalizing flow models a complex probability density as an invertible...
06/10/2019 ∙ by Conor Durkan, et al.

Residual Flows for Invertible Generative Modeling
Flow-based generative models parameterize probability distributions through an invertible transformation and can be trained by maximum likelihood. Invertible residual networks provide a flexible family of transformations where only Lipschitz conditions rather than strict architectural constraints are needed for enforcing invertibility. However, prior work trained invertible residual networks for density estimation by relying on biased log-density estimates whose bias increased with the network's expressiveness. We give a tractable unbiased estimate of the log density, and reduce the memory required during training by a factor of ten. Furthermore, we improve invertible residual blocks by proposing the use of activation functions that avoid gradient saturation and generalizing the Lipschitz condition to induced mixed norms. The resulting approach, called Residual Flows, achieves state-of-the-art performance on density estimation amongst flow-based models, and outperforms networks that use coupling blocks at joint generative and discriminative modeling.
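The core mechanics of the abstract above can be sketched in a few lines: a residual map f(x) = x + g(x) is invertible whenever g has Lipschitz constant below 1 (invert by fixed-point iteration), and the change-of-variables log-determinant can be approximated by a power series with Hutchinson trace estimates. This is a minimal NumPy sketch under strong simplifying assumptions: g is taken to be a linear map g(x) = Wx rescaled to spectral norm 0.5 (the real method uses a spectrally normalized neural network), and it uses the truncated power series, i.e. the biased estimator that Residual Flows replaces with an unbiased Russian-roulette version.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4

# Hypothetical linear residual branch g(x) = W x, rescaled so its
# spectral norm (hence Lipschitz constant) is 0.5 < 1, which
# guarantees f(x) = x + g(x) is invertible.
W = rng.normal(size=(d, d))
W *= 0.5 / np.linalg.norm(W, 2)

def f(x):
    return x + W @ x

def f_inverse(y, iters=60):
    # Banach fixed-point iteration x <- y - g(x); converges because
    # g is a contraction (Lipschitz constant 0.5).
    x = y.copy()
    for _ in range(iters):
        x = y - W @ x
    return x

def logdet_estimate(n_terms=20, n_samples=2000):
    # Truncated power series:
    #   log det(I + J_g) = sum_{k>=1} (-1)^(k+1) tr(J_g^k) / k,
    # with each trace approximated by Hutchinson's estimator
    # tr(A) ~ E[v^T A v] for Rademacher v. (Residual Flows replace
    # this fixed truncation with an unbiased Russian-roulette sum.)
    total = 0.0
    for _ in range(n_samples):
        v = rng.choice([-1.0, 1.0], size=d)
        w = v.copy()
        s = 0.0
        for k in range(1, n_terms + 1):
            w = W @ w                      # w = J_g^k v; here J_g = W
            s += (-1) ** (k + 1) * (v @ w) / k
        total += s
    return total / n_samples

x = rng.normal(size=d)
print(np.max(np.abs(f_inverse(f(x)) - x)))   # inversion error, near 0
print(logdet_estimate(), np.linalg.slogdet(np.eye(d) + W)[1])
```

With a linear g the exact log-determinant is available via `slogdet`, so the stochastic estimate can be checked directly; with a neural g only Jacobian-vector products are available, which is exactly why the trace-estimator form matters.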