Augmented Normalizing Flows: Bridging the Gap Between Generative Flows and Latent Variable Models

02/17/2020
by Chin-Wei Huang, et al.

In this work, we propose a new family of generative flows on an augmented data space, with an aim to improve expressivity without drastically increasing the computational cost of sampling and evaluation of a lower bound on the likelihood. Theoretically, we prove the proposed flow can approximate a Hamiltonian ODE as a universal transport map. Empirically, we demonstrate state-of-the-art performance on standard benchmarks of flow-based generative modeling.
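
For intuition, here is a minimal sketch of the augmented-flow idea: the data x is paired with Gaussian auxiliary variables e, the pair is transformed by affine coupling steps, and a stochastic lower bound on log p(x) follows from the change-of-variables formula. The names (AffineCoupling, augmented_elbo), the N(0, I) choice of q(e|x), and the single-hidden-layer conditioner are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of an augmented normalizing flow (hypothetical names,
# not the paper's code): x is paired with Gaussian noise e, both are
# transformed by coupling steps, and we get a lower bound on log p(x).
import math
import torch
import torch.nn as nn

class AffineCoupling(nn.Module):
    """Affine coupling on the augmented pair (x, e).

    If update_e is True, e is rescaled and shifted as a function of x;
    otherwise x is updated as a function of e. Either way the Jacobian
    is triangular, so log|det J| is just the sum of log-scales.
    """
    def __init__(self, dim, hidden=64, update_e=True):
        super().__init__()
        self.update_e = update_e
        self.net = nn.Sequential(
            nn.Linear(dim, hidden), nn.ReLU(),
            nn.Linear(hidden, 2 * dim),
        )

    def forward(self, x, e):
        cond = x if self.update_e else e
        shift, log_scale = self.net(cond).chunk(2, dim=-1)
        log_scale = torch.tanh(log_scale)  # bound scales for stability
        if self.update_e:
            e = e * log_scale.exp() + shift
        else:
            x = x * log_scale.exp() + shift
        return x, e, log_scale.sum(dim=-1)

def augmented_elbo(flows, x):
    """Stochastic lower bound on log p(x) for one batch:
    E_{e ~ q(e|x)}[log p(x, e) - log q(e|x)], with q(e|x) = N(0, I)
    and p(x, e) given by the flow's change of variables."""
    e = torch.randn_like(x)
    d = x.shape[-1]
    log_q = -0.5 * (e ** 2).sum(-1) - 0.5 * d * math.log(2 * math.pi)
    logdet = x.new_zeros(x.shape[0])
    for f in flows:
        x, e, ld = f(x, e)
        logdet = logdet + ld
    z = torch.cat([x, e], dim=-1)  # score (x, e) jointly under N(0, I)
    log_pz = -0.5 * (z ** 2).sum(-1) - d * math.log(2 * math.pi)
    return log_pz + logdet - log_q

# Alternating which half gets updated mirrors the augmented-flow steps.
flows = nn.ModuleList([AffineCoupling(2, update_e=True),
                       AffineCoupling(2, update_e=False)])
x = torch.randn(8, 2)
bound = augmented_elbo(flows, x)  # shape (8,); maximize its mean to train
```

Stacking couplings that alternate between updating e given x and x given e is what lets the augmented variables increase expressivity while each step keeps a cheap triangular Jacobian.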


Related research

05/13/2021
PassFlow: Guessing Passwords with Generative Flows
Recent advances in generative machine learning models rekindled research...

05/26/2021
Augmented KRnet for density estimation and approximation
In this work, we have proposed augmented KRnets including both discrete ...

07/12/2023
Embracing the chaos: analysis and diagnosis of numerical instability in variational flows
In this paper, we investigate the impact of numerical instability on the...

11/14/2020
Self Normalizing Flows
Efficient gradient computation of the Jacobian determinant term is a cor...

02/22/2020
VFlow: More Expressive Generative Flows with Variational Data Augmentation
Generative flows are promising tractable models for density modeling tha...

12/23/2015
Latent Variable Modeling with Diversity-Inducing Mutual Angular Regularization
Latent Variable Models (LVMs) are a large family of machine learning mod...

08/18/2021
Moser Flow: Divergence-based Generative Modeling on Manifolds
We are interested in learning generative models for complex geometries d...
