Universal Approximation for Log-concave Distributions using Well-conditioned Normalizing Flows

07/07/2021
by Holden Lee, et al.

Normalizing flows are a widely used class of latent-variable generative models with a tractable likelihood. Affine-coupling models (Dinh et al., 2014-16) are a particularly common type of normalizing flow, for which the Jacobian of the latent-to-observable-variable transformation is triangular, allowing the likelihood to be computed in linear time. Despite the widespread usage of affine couplings, the special structure of the architecture makes understanding their representational power challenging. The question of universal approximation was only recently resolved by three parallel papers (Huang et al., 2020; Zhang et al., 2020; Koehler et al., 2020), which showed that reasonably regular distributions can be approximated arbitrarily well using affine couplings – albeit with networks with a nearly-singular Jacobian. As ill-conditioned Jacobians are an obstacle for likelihood-based training, the fundamental question remains: which distributions can be approximated using well-conditioned affine coupling flows? In this paper, we show that any log-concave distribution can be approximated using well-conditioned affine-coupling flows. In terms of proof techniques, we uncover and leverage deep connections between affine coupling architectures, underdamped Langevin dynamics (a stochastic differential equation often used to sample from Gibbs measures), and Hénon maps (a structured dynamical system that appears in the study of symplectic diffeomorphisms). Our results also inform the practice of training affine couplings: we approximate a padded version of the input distribution, in which the extra coordinates are filled with iid Gaussians – a strategy that Koehler et al. (2020) empirically observed to result in better-conditioned flows, but which hitherto had no theoretical grounding. Our proof can thus be seen as providing theoretical evidence for the benefits of Gaussian padding when training normalizing flows.
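
To make the triangular-Jacobian point concrete, here is a minimal NumPy sketch of a single affine coupling layer. This is an illustration, not the paper's construction: the toy scale/shift maps `W_s` and `W_t` are hypothetical parameters. Because the first half of the coordinates passes through the layer unchanged, the Jacobian is triangular and its log-determinant is just the sum of the log-scales, so the likelihood term costs linear time in the dimension rather than requiring a full determinant.

```python
import numpy as np

def affine_coupling_forward(x, scale_fn, shift_fn):
    """One affine coupling layer: the first half of the coordinates passes
    through unchanged; the second half is scaled and shifted by functions
    of the first half. The Jacobian is triangular, so log|det J| is just
    the sum of the log-scales, computed in linear time."""
    d = x.shape[-1] // 2
    x1, x2 = x[..., :d], x[..., d:]
    s = scale_fn(x1)                        # elementwise log-scales
    y2 = x2 * np.exp(s) + shift_fn(x1)
    log_det = s.sum(axis=-1)                # no Jacobian matrix is ever formed
    return np.concatenate([x1, y2], axis=-1), log_det

# Hypothetical toy scale/shift maps; tanh bounds the log-scale in (-1, 1),
# so exp(s) lies in (1/e, e) and the layer stays well-conditioned.
rng = np.random.default_rng(0)
W_s, W_t = 0.5 * rng.normal(size=(2, 2)), 0.5 * rng.normal(size=(2, 2))
scale_fn = lambda x1: np.tanh(x1 @ W_s)
shift_fn = lambda x1: x1 @ W_t

x = rng.normal(size=(4, 4))
y, log_det = affine_coupling_forward(x, scale_fn, shift_fn)
print(y.shape, log_det.shape)   # (4, 4) (4,)
```

The bounded log-scale in this sketch is one simple way to keep the layer's Jacobian well-conditioned, which is the property at stake in the abstract: the universality constructions of the prior work achieve approximation only by letting these scales become nearly singular.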

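The Gaussian-padding strategy the abstract credits to Koehler et al. (2020) can likewise be sketched in a few lines: instead of fitting the flow to the raw d-dimensional data, each point is first padded with iid standard Gaussian coordinates and the flow is trained on the joint padded distribution. The function name `gaussian_pad` and the choice `pad_dim=2` below are illustrative assumptions, not the paper's prescription.

```python
import numpy as np

def gaussian_pad(x, pad_dim, rng):
    """Pad each data point with iid standard Gaussian coordinates.
    The flow is then trained on the padded (d + pad_dim)-dimensional
    distribution rather than on the raw d-dimensional one."""
    noise = rng.standard_normal((x.shape[0], pad_dim))
    return np.concatenate([x, noise], axis=-1)

rng = np.random.default_rng(1)
x_data = rng.normal(size=(1000, 2))     # stand-in for the target distribution
x_padded = gaussian_pad(x_data, pad_dim=2, rng=rng)
print(x_padded.shape)                   # (1000, 4)
```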

