The Expressive Power of a Class of Normalizing Flow Models

05/31/2020
by Zhifeng Kong, et al.

Normalizing flows have received a great deal of recent attention as they allow flexible generative modeling as well as easy likelihood computation. While a wide variety of flow models have been proposed, there is little formal understanding of the representation power of these models. In this work, we study some basic normalizing flows and rigorously establish bounds on their expressive power. Our results indicate that while these flows are highly expressive in one dimension, in higher dimensions their representation power may be limited, especially when the flows have moderate depth.
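As background for the abstract's claim that normalizing flows allow easy likelihood computation, the sketch below evaluates the exact log-likelihood of a single one-dimensional affine flow through the change-of-variables formula. The affine layer, its `scale` and `shift` parameters, and the function name are illustrative assumptions for this sketch, not constructions taken from the paper.

```python
# Minimal sketch of exact likelihood evaluation under a one-layer flow.
# Model (assumed for illustration): x = scale * z + shift, with z ~ N(0, 1).
# Change of variables: log p_X(x) = log p_Z(f^{-1}(x)) + log |d f^{-1}/dx|.
import numpy as np

def affine_flow_log_likelihood(x, scale=2.0, shift=0.5):
    z = (x - shift) / scale                      # invert the flow
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))   # standard-normal log-density
    log_det = -np.log(np.abs(scale))             # log |d f^{-1}/dx| = -log|scale|
    return log_pz + log_det

if __name__ == "__main__":
    xs = np.array([-1.0, 0.5, 2.0])
    print(affine_flow_log_likelihood(xs))
```

Deeper flows compose such invertible layers, so the log-determinant terms simply add up layer by layer; the paper's question is how expressive such compositions can be, particularly beyond one dimension and at moderate depth.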

Related research

02/22/2020
VFlow: More Expressive Generative Flows with Variational Data Augmentation
Generative flows are promising tractable models for density modeling tha...

02/06/2020
Normalizing Flows on Tori and Spheres
Normalizing flows are a powerful tool for building expressive distributi...

12/05/2019
Normalizing Flows for Probabilistic Modeling and Inference
Normalizing flows provide a general mechanism for defining expressive pr...

06/27/2023
Deep Normalizing Flows for State Estimation
Safe and reliable state estimation techniques are a critical component o...

03/10/2021
Universal Approximation of Residual Flows in Maximum Mean Discrepancy
Normalizing flows are a class of flexible deep generative models that of...

03/03/2022
Generative Modeling for Low Dimensional Speech Attributes with Neural Spline Flows
Despite recent advances in generative modeling for text-to-speech synthe...

09/24/2021
Attentive Contractive Flow: Improved Contractive Flows with Lipschitz-constrained Self-Attention
Normalizing flows provide an elegant method for obtaining tractable dens...
