
Universal Approximation of Residual Flows in Maximum Mean Discrepancy

03/10/2021
by Zhifeng Kong et al.

Normalizing flows are a class of flexible deep generative models that offer easy likelihood computation. Despite their empirical success, there is little theoretical understanding of their expressiveness. In this work, we study residual flows, a class of normalizing flows composed of Lipschitz residual blocks. We prove that residual flows are universal approximators in maximum mean discrepancy (MMD), and we provide upper bounds on the number of residual blocks needed to achieve approximation under different assumptions.
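To make the two key objects concrete, here is a minimal, hypothetical PyTorch sketch, not the authors' code: a residual block f(x) = x + g(x) whose contraction condition Lip(g) < 1 is enforced approximately via spectral normalization (which makes the block invertible by fixed-point iteration), plus a Gaussian-kernel estimator of squared MMD. The names LipschitzResidualBlock and mmd2 and all hyperparameters are illustrative assumptions.

```python
import torch
import torch.nn as nn

class LipschitzResidualBlock(nn.Module):
    """One residual block f(x) = x + c * g(x) with Lip(c * g) < 1.

    Hypothetical sketch: spectral_norm keeps each weight matrix at
    spectral norm ~1 (via power iteration, so only approximately),
    ELU is 1-Lipschitz, and c < 1 makes the whole residual branch a
    contraction, which guarantees invertibility of f.
    """
    def __init__(self, dim, hidden=64, c=0.9):
        super().__init__()
        self.c = c
        self.g = nn.Sequential(
            nn.utils.spectral_norm(nn.Linear(dim, hidden)),
            nn.ELU(),
            nn.utils.spectral_norm(nn.Linear(hidden, dim)),
        )

    def forward(self, x):
        return x + self.c * self.g(x)

    def inverse(self, y, n_iters=50):
        # Fixed-point iteration x <- y - c * g(x); converges because
        # the residual branch c * g is a contraction.
        x = y.clone()
        for _ in range(n_iters):
            x = y - self.c * self.g(x)
        return x

def mmd2(x, y, bandwidth=1.0):
    """Biased estimate of squared MMD between sample sets x and y
    under the Gaussian kernel k(a, b) = exp(-||a-b||^2 / (2*bw^2))."""
    def k(a, b):
        d2 = torch.cdist(a, b).pow(2)
        return torch.exp(-d2 / (2 * bandwidth ** 2))
    return k(x, x).mean() + k(y, y).mean() - 2 * k(x, y).mean()

# Usage sketch: a residual flow is a composition of such blocks; push
# base samples through it and measure MMD against target samples.
flow = nn.Sequential(*[LipschitzResidualBlock(2) for _ in range(4)])
z = torch.randn(256, 2)                    # base distribution samples
target = torch.randn(256, 2) * 0.5 + 1.0   # stand-in target samples
loss = mmd2(flow(z), target)               # could be minimized by SGD
```

The paper's results concern how many such blocks suffice for the pushed-forward distribution to approximate a target in MMD; the depth of 4 above is arbitrary.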


Related research:

06/06/2019 · Residual Flows for Invertible Generative Modeling
Flow-based generative models parameterize probability distributions thro...

05/31/2020 · The Expressive Power of a Class of Normalizing Flow Models
Normalizing flows have received a great deal of recent attention as they...

03/17/2021 · Implicit Normalizing Flows
Normalizing flows define a probability distribution by an explicit inver...

02/06/2021 · Robust normalizing flows using Bernstein-type polynomials
Normalizing flows (NFs) are a class of generative models that allows exa...

10/05/2020 · i-DenseNets
We introduce Invertible Dense Networks (i-DenseNets), a more parameter e...

10/16/2021 · Equivariant Discrete Normalizing Flows
At its core, generative modeling seeks to uncover the underlying factors...

07/07/2019 · Copula & Marginal Flows: Disentangling the Marginal from its Joint
Deep generative networks such as GANs and normalizing flows flourish in ...