Universal Approximation of Residual Flows in Maximum Mean Discrepancy

03/10/2021
by Zhifeng Kong, et al.

Normalizing flows are a class of flexible deep generative models that offer tractable likelihood computation. Despite their empirical success, there is little theoretical understanding of their expressiveness. In this work, we study residual flows, a class of normalizing flows composed of Lipschitz residual blocks. We prove that residual flows are universal approximators in maximum mean discrepancy (MMD), and we provide upper bounds on the number of residual blocks required to achieve a given approximation accuracy under different assumptions.
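
The paper's contribution is theoretical, but the two objects named in the abstract are concrete enough to illustrate. Below is a minimal NumPy sketch, not the paper's construction: a residual block f(x) = x + g(x) whose Lipschitz constant is pushed below 1 by rescaling each weight matrix by its spectral norm, inversion of the block by fixed-point iteration, and the standard unbiased estimator of squared MMD with an RBF kernel. The architecture, the contraction factor c, and the kernel bandwidth are arbitrary illustrative choices.

    import numpy as np

    rng = np.random.default_rng(0)

    # A residual block f(x) = x + g(x) with Lip(g) <= c < 1. Here g is a
    # one-hidden-layer tanh network; dividing each weight matrix by its
    # spectral norm (and multiplying by sqrt(c)) bounds Lip(g) by c.
    def make_block(dim, hidden=64, c=0.9):
        W1 = rng.standard_normal((hidden, dim))
        W2 = rng.standard_normal((dim, hidden))
        W1 *= np.sqrt(c) / np.linalg.norm(W1, 2)  # now ||W1||_2 = sqrt(c)
        W2 *= np.sqrt(c) / np.linalg.norm(W2, 2)  # now ||W2||_2 = sqrt(c)
        return W1, W2

    def g(x, block):
        W1, W2 = block
        return np.tanh(x @ W1.T) @ W2.T  # tanh is 1-Lipschitz

    def forward(x, block):
        return x + g(x, block)

    def inverse(y, block, iters=100):
        # Solve y = x + g(x) by fixed-point iteration x <- y - g(x),
        # which converges because Lip(g) < 1 (Banach fixed-point theorem).
        x = y.copy()
        for _ in range(iters):
            x = y - g(x, block)
        return x

    # Unbiased estimator of squared MMD with an RBF kernel.
    def mmd2(X, Y, bandwidth=1.0):
        def k(A, B):
            d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
            return np.exp(-d2 / (2 * bandwidth ** 2))
        Kxx, Kyy, Kxy = k(X, X), k(Y, Y), k(X, Y)
        n, m = len(X), len(Y)
        return ((Kxx.sum() - np.trace(Kxx)) / (n * (n - 1))
                + (Kyy.sum() - np.trace(Kyy)) / (m * (m - 1))
                - 2 * Kxy.mean())

    dim = 2
    block = make_block(dim)
    x = rng.standard_normal((256, dim))
    y = forward(x, block)
    print("max inversion error:", np.abs(inverse(y, block) - x).max())
    print("MMD^2 between y and a standard normal sample:",
          mmd2(y, rng.standard_normal((256, dim))))

The fixed-point inversion is exactly why the Lipschitz constraint matters: with Lip(g) <= c < 1, the map x -> y - g(x) is a contraction, so each residual block is invertible and a composition of such blocks is a valid normalizing flow.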

Related research:

06/06/2019 · Residual Flows for Invertible Generative Modeling
Flow-based generative models parameterize probability distributions thro...

05/31/2020 · The Expressive Power of a Class of Normalizing Flow Models
Normalizing flows have received a great deal of recent attention as they...

05/19/2023 · Generative Sliced MMD Flows with Riesz Kernels
Maximum mean discrepancy (MMD) flows suffer from high computational cost...

03/17/2021 · Implicit Normalizing Flows
Normalizing flows define a probability distribution by an explicit inver...

02/06/2021 · Robust normalizing flows using Bernstein-type polynomials
Normalizing flows (NFs) are a class of generative models that allows exa...

10/05/2020 · i-DenseNets
We introduce Invertible Dense Networks (i-DenseNets), a more parameter e...

10/16/2021 · Equivariant Discrete Normalizing Flows
At its core, generative modeling seeks to uncover the underlying factors...
