On the capacity of deep generative networks for approximating distributions

01/29/2021
by Yunfei Yang et al.

We study the efficacy and efficiency of deep generative networks for approximating probability distributions. We prove that neural networks can transform a one-dimensional source distribution into a distribution that is arbitrarily close to a high-dimensional target distribution in Wasserstein distances. Upper bounds on the approximation error are obtained in terms of the networks' width and depth. It is shown that the approximation error grows at most linearly with the ambient dimension, and that the approximation order depends only on the intrinsic dimension of the target distribution. In contrast, when f-divergences are used as metrics of distributions, the approximation property is different: we prove that, in order to approximate the target distribution in f-divergences, the dimension of the source distribution cannot be smaller than the intrinsic dimension of the target distribution. Consequently, f-divergences are less adequate than Wasserstein distances as metrics of distributions for generating samples.
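The Wasserstein result concerns the pushforward of a low-dimensional source distribution through a network. Below is a minimal, purely illustrative Python sketch (not the paper's construction): it pushes a one-dimensional uniform source through a randomly initialized ReLU network to produce samples in R^2, then estimates the distance to a Gaussian target with a sliced Wasserstein-1 proxy built on SciPy's one-dimensional wasserstein_distance. The network here is untrained, so the estimate stays large; the theorem asserts that suitable weights exist making it arbitrarily small. The names relu_generator and sliced_w1 and all parameter choices are illustrative assumptions.

```python
# Illustrative sketch only: pushforward of a 1-D source through a ReLU
# network, evaluated against a 2-D target via sliced Wasserstein-1.
import numpy as np
from scipy.stats import wasserstein_distance

rng = np.random.default_rng(0)

def relu_generator(z, widths=(64, 64), out_dim=2):
    """Push 1-D noise z through a randomly initialized ReLU MLP."""
    h = z.reshape(-1, 1)
    d_in = 1
    for w in widths:
        W = rng.normal(scale=1.0 / np.sqrt(d_in), size=(d_in, w))
        h = np.maximum(h @ W, 0.0)  # ReLU layer
        d_in = w
    W_out = rng.normal(scale=1.0 / np.sqrt(d_in), size=(d_in, out_dim))
    return h @ W_out

def sliced_w1(x, y, n_proj=100):
    """Average 1-D Wasserstein-1 distance over random projections."""
    d = x.shape[1]
    total = 0.0
    for _ in range(n_proj):
        v = rng.normal(size=d)
        v /= np.linalg.norm(v)
        total += wasserstein_distance(x @ v, y @ v)
    return total / n_proj

z = rng.uniform(size=5000)          # one-dimensional source distribution
fake = relu_generator(z)            # pushforward samples in R^2
real = rng.normal(size=(5000, 2))   # target: standard 2-D Gaussian
print("sliced W1 estimate:", sliced_w1(fake, real))
```

Note that the same pushforward illustrates the f-divergence side of the abstract: the image of a one-dimensional distribution under a continuous network is supported on an (at most) one-dimensional set, which has Lebesgue measure zero in R^2, so an f-divergence such as the KL divergence to a full-dimensional target is maximal no matter how the weights are chosen.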


Related research:

- Learning Distributions by Generative Adversarial Networks: Approximation and Generalization (05/25/2022). We study how well generative adversarial networks (GAN) learn probabilit...
- A Universal Approximation Theorem of Deep Neural Networks for Expressing Distributions (04/19/2020). This paper studies the universal approximation property of deep neural n...
- Approximation for Probability Distributions by Wasserstein GAN (03/18/2021). In this paper, we show that the approximation for distributions by Wasse...
- High-Dimensional Distribution Generation Through Deep Neural Networks (07/26/2021). We show that every d-dimensional probability distribution of bounded sup...
- Constructive Universal High-Dimensional Distribution Generation through Deep ReLU Networks (06/30/2020). We present an explicit deep neural network construction that transforms ...
- A Manifold Two-Sample Test Study: Integral Probability Metric with Neural Networks (05/04/2022). Two-sample tests are important areas aiming to determine whether two col...
- Just Least Squares: Binary Compressive Sampling with Low Generative Intrinsic Dimension (11/29/2021). In this paper, we consider recovering n dimensional signals from m binar...
