
Constructive Universal High-Dimensional Distribution Generation through Deep ReLU Networks
We present an explicit deep neural network construction that transforms ...

Approximating Probability Distributions by ReLU Networks
How many neurons are needed to approximate a target probability distribu...

Size-Noise Tradeoffs in Generative Networks
This paper investigates the ability of generative networks to convert th...

On the capacity of deep generative networks for approximating distributions
We study the efficacy and efficiency of deep generative networks for app...

Generating Neural Networks with Neural Networks
Hypernetworks are neural networks that transform a random input vector i...

A Universal Approximation Theorem of Deep Neural Networks for Expressing Distributions
This paper studies the universal approximation property of deep neural n...

Approximation for Probability Distributions by Wasserstein GAN
In this paper, we show that the approximation for distributions by Wasse...
High-Dimensional Distribution Generation Through Deep Neural Networks
We show that every d-dimensional probability distribution of bounded support can be generated through deep ReLU networks out of a 1-dimensional uniform input distribution. What is more, this is possible without incurring a cost, in terms of approximation error measured in Wasserstein distance, relative to generating the d-dimensional target distribution from d independent random variables. This is enabled by a vast generalization of the space-filling approach discovered in (Bailey & Telgarsky, 2018). The construction we propose highlights the importance of network depth in driving the Wasserstein distance between the target distribution and its neural network approximation to zero. Finally, we find that, for histogram target distributions, the number of bits needed to encode the corresponding generative network equals the fundamental limit for encoding probability distributions as dictated by quantization theory.
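The space-filling idea the abstract builds on can be illustrated with a minimal numerical sketch. The snippet below is not the paper's construction; it only demonstrates the basic mechanism from Bailey & Telgarsky (2018): the tent map is exactly representable with three ReLUs, composing it k times yields a high-frequency sawtooth, and the pushforward of a 1-dimensional uniform input under z → (z, sawtooth(z)) spreads over the unit square, so its Wasserstein distance to the 2-dimensional uniform shrinks as depth (the number of compositions) grows. All names here are illustrative.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tent(x):
    # Tent map on [0, 1], written exactly as a three-ReLU combination:
    # tent(x) = 2*relu(x) - 4*relu(x - 0.5) + 2*relu(x - 1)
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5) + 2.0 * relu(x - 1.0)

def sawtooth(x, k):
    # k-fold composition of the tent map: one ReLU layer per step,
    # producing a sawtooth with 2**(k-1) teeth. Depth buys frequency.
    for _ in range(k):
        x = tent(x)
    return x

rng = np.random.default_rng(0)
z = rng.uniform(0.0, 1.0, size=100_000)       # 1-D uniform input
samples = np.stack([z, sawtooth(z, 8)], axis=1)  # fills [0, 1]^2 as k grows
print(samples.mean(axis=0))  # both coordinates are marginally uniform
```

Since the tent map preserves Lebesgue measure on [0, 1], both coordinates are exactly uniform; only the joint distribution is approximate, and increasing k drives it toward the uniform distribution on the square, mirroring the role of depth described in the abstract.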