Size-Noise Tradeoffs in Generative Networks

10/26/2018
by Bolton Bailey, et al.

This paper investigates the ability of generative networks to convert their input noise distributions into other distributions. First, we demonstrate a construction that allows ReLU networks to increase the dimensionality of their noise distribution by implementing a "space-filling" function based on iterated tent maps. We show this construction is optimal by analyzing the number of affine pieces in functions computed by multivariate ReLU networks. Second, we provide efficient ways (using polylog(1/ϵ) nodes) for networks to pass between univariate uniform and normal distributions, using a Taylor-series approximation and a binary-search gadget for computing function inverses. Finally, we indicate how high-dimensional distributions can be efficiently transformed into low-dimensional distributions.
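The tent-map construction is concrete enough to sketch numerically. Below is a minimal NumPy illustration (the function names are ours, not the paper's): the tent map T(x) = 2x on [0, 1/2] and 2 − 2x on [1/2, 1] is exactly a one-hidden-layer ReLU computation, and the graph of its k-fold iterate T^k sweeps the unit square in 2^k segments, so the map x ↦ (x, T^k(x)) pushes a 1-D uniform sample toward the 2-D uniform distribution.

```python
import numpy as np

def relu(x):
    return np.maximum(x, 0.0)

def tent(x):
    # Tent map T(x) = 2x on [0, 1/2], 2 - 2x on [1/2, 1], written
    # exactly as a two-unit ReLU layer: T(x) = 2*relu(x) - 4*relu(x - 1/2).
    return 2.0 * relu(x) - 4.0 * relu(x - 0.5)

def space_filling_pair(x, k):
    # Map a 1-D sample x in [0, 1] to the pair (x, T^k(x)). The graph of
    # T^k consists of 2^k line segments sweeping the unit square, so the
    # pushforward of U[0,1] approaches U[0,1]^2 as k grows.
    y = x.copy()
    for _ in range(k):
        y = tent(y)
    return np.stack([x, y], axis=-1)

rng = np.random.default_rng(0)
x = rng.uniform(size=100_000)
pts = space_filling_pair(x, k=10)

# Crude uniformity check: counts over a 4x4 grid of the unit square
# should each be close to 1/16 of the samples.
hist, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=4, range=[[0, 1], [0, 1]])
print(hist / len(x))
```

The uniform-to-normal direction can be sketched the same way. The paper approximates the Gaussian CDF by a Taylor series and inverts it with a binary-search gadget; the sketch below substitutes the standard library's erf for the Taylor-series network and runs the bisection as an ordinary loop, so its ceil(log2(2B/ϵ)) step count is only a numerical stand-in for the polylog(1/ϵ) node bound.

```python
import math

def normal_cdf(z):
    # Stand-in for the paper's Taylor-series approximation of the normal
    # CDF; here we simply use the exact CDF via the standard library's erf.
    return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

def uniform_to_normal(u, eps=1e-6, bound=6.0):
    # Invert the CDF by bisection on [-bound, bound]: each step halves the
    # interval, so ceil(log2(2*bound/eps)) steps reach accuracy eps. The
    # paper realizes comparable comparison steps with ReLU gates, which is
    # where the polylog(1/eps) network size comes from.
    lo, hi = -bound, bound
    steps = math.ceil(math.log2(2.0 * bound / eps))
    for _ in range(steps):
        mid = 0.5 * (lo + hi)
        if normal_cdf(mid) < u:
            lo = mid
        else:
            hi = mid
    return 0.5 * (lo + hi)

# Example: the 97.5th percentile of the standard normal is roughly 1.96.
print(uniform_to_normal(0.975))
```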

