Random Matrix Theory Proves that Deep Learning Representations of GAN-data Behave as Gaussian Mixtures
This paper shows that deep learning (DL) representations of data produced by generative adversarial nets (GANs) are random vectors which fall within the class of so-called concentrated random vectors. Further exploiting the fact that Gram matrices, of the type G = X^T X with X = [x_1, ..., x_n] ∈ R^{p×n} and x_i independent concentrated random vectors from a mixture model, behave asymptotically (as n, p → ∞) as if the x_i were drawn from a Gaussian mixture, this suggests that DL representations of GAN-data can be fully described by their first two statistical moments for a wide range of standard classifiers. Our theoretical findings are validated by generating images with the BigGAN model and across different popular deep representation networks.
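The Gram-matrix claim above can be illustrated numerically. The following is a minimal sketch, not the paper's experiment: it builds "concentrated" vectors by pushing Gaussian noise through a 1-Lipschitz map (tanh, standing in for a GAN/DL representation), builds a Gaussian surrogate matched to the same empirical first two moments, and checks that the spectra of the normalized Gram matrices G = X^T X / p nearly coincide. All dimensions and the choice of map are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
p, n = 400, 200  # dimension p, sample count n (illustrative choices)

# "Concentrated" vectors: a 1-Lipschitz map (tanh) applied to Gaussian
# inputs, mimicking how generators/representations transform Gaussian noise.
Z = rng.standard_normal((p, n))
X = np.tanh(Z + 1.0)  # columns x_i are concentrated but non-Gaussian

# Gaussian surrogate with the same empirical mean and covariance as X.
mu = X.mean(axis=1, keepdims=True)            # (p, 1) empirical mean
C = np.cov(X)                                 # (p, p) empirical covariance
L = np.linalg.cholesky(C + 1e-6 * np.eye(p))  # jitter: C is rank-deficient
Y = mu + L @ rng.standard_normal((p, n))      # matched Gaussian columns

# Spectra of the normalized Gram matrices G = X^T X / p.
ev_X = np.linalg.eigvalsh(X.T @ X / p)
ev_Y = np.linalg.eigvalsh(Y.T @ Y / p)

# As n, p grow, the two spectral distributions become indistinguishable.
print("mean eigenvalue (concentrated):", ev_X.mean())
print("mean eigenvalue (Gaussian)    :", ev_Y.mean())
```

The closeness of the two spectra is the practical content of the result: spectral methods and many standard classifiers applied to such data "see" only the first two moments, as if the inputs were a Gaussian mixture.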