Random Matrix Theory Proves that Deep Learning Representations of GAN-data Behave as Gaussian Mixtures

01/21/2020
by Mohamed El Amine Seddik, et al.

This paper shows that deep learning (DL) representations of data produced by generative adversarial nets (GANs) are random vectors falling within the class of so-called concentrated random vectors. It further exploits the fact that Gram matrices of the type G = X^T X, with X = [x_1, ..., x_n] ∈ R^{p×n} and the x_i independent concentrated random vectors drawn from a mixture model, behave asymptotically (as n, p → ∞) as if the x_i were drawn from a Gaussian mixture. This suggests that DL representations of GAN data are fully described by their first two statistical moments for a wide range of standard classifiers. Our theoretical findings are validated by generating images with the BigGAN model and across several popular deep representation networks.
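As a rough numerical illustration of the Gaussian-equivalence phenomenon the abstract describes, the sketch below (our own, not the authors' code) builds toy "concentrated" vectors as a Lipschitz map of Gaussian noise — a tanh of a random linear transform, standing in for a DL representation — and compares the spectrum of their Gram matrix with that of Gaussian data matching the same first two moments:

```python
# Minimal sketch, assuming that a Lipschitz map of a Gaussian vector is
# "concentrated" in the sense used in the paper; x_i = tanh(W z_i) is a
# hypothetical stand-in for a deep representation, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)
p, n = 256, 512  # dimension and sample count, both large and comparable

# Concentrated vectors: entrywise tanh (1-Lipschitz) of a random linear map.
W = rng.standard_normal((p, p)) / np.sqrt(p)
Z = rng.standard_normal((p, n))
X = np.tanh(W @ Z)

# Gaussian surrogate matching the empirical mean and covariance of X.
mu = X.mean(axis=1, keepdims=True)
C = np.cov(X)
L = np.linalg.cholesky(C + 1e-8 * np.eye(p))  # jitter for numerical safety
X_gauss = mu + L @ rng.standard_normal((p, n))

# The nonzero eigenvalues of the Gram matrix G = X^T X coincide with those
# of X X^T, so we diagonalize the smaller p x p matrix (normalized by n).
ev_conc = np.linalg.eigvalsh(X @ X.T / n)
ev_gauss = np.linalg.eigvalsh(X_gauss @ X_gauss.T / n)

# As n, p grow, the two empirical spectra become nearly indistinguishable.
```

Plotting histograms of `ev_conc` and `ev_gauss` shows two bulks that essentially overlap, which is the practical content of the claim: second-order statistics suffice to describe such Gram matrices asymptotically.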

Related research

Deep Gaussian Mixture Models (11/18/2017)
Deep learning is a hierarchical inference method formed by subsequent mu...

Mixture Density Conditional Generative Adversarial Network Models (MD-CGAN) (04/08/2020)
Generative Adversarial Networks (GANs) have gained significant attention...

GAT-GMM: Generative Adversarial Training for Gaussian Mixture Models (06/18/2020)
Generative adversarial networks (GANs) learn the distribution of observe...

BourGAN: Generative Networks with Metric Embeddings (05/19/2018)
This paper addresses the mode collapse for generative adversarial networ...

Gaussian Approximations for Maxima of Random Vectors under (2+ι)-th Moments (05/27/2019)
We derive a Gaussian approximation result for the maximum of a sum of ra...

On the modern deep learning approaches for precipitation downscaling (07/02/2022)
Deep Learning (DL) based downscaling has become a popular tool in earth ...
