Towards GANs' Approximation Ability

04/10/2020
by Xuejiao Liu, et al.

Generative adversarial networks (GANs) have attracted intense interest in the field of generative models. However, few studies have investigated the approximation ability of the GAN generator, either through theoretical analysis or through algorithm design. This paper first analyzes GANs' approximation property theoretically. Analogous to the universal approximation property of fully connected neural networks with one hidden layer, we prove that the generator, driven by the input latent variable, can universally approximate the underlying data distribution as the number of hidden neurons increases. Furthermore, we propose an approach named stochastic data generation (SDG) to enhance GANs' approximation ability. The approach is based on the simple idea of injecting randomness into the generation process by placing a prior distribution on the conditional probability between layers, and it can be implemented easily with the reparameterization trick. Experimental results on a synthetic dataset verify the improved approximation ability obtained with SDG. On practical datasets, NSGAN/WGANGP with SDG also outperforms the traditional GANs with little change to the model architectures.
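
To make the SDG idea concrete, the sketch below shows one way to inject layer-wise randomness into a generator with the reparameterization trick, as the abstract describes. This is only an illustrative PyTorch sketch under assumed names and layer sizes (StochasticBlock, SDGGenerator, a Gaussian conditional between layers), not the authors' exact architecture.

```python
# Illustrative sketch: stochastic data generation via the reparameterization trick.
# Layer sizes, names, and the Gaussian conditional are assumptions for illustration.
import torch
import torch.nn as nn

class StochasticBlock(nn.Module):
    """Maps its input to a Gaussian over the next hidden layer and samples from it."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.mu = nn.Linear(in_dim, out_dim)       # mean of the conditional distribution
        self.log_var = nn.Linear(in_dim, out_dim)  # log-variance of the conditional

    def forward(self, x):
        mu = self.mu(x)
        std = torch.exp(0.5 * self.log_var(x))
        eps = torch.randn_like(std)                # standard normal noise
        return mu + eps * std                      # reparameterized sample (differentiable)

class SDGGenerator(nn.Module):
    """Generator whose hidden activations are sampled rather than deterministic."""
    def __init__(self, latent_dim=64, hidden_dim=128, data_dim=2):
        super().__init__()
        self.block1 = StochasticBlock(latent_dim, hidden_dim)
        self.block2 = StochasticBlock(hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, data_dim)

    def forward(self, z):
        h = torch.relu(self.block1(z))
        h = torch.relu(self.block2(h))
        return self.out(h)

# Usage: draw latent noise and generate a batch of samples.
z = torch.randn(16, 64)
fake = SDGGenerator()(z)   # shape (16, 2)
```

Because the noise enters through the reparameterization, gradients from the discriminator's loss flow through the sampled hidden units, so the stochastic generator can be trained with the usual GAN objectives (e.g., NSGAN or WGAN-GP).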

Related research

Procedural content generation of puzzle games using conditional generative adversarial networks (06/26/2023)
In this article, we present an experimental approach to using parameteri...

PolyGAN: High-Order Polynomial Generators (08/19/2019)
Generative Adversarial Networks (GANs) have become the gold standard whe...

Biosignal Generation and Latent Variable Analysis with Recurrent Generative Adversarial Networks (05/17/2019)
The effectiveness of biosignal generation and data augmentation with bio...

Learning to Avoid Errors in GANs by Manipulating Input Spaces (07/03/2017)
Despite recent advances, large scale visual artifacts are still a common...

Probabilistic Forecasting with Conditional Generative Networks via Scoring Rule Minimization (12/15/2021)
Probabilistic forecasting consists of stating a probability distribution...

OOGAN: Disentangling GAN with One-Hot Sampling and Orthogonal Regularization (05/26/2019)
Exploring the potential of GANs for unsupervised disentanglement learnin...

Ensembles of Generative Adversarial Networks (12/03/2016)
Ensembles are a popular way to improve results of discriminative CNNs. T...
