Training Generative Adversarial Networks Via Turing Test

10/25/2018
by Jianlin Su, et al.

In this article, we introduce a new mode for training Generative Adversarial Networks (GANs). Rather than minimizing the distance between the evidence distribution p̃(x) and the generative distribution q(x), we minimize the distance between p̃(x_r)q(x_f) and p̃(x_f)q(x_r). This adversarial pattern can be interpreted as a Turing test in GANs. It allows us to use information from real samples when training the generator, and it accelerates the whole training procedure. We even find that by simply scaling up the discriminator and generator proportionally, the method succeeds at 256x256 resolution without careful hyperparameter tuning.
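The pairing objective above can be made concrete with a short sketch. Below is a minimal, illustrative PyTorch implementation, not the authors' released code: a discriminator scores an ordered pair of samples and is trained to tell (real, fake) pairs, drawn from p̃(x_r)q(x_f), apart from swapped (fake, real) pairs, drawn from p̃(x_f)q(x_r), while the generator is trained to reverse that decision. The network sizes, toy dimensions, and the standard binary cross-entropy loss are assumptions made for illustration only.

# A minimal, illustrative PyTorch sketch (not the authors' released code) of the
# pairwise objective described above: the discriminator D scores an ordered pair
# (x1, x2) and is trained to separate (real, fake) pairs, i.e. samples of
# p~(x_r)q(x_f), from swapped (fake, real) pairs, i.e. samples of p~(x_f)q(x_r);
# the generator G is trained to reverse that decision.
import torch
import torch.nn as nn

latent_dim, data_dim = 64, 128  # assumed toy dimensions

G = nn.Sequential(nn.Linear(latent_dim, 256), nn.ReLU(), nn.Linear(256, data_dim))
D = nn.Sequential(nn.Linear(2 * data_dim, 256), nn.ReLU(), nn.Linear(256, 1))  # scores a concatenated pair

opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

def training_step(x_real):
    """One alternating update on a batch of real samples of shape (batch, data_dim)."""
    z = torch.randn(x_real.size(0), latent_dim)
    x_fake = G(z)

    # Discriminator: label (real, fake) pairs as 1 and (fake, real) pairs as 0.
    pos = D(torch.cat([x_real, x_fake.detach()], dim=1))
    neg = D(torch.cat([x_fake.detach(), x_real], dim=1))
    loss_d = bce(pos, torch.ones_like(pos)) + bce(neg, torch.zeros_like(neg))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Generator: flip the labels so the swapped pair is mistaken for the "real first" one.
    # Real samples appear inside each pair, so they directly influence this update.
    pos = D(torch.cat([x_real, x_fake], dim=1))
    neg = D(torch.cat([x_fake, x_real], dim=1))
    loss_g = bce(pos, torch.zeros_like(pos)) + bce(neg, torch.ones_like(neg))
    opt_g.zero_grad()
    loss_g.backward()  # D also accumulates gradients here; they are cleared by opt_d.zero_grad() next step
    opt_g.step()
    return loss_d.item(), loss_g.item()

Note that real samples enter the generator's loss through the pair itself, which is how this formulation exposes information about real data to the generator during its update, as the abstract emphasizes.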


