Measure-conditional Discriminator with Stationary Optimum for GANs and Statistical Distance Surrogates

01/17/2021
by Liu Yang, et al.

We propose a simple but effective modification of GAN discriminators, namely the measure-conditional discriminator, as a plug-and-play module for different GANs. By taking the generated distribution as part of its input, the discriminator's target optimum becomes stationary as the generator evolves, which makes the proposed discriminator more robust than the vanilla one. A variant of the measure-conditional discriminator can also handle multiple target distributions, or act as a surrogate model for statistical distances such as the KL divergence, with applications to transfer learning.
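The abstract does not specify an architecture, so the following is a minimal PyTorch sketch of the idea under one common design assumption: the generated distribution is represented by a batch of its own samples, summarized with a permutation-invariant (DeepSets-style) set encoder, and this measure embedding is concatenated with each input sample before scoring. All class and parameter names here are hypothetical, not taken from the paper.

```python
# Minimal sketch (hypothetical names) of a measure-conditional discriminator.
# Besides the usual per-sample input x, it receives a batch of samples drawn
# from the generated distribution, pooled into a permutation-invariant
# embedding. Conditioning on the generated measure is what, per the abstract,
# keeps the discriminator's target optimum stationary as the generator moves.
import torch
import torch.nn as nn


class MeasureConditionalDiscriminator(nn.Module):
    def __init__(self, x_dim: int, embed_dim: int = 64, hidden: int = 128):
        super().__init__()
        # Encodes each sample of the conditioning batch before pooling.
        self.set_encoder = nn.Sequential(
            nn.Linear(x_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, embed_dim),
        )
        # Scores a single sample x given the pooled measure embedding.
        self.critic = nn.Sequential(
            nn.Linear(x_dim + embed_dim, hidden), nn.ReLU(),
            nn.Linear(hidden, hidden), nn.ReLU(),
            nn.Linear(hidden, 1),
        )

    def forward(self, x: torch.Tensor, measure_batch: torch.Tensor) -> torch.Tensor:
        # measure_batch: (m, x_dim) samples representing the generated
        # distribution; mean pooling makes the embedding order-invariant.
        z = self.set_encoder(measure_batch).mean(dim=0)   # (embed_dim,)
        z = z.expand(x.size(0), -1)                       # (n, embed_dim)
        return self.critic(torch.cat([x, z], dim=1))      # (n, 1)


# Usage: score real and fake samples, both conditioned on the same
# snapshot of the generator's output distribution.
D = MeasureConditionalDiscriminator(x_dim=2)
fake = torch.randn(256, 2)   # stand-in for generator samples
real = torch.randn(64, 2)    # stand-in for data samples
scores_real = D(real, fake)
scores_fake = D(fake[:64], fake)
```

On this reading, the optimal discriminator is a fixed function of the pair (sample, measure) rather than a moving target that changes with every generator update, which is one plausible mechanism for the robustness claimed in the abstract.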

Related research

11/19/2019  A Simple yet Effective Way for Improving the Performance of GANs
09/18/2020  Conditional Image Generation with One-Vs-All Classifier
02/02/2022  Structure-preserving GANs
02/15/2018  cGANs with Projection Discriminator
02/25/2020  Freeze Discriminator: A Simple Baseline for Fine-tuning GANs
09/29/2021  Reliable Estimation of KL Divergence using a Discriminator in Reproducing Kernel Hilbert Space
02/25/2020  Reliable Estimation of Kullback-Leibler Divergence by Controlling Discriminator Complexity in the Reproducing Kernel Hilbert Space
