Cumulant GAN

06/11/2020
by Yannis Pantazis, et al.

Despite continuous improvements to Generative Adversarial Networks (GANs), stability and performance challenges remain. In this work, we propose a novel loss function for GAN training, aiming at both a deeper theoretical understanding and improved performance of the underlying optimization problem. The new loss function is based on cumulant generating functions and relies on a recently derived variational formula. We show that the corresponding optimization is equivalent to Rényi divergence minimization, thus offering a (partially) unified perspective on GAN losses: the Rényi family encompasses the Kullback-Leibler divergence (KLD), reverse KLD, the Hellinger distance, and the χ²-divergence. The Wasserstein loss function is also included in the proposed cumulant GAN formulation. In terms of stability, we rigorously prove convergence of the gradient descent algorithm for a linear generator and a linear discriminator on Gaussian distributions. Moreover, we show numerically that synthetic image generation trained on the CIFAR-10 dataset is substantially improved in terms of Inception Score when weaker discriminators are used.
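For reference, the quantities named in the abstract have standard definitions; the following is a minimal sketch in our own notation (the paper's exact loss and parametrization are given in the full text). The cumulant generating function of a discriminator D under a distribution Q is

\[ \Lambda_{Q,D}(\beta) = \log \mathbb{E}_{x\sim Q}\big[e^{\beta D(x)}\big], \]

and the Rényi divergence of order \alpha between P and Q (with densities p, q) is

\[ R_\alpha(P\,\|\,Q) = \frac{1}{\alpha-1}\,\log \mathbb{E}_{x\sim Q}\!\Big[\big(p(x)/q(x)\big)^{\alpha}\Big], \qquad \alpha>0,\ \alpha\neq 1. \]

Special cases: \alpha\to 1 recovers \mathrm{KL}(P\,\|\,Q); \alpha=1/2 equals -2\log\!\big(1-H^2(P,Q)\big), a monotone function of the Hellinger distance; \alpha=2 equals \log\!\big(1+\chi^2(P\,\|\,Q)\big). A cumulant-based variational representation of the KL case is the Donsker-Varadhan formula,

\[ \mathrm{KL}(P\,\|\,Q) = \sup_{D}\Big\{\mathbb{E}_{x\sim P}[D(x)] - \log \mathbb{E}_{x\sim Q}\big[e^{D(x)}\big]\Big\}, \]

whose second term is exactly a cumulant generating function evaluated at the discriminator; the cumulant GAN loss builds on variational formulas of this type.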


Related research

06/03/2020 · Rényi Generative Adversarial Networks
We propose a loss function for generative adversarial networks (GANs) us...

11/11/2020 · (f,Γ)-Divergences: Interpolating between f-Divergences and Integral Probability Metrics
We develop a general framework for constructing new information-theoreti...

05/12/2022 · α-GAN: Convergence and Estimation Guarantees
We prove a two-way correspondence between the min-max optimization of ge...

02/13/2018 · First Order Generative Adversarial Networks
GANs excel at learning high dimensional distributions, but they can upda...

10/20/2019 · Learning GANs and Ensembles Using Discrepancy
Generative adversarial networks (GANs) generate data based on minimizing...

11/06/2017 · KGAN: How to Break The Minimax Game in GAN
Generative Adversarial Networks (GANs) were intuitively and attractively...

01/26/2017 · Wasserstein GAN
We introduce a new algorithm named WGAN, an alternative to traditional G...
