No Modes left behind: Capturing the data distribution effectively using GANs

02/02/2018
by Shashank Sharma, et al.

Generative adversarial networks (GANs), while versatile in realistic image synthesis, remain sensitive to the input distribution. Given data with an imbalanced distribution, the networks are susceptible to missing modes and failing to capture the full data distribution. Although various methods have been proposed to improve GAN training, they do not address the challenge of covering the full data distribution; in particular, the generator is not penalized for missing a mode, so these methods remain susceptible to mode dropping. In this paper, we propose a simple approach that combines an encoder-based objective with novel loss functions for the generator and discriminator, improving the solution in terms of capturing missing modes. We validate the proposed method through detailed analysis on toy and real datasets. The quantitative and qualitative results demonstrate that it alleviates the problem of missing modes and improves GAN training.
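To make the idea of an encoder-based objective concrete, the sketch below shows one common way such a term can be attached to a standard GAN on an imbalanced 2D toy mixture. It is a minimal illustration only: the network sizes, the weight `lam`, the toy data, and the specific reconstruction-style penalty G(E(x)) ~ x are assumptions for exposition, not the paper's actual architectures or loss functions.

```python
# Minimal PyTorch sketch: a GAN with an added encoder-based reconstruction
# term on an imbalanced two-mode toy dataset. The exact losses used in the
# paper are not reproduced here; this is an illustrative assumption.
import torch
import torch.nn as nn

def mlp(in_dim, out_dim, hidden=128):
    return nn.Sequential(
        nn.Linear(in_dim, hidden), nn.ReLU(),
        nn.Linear(hidden, hidden), nn.ReLU(),
        nn.Linear(hidden, out_dim),
    )

z_dim, x_dim = 2, 2
G = mlp(z_dim, x_dim)   # generator: z -> x
D = mlp(x_dim, 1)       # discriminator: x -> real/fake logit
E = mlp(x_dim, z_dim)   # encoder: x -> z, used to penalize missed modes

bce = nn.BCEWithLogitsLoss()
mse = nn.MSELoss()
opt_g = torch.optim.Adam(list(G.parameters()) + list(E.parameters()), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)

def sample_real(n):
    # Imbalanced mixture of two Gaussians: a common toy setup for mode dropping.
    centers = torch.tensor([[-2.0, 0.0], [2.0, 0.0]])
    idx = torch.multinomial(torch.tensor([0.9, 0.1]), n, replacement=True)
    return centers[idx] + 0.1 * torch.randn(n, 2)

lam = 1.0  # weight on the encoder/reconstruction term (hypothetical value)
for step in range(2000):
    x_real = sample_real(128)
    x_fake = G(torch.randn(128, z_dim))

    # Discriminator: standard GAN loss on real vs. generated samples.
    d_loss = (bce(D(x_real), torch.ones(128, 1))
              + bce(D(x_fake.detach()), torch.zeros(128, 1)))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator + encoder: adversarial term plus a reconstruction term
    # G(E(x_real)) ~ x_real, which penalizes the generator when it cannot
    # reproduce real samples drawn from under-represented modes.
    x_fake = G(torch.randn(128, z_dim))
    recon = G(E(x_real))
    g_loss = bce(D(x_fake), torch.ones(128, 1)) + lam * mse(recon, x_real)
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Because the reconstruction term is computed on real samples regardless of how rare their mode is, the generator receives a gradient signal for modes the adversarial objective alone might ignore; this is the intuition behind encoder-based regularizers for mode coverage.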


