CcGAN: Continuous Conditional Generative Adversarial Networks for Image Generation

11/15/2020
by Xin Ding, et al.

This work proposes the continuous conditional generative adversarial network (CcGAN), the first generative model for image generation conditional on continuous, scalar conditions (termed regression labels). Existing conditional GANs (cGANs) are mainly designed for categorical conditions (e.g., class labels); conditioning on regression labels is mathematically distinct and raises two fundamental problems: (P1) Since there may be very few (even zero) real images for some regression labels, minimizing existing empirical versions of cGAN losses (a.k.a. empirical cGAN losses) often fails in practice; (P2) Since regression labels are scalar and infinitely many, conventional label input methods are not applicable. The proposed CcGAN solves the above problems, respectively, by (S1) reformulating existing empirical cGAN losses to be appropriate for the continuous scenario; and (S2) proposing a naive label input (NLI) method and an improved label input (ILI) method to incorporate regression labels into the generator and the discriminator. The reformulation in (S1) leads to two novel empirical discriminator losses, termed the hard vicinal discriminator loss (HVDL) and the soft vicinal discriminator loss (SVDL) respectively, and a novel empirical generator loss. The error bounds of a discriminator trained with HVDL and SVDL are derived under mild assumptions in this work. Two new benchmark datasets (RC-49 and Cell-200) and a novel evaluation metric (Sliding Fréchet Inception Distance) are also proposed for this continuous scenario. Our experiments on the Circular 2-D Gaussians, RC-49, UTKFace, Cell-200, and Steering Angle datasets show that CcGAN can generate diverse, high-quality samples from the image distribution conditional on a given regression label. Moreover, in these experiments, CcGAN substantially outperforms cGAN both visually and quantitatively.
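The hard and soft vicinal losses described above both amount to reweighting real samples by how close their regression label is to the target label: HVDL keeps only samples inside a fixed vicinity, while SVDL down-weights samples smoothly with distance. A minimal sketch of that weighting idea, assuming a hard-vicinity radius `kappa` and a soft-vicinity kernel parameter `nu` (both hypothetical hyperparameter names, not the paper's exact formulation):

```python
import numpy as np

def vicinal_weights(labels, target, kappa=0.05, nu=None, soft=False):
    """Per-sample weights for conditioning on a target regression label.

    Hard vicinity (HVDL-style): uniform weight on samples whose label lies
    within +/- kappa of the target, zero elsewhere.
    Soft vicinity (SVDL-style): Gaussian-kernel weight exp(-nu * (y_i - y)^2),
    so every sample contributes, but distant labels contribute little.
    This is an illustrative sketch, not the paper's exact losses.
    """
    d = np.asarray(labels, dtype=float) - target
    if soft:
        nu = 1.0 / (kappa ** 2) if nu is None else nu
        return np.exp(-nu * d ** 2)
    return (np.abs(d) <= kappa).astype(float)
```

With `labels = [0.0, 0.1, 0.5]` and `target = 0.1`, the hard weights select only the middle sample, while the soft weights peak at the middle sample and decay for the others; this is how both losses sidestep (P1), since a target label with no exactly matching real image can still borrow nearby samples.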


Related research

- 08/20/2023 · Turning Waste into Wealth: Leveraging Low-Quality Samples for Enhancing Continuous Conditional Generative Adversarial Networks
  Continuous Conditional Generative Adversarial Networks (CcGANs) enable g...

- 01/21/2018 · Decoupled Learning for Conditional Adversarial Networks
  Incorporating encoding-decoding nets with adversarial nets has been wide...

- 08/20/2021 · Dual Projection Generative Adversarial Networks for Conditional Image Generation
  Conditional Generative Adversarial Networks (cGANs) extend the standard ...

- 06/28/2021 · Are conditional GANs explicitly conditional?
  This paper proposes two important contributions for conditional Generati...

- 03/20/2021 · Efficient Subsampling for Generating High-Quality Images from Conditional Generative Adversarial Networks
  Subsampling unconditional generative adversarial networks (GANs) to impr...

- 11/12/2018 · Adversarial Learning of Label Dependency: A Novel Framework for Multi-class Classification
  Recent work has shown that exploiting relations between labels improves ...

- 05/28/2019 · JGAN: A Joint Formulation of GAN for Synthesizing Images and Labels
  Image generation with explicit condition or label generally works better...
