Smoothing the Generative Latent Space with Mixup-based Distance Learning

11/23/2021
by Chaerin Kong, et al.

Producing diverse and realistic images with generative models such as GANs typically requires large-scale training on vast amounts of images. GANs trained with extremely limited data easily overfit to the few training samples and display undesirable properties such as a "stairlike" latent space, where latent-space transitions suffer from discontinuities that occasionally yield abrupt changes in the output. In this work, we consider the situation where neither a large-scale dataset of interest nor a transferable source dataset is available, and we seek to train existing generative models with minimal overfitting and mode collapse. We propose a latent mixup-based distance regularization on the feature spaces of both the generator and the counterpart discriminator that encourages the two players to reason not only about the scarce observed data points but also about the relative distances in the feature space where they reside. Qualitative and quantitative evaluation on diverse datasets demonstrates that our method is generally applicable to existing models and enhances both fidelity and diversity under the constraint of limited data. Code will be made public.
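To make the idea concrete, below is a minimal PyTorch sketch of one plausible form of such a regularizer, not the paper's exact loss: two latent codes are mixed with a per-sample coefficient lam, and a KL term pushes the softmax-normalized feature-space similarities of the mixed output toward (lam, 1 - lam). The names G, feature_fn, and the temperature tau are placeholders assumed for illustration.

```python
import torch
import torch.nn.functional as F

def mixup_distance_loss(feat_mix, feat_a, feat_b, lam, tau=0.1):
    """KL divergence between the mixup coefficients (lam, 1 - lam) and the
    softmax-normalized similarities of the mixed sample's feature to the
    two endpoint features."""
    sim_a = F.cosine_similarity(feat_mix, feat_a, dim=-1)           # (B,)
    sim_b = F.cosine_similarity(feat_mix, feat_b, dim=-1)           # (B,)
    log_q = F.log_softmax(torch.stack([sim_a, sim_b], dim=-1) / tau, dim=-1)
    p = torch.stack([lam, 1.0 - lam], dim=-1)                       # target distribution
    return F.kl_div(log_q, p, reduction="batchmean")

def latent_mixup_regularizer(G, feature_fn, batch_size, z_dim, device="cpu"):
    """One regularization term: sample two latents per batch item, mix them,
    and align feature-space distances with the mixing coefficient.
    feature_fn is a hypothetical hook returning intermediate features,
    e.g. a generator layer's activations or the discriminator's
    penultimate features."""
    z_a = torch.randn(batch_size, z_dim, device=device)
    z_b = torch.randn(batch_size, z_dim, device=device)
    lam = torch.rand(batch_size, device=device)                     # per-sample coefficient
    z_mix = lam.unsqueeze(1) * z_a + (1.0 - lam.unsqueeze(1)) * z_b

    f_a, f_b = feature_fn(G(z_a)), feature_fn(G(z_b))
    f_mix = feature_fn(G(z_mix))
    return mixup_distance_loss(f_mix.flatten(1), f_a.flatten(1),
                               f_b.flatten(1), lam)
```

Consistent with the abstract's description of regularizing both players, the same term could be added to the training objective twice: once with generator features and once with discriminator features of the generated images.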
