GDPP: Learning Diverse Generations Using Determinantal Point Process

11/30/2018
by Mohamed Elfeki, et al.

Generative models have proven to be an outstanding tool for representing high-dimensional probability distributions and generating realistic-looking images. A fundamental characteristic of generative models is their ability to produce multi-modal outputs. During training, however, they are often susceptible to mode collapse: the model maps the input noise to only a few modes of the true data distribution. In this paper, we draw inspiration from the Determinantal Point Process (DPP) to devise a generative model that alleviates mode collapse while producing higher-quality samples. DPP is an elegant probabilistic measure used to model negative correlations within a subset and hence quantify its diversity. We use the DPP kernel to model the diversity in real data as well as in synthetic data. We then devise a generation penalty term that encourages the generator to synthesize data with diversity similar to that of the real data. In contrast to previous state-of-the-art generative models, which tend to require additional trainable parameters or complex training paradigms, our method does not change the original training scheme. Embedded in adversarial training and in a variational autoencoder, our Generative DPP approach shows consistent resistance to mode collapse on a wide variety of synthetic data and natural image datasets, including MNIST, CIFAR10, and CelebA, while outperforming state-of-the-art methods in data efficiency, convergence time, and generation quality. Our code is publicly available.
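The core idea — building a DPP kernel over a batch of features and penalizing the generator when the synthetic kernel's diversity structure deviates from the real one — can be sketched as follows. This is a minimal, illustrative numpy version, not the authors' released implementation; the feature matrices `phi_real` and `phi_fake` are assumed to come from some shared feature extractor (for example, the discriminator's last hidden layer), and the exact weighting of the two terms is a simplifying assumption.

```python
import numpy as np

def gdpp_diversity_loss(phi_real, phi_fake):
    """DPP-inspired diversity penalty (illustrative sketch, not the paper's code).

    phi_real, phi_fake: (batch, feat) feature matrices for a batch of real
    and generated samples, assumed to come from a shared feature extractor.
    """
    # DPP kernels over each batch: L = phi phi^T, a (batch x batch) Gram matrix
    # whose determinantal structure encodes the batch's diversity.
    L_real = phi_real @ phi_real.T
    L_fake = phi_fake @ phi_fake.T

    # Eigendecomposition; both kernels are symmetric positive semidefinite.
    lam_r, v_r = np.linalg.eigh(L_real)
    lam_f, v_f = np.linalg.eigh(L_fake)

    # Normalize the spectra so the two kernels are comparable in scale.
    lam_r_n = lam_r / (lam_r.max() + 1e-12)
    lam_f_n = lam_f / (lam_f.max() + 1e-12)

    # Term 1: match eigenvalue magnitudes (how "spread out" each batch is).
    mag_term = np.sum((lam_r_n - lam_f_n) ** 2)

    # Term 2: match eigenvector directions, weighted by the real spectrum,
    # so dominant diversity directions of the real data matter most.
    cos_sim = np.sum(v_r * v_f, axis=0)  # columnwise cosine similarities
    struct_term = -np.sum(lam_r_n * cos_sim)

    return mag_term + struct_term
```

In training, a penalty like this would simply be added to the generator's usual loss for each mini-batch, which is why no extra trainable parameters or changes to the training scheme are needed.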


Related research

- VARGAN: Variance Enforcing Network Enhanced GAN (09/05/2021)
- PhysioGAN: Training High Fidelity Generative Model for Physiological Sensor Readings (04/25/2022)
- Towards Mode Balancing of Generative Models via Diversity Weights (04/24/2023)
- VEEGAN: Reducing Mode Collapse in GANs using Implicit Variational Learning (05/22/2017)
- Digital phase-only holography using deep conditional generative models (11/03/2019)
- Variational Transformer Networks for Layout Generation (04/06/2021)
- Conditional out-of-sample generation for unpaired data using trVAE (10/04/2019)
