
Partition-Guided GANs

04/02/2021
by Mohammadreza Armandpour, et al.

Despite the success of Generative Adversarial Networks (GANs), their training suffers from several well-known problems, including mode collapse and difficulty learning a disconnected set of manifolds. In this paper, we break down the challenging task of learning complex high-dimensional distributions that support diverse data samples into simpler sub-tasks. Our solution relies on designing a partitioner that breaks the space into smaller regions, each having a simpler distribution, and training a different generator for each partition. This is done in an unsupervised manner without requiring any labels. We formulate two desired criteria for the space partitioner that aid the training of our mixture of generators: 1) produce connected partitions and 2) provide a proxy of the distance between partitions and data samples, along with a direction for reducing that distance. These criteria are developed to avoid producing samples from regions with no data density, and also to facilitate training by providing additional direction to the generators. We develop theoretical constraints for a space partitioner to satisfy the above criteria. Guided by our theoretical analysis, we design an effective neural architecture for the space partitioner that empirically assures these conditions. Experimental results on various standard benchmarks show that the proposed unsupervised model outperforms several recent methods.
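
The core idea described above, a learned space partitioner that routes data into K regions and a separate generator per region, can be sketched roughly as follows. This is a minimal illustrative sketch in PyTorch, not the authors' implementation: the module sizes, the conditional discriminator, and the toy training loop are assumptions, and the paper's distance-proxy guide term and the unsupervised training of the partitioner itself are omitted.

```python
# Minimal illustrative sketch (PyTorch), not the authors' code: a mixture of
# generators guided by an unsupervised space partitioner. Module sizes, the
# conditional discriminator, and the toy loop are assumptions; the paper's
# distance-proxy guide term and the partitioner's own training are omitted.
import torch
import torch.nn as nn

K, Z_DIM, X_DIM = 5, 64, 2  # number of partitions, latent size, data dimension (assumed)

class Partitioner(nn.Module):
    """Assigns each sample to one of K regions, learned without labels."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(X_DIM, 128), nn.ReLU(), nn.Linear(128, K))

    def forward(self, x):
        return self.net(x)  # region logits; argmax gives the partition id

class Generator(nn.Module):
    """One generator per partition, each modeling a simpler local distribution."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(Z_DIM, 128), nn.ReLU(), nn.Linear(128, X_DIM))

    def forward(self, z):
        return self.net(z)

class Discriminator(nn.Module):
    """A single discriminator conditioned on the partition id via an embedding."""
    def __init__(self):
        super().__init__()
        self.embed = nn.Embedding(K, 16)
        self.net = nn.Sequential(nn.Linear(X_DIM + 16, 128), nn.ReLU(), nn.Linear(128, 1))

    def forward(self, x, k):
        return self.net(torch.cat([x, self.embed(k)], dim=1))

partitioner = Partitioner()                      # assumed pre-trained / fixed here
gens = nn.ModuleList(Generator() for _ in range(K))
disc = Discriminator()
opt_g = torch.optim.Adam(gens.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(disc.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

for step in range(100):                          # toy loop on synthetic data
    real = torch.randn(32, X_DIM)                # stand-in for a real data batch
    with torch.no_grad():
        k_real = partitioner(real).argmax(dim=1) # partition assignment of real samples
    k_fake = torch.randint(0, K, (32,))          # partitions to generate from
    z = torch.randn(32, Z_DIM)
    fake = torch.stack([gens[k](z_i) for k, z_i in zip(k_fake.tolist(), z)])

    # Discriminator step: real samples paired with their region vs. fakes with theirs.
    d_loss = bce(disc(real, k_real), torch.ones(32, 1)) + \
             bce(disc(fake.detach(), k_fake), torch.zeros(32, 1))
    opt_d.zero_grad(); d_loss.backward(); opt_d.step()

    # Generator step: each generator tries to fool the discriminator on its own region.
    g_loss = bce(disc(fake, k_fake), torch.ones(32, 1))
    opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Conditioning one discriminator on the partition id, rather than training K separate discriminators, is only one possible design choice; the property the sketch aims to illustrate is that each generator only has to model the simpler distribution of its own region.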

