Sliced Iterative Generator

07/01/2020
by Biwei Dai, et al.

We introduce the Sliced Iterative Generator (SIG), an iterative generative model that is a Normalizing Flow (NF) but shares the advantages of Generative Adversarial Networks (GANs). The model is based on iterative Optimal Transport along a series of 1D slices through the data space, matching on each slice the probability distribution function (PDF) of the generated samples to that of the data. To improve efficiency, the directions of the orthogonal slices at each iteration are chosen to maximize the Wasserstein distance between the PDFs of the generated samples and the data. A patch-based approach models the images hierarchically, enabling the model to scale well to high dimensions. Unlike GANs, SIG has an NF structure and allows efficient likelihood evaluation, which can be used in downstream tasks. We show that SIG is capable of generating realistic, high-dimensional image samples, achieving state-of-the-art FID scores on MNIST and Fashion-MNIST without any dimensionality reduction. It also exhibits good out-of-distribution detection based on the likelihood. To the best of our knowledge, SIG is the first iterative (greedy) deep learning algorithm that is competitive with state-of-the-art non-iterative generators in high dimensions. While SIG has a deep neural network architecture, the approach deviates significantly from the current deep learning paradigm: it does not use mini-batching, stochastic gradient descent, gradient back-propagation through deep layers, or non-convex loss optimization. SIG is largely insensitive to hyperparameter tuning, making it a useful generative tool for ML experts and non-experts alike.
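The abstract describes the core loop concretely enough to sketch: project samples and data onto a 1D slice, prefer slices where the two marginal distributions differ most in Wasserstein distance, and apply exact 1D optimal transport along that slice. Below is a minimal NumPy sketch of one such iteration, assuming equal-size sample and data sets. The function name sig_iteration and the random direction search (a stand-in for the paper's optimization of orthogonal slice directions) are illustrative assumptions, not the authors' implementation.

import numpy as np

def sig_iteration(samples, data, n_candidates=64, rng=None):
    """One SIG-style iteration on equal-size (N, D) arrays (sketch):

    1. Among random unit directions, keep the slice along which the
       1D Wasserstein-1 distance between the projected samples and
       the projected data is largest.
    2. Apply exact 1D optimal transport (quantile matching) along
       that slice, so the samples' 1D marginal matches the data's.

    NOTE: the random direction search below is a simplification of
    the paper's Wasserstein-based choice of orthogonal slices.
    """
    rng = np.random.default_rng(rng)
    n, dim = data.shape

    best_w, best_dir = -np.inf, None
    for _ in range(n_candidates):
        w = rng.standard_normal(dim)
        w /= np.linalg.norm(w)
        # For equal-size empirical 1D distributions, W1 is the mean
        # absolute difference between the sorted projections.
        wdist = np.mean(np.abs(np.sort(samples @ w) - np.sort(data @ w)))
        if wdist > best_w:
            best_w, best_dir = wdist, w

    # Exact 1D OT: the sample with the k-th smallest projection is
    # transported to the k-th smallest data projection, a monotone
    # (hence invertible) 1D map.
    s_proj = samples @ best_dir
    order = np.argsort(s_proj)
    shift = np.empty(n)
    shift[order] = np.sort(data @ best_dir) - s_proj[order]
    return samples + shift[:, None] * best_dir[None, :], best_w

A toy usage, transporting white noise onto a correlated 2D Gaussian:

rng = np.random.default_rng(0)
data = rng.standard_normal((2000, 2)) @ np.array([[2.0, 0.0], [1.0, 0.5]])
samples = rng.standard_normal((2000, 2))
for _ in range(100):
    samples, w1 = sig_iteration(samples, data, rng=rng)
print(f"W1 along the worst remaining slice: {w1:.3f}")

Because each update is a monotone 1D map along a known direction, the composed transform is invertible with a tractable Jacobian, consistent with the NF structure and efficient likelihood evaluation described in the abstract.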
