Flows Succeed Where GANs Fail: Lessons from Low-Dimensional Data

06/17/2020
by Tianci Liu, et al.

Normalizing flows and generative adversarial networks (GANs) are both approaches to density estimation that use deep neural networks to transform samples from an uninformative prior distribution to an approximation of the data distribution. There is great interest in both for general-purpose statistical modeling, but the two approaches have seldom been compared to each other for modeling non-image data. The difficulty of computing likelihoods with GANs, which are implicit models, makes conducting such a comparison challenging. We work around this difficulty by considering several low-dimensional synthetic datasets. An extensive grid search over GAN architectures, hyperparameters, and training procedures suggests that no GAN is capable of modeling our simple low-dimensional data well, a task we view as a prerequisite for an approach to be considered suitable for general-purpose statistical modeling. Several normalizing flows, on the other hand, excelled at these tasks, even substantially outperforming WGAN in terms of Wasserstein distance—the metric that WGAN alone targets. Overall, normalizing flows appear to be more reliable tools for statistical inference than GANs.
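The change-of-variables rule underlying normalizing flows can be illustrated with a minimal one-dimensional sketch (our own illustration, not the paper's code): if z is drawn from a standard-normal prior and x = f(z) for an invertible f, then p_x(x) = p_z(f⁻¹(x)) · |d f⁻¹(x)/dx|, which is exactly the tractable likelihood that implicit models such as GANs lack. Here f is a simple affine map; the function names are our own.

```python
import numpy as np

def standard_normal_pdf(z):
    """Density of the N(0, 1) prior."""
    return np.exp(-0.5 * z**2) / np.sqrt(2.0 * np.pi)

def affine_flow_density(x, scale=2.0, shift=1.0):
    """Exact density of x = scale * z + shift, z ~ N(0, 1),
    via the change-of-variables formula."""
    z = (x - shift) / scale            # inverse transform f^{-1}(x)
    log_det = -np.log(abs(scale))      # log |d f^{-1}(x) / dx|
    return standard_normal_pdf(z) * np.exp(log_det)

# Evaluate on a grid; the result is the N(shift, scale^2) density,
# so its numerical integral over a wide interval is close to 1.
x = np.linspace(-4.0, 6.0, 201)
p = affine_flow_density(x)
print(float(p.sum() * (x[1] - x[0])))
```

Flows compose many such invertible maps while keeping the Jacobian term cheap to evaluate, which is what makes exact likelihood comparisons (as in the paper's grid search) possible.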


