Understanding Entropic Regularization in GANs

11/02/2021
by Daria Reshetova et al.

Generative Adversarial Networks (GANs) are a popular method for learning distributions from data by modeling the target distribution as a function of a known distribution. The function, often referred to as the generator, is optimized to minimize a chosen distance measure between the generated and target distributions. One commonly used measure for this purpose is the Wasserstein distance. However, the Wasserstein distance is hard to compute and optimize, so in practice entropic regularization techniques are used to improve numerical convergence. The influence of regularization on the learned solution, however, remains poorly understood. In this paper, we study how several popular entropic regularizations of the Wasserstein distance affect the solution in a simple benchmark setting where the generator is linear and the target distribution is a high-dimensional Gaussian. We show that entropy regularization promotes sparsification of the solution, while replacing the Wasserstein distance with the Sinkhorn divergence recovers the unregularized solution. Both regularization techniques remove the curse of dimensionality suffered by the Wasserstein distance. We show that the optimal generator can be learned to accuracy ϵ with O(1/ϵ^2) samples from the target distribution. We thus conclude that these regularization techniques can improve the quality of the generator learned from empirical data for a large class of distributions.
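Although the paper's results are theoretical, the two objects it compares are easy to state computationally. Below is a minimal NumPy sketch (not the authors' code) of the entropy-regularized transport cost between empirical samples, computed with the standard Sinkhorn matrix-scaling iterations, together with one common debiased variant of the Sinkhorn divergence built from it. The function names, the squared-Euclidean cost, and the uniform marginals are illustrative assumptions.

```python
import numpy as np

def sinkhorn_cost(x, y, eps=0.1, n_iters=200):
    """Entropy-regularized transport cost between empirical samples.

    x: (n, d) array, y: (m, d) array; eps is the entropic
    regularization strength. Illustrative sketch, not the paper's code.
    """
    n, m = len(x), len(y)
    # Squared Euclidean cost matrix C[i, j] = ||x_i - y_j||^2
    C = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    K = np.exp(-C / eps)          # Gibbs kernel
    a = np.full(n, 1.0 / n)       # uniform source marginal
    b = np.full(m, 1.0 / m)       # uniform target marginal
    u = np.ones(n)
    v = np.ones(m)
    # Sinkhorn iterations: alternate scaling to match the marginals
    for _ in range(n_iters):
        u = a / (K @ v)
        v = b / (K.T @ u)
    P = u[:, None] * K * v[None, :]  # regularized transport plan
    return np.sum(P * C)             # transport cost under the plan

def sinkhorn_divergence(x, y, eps=0.1):
    """Debiased divergence: S(x, y) - (S(x, x) + S(y, y)) / 2.

    The debiasing terms cancel the entropic bias, so the divergence
    of a sample against itself is zero.
    """
    return (sinkhorn_cost(x, y, eps)
            - 0.5 * sinkhorn_cost(x, x, eps)
            - 0.5 * sinkhorn_cost(y, y, eps))
```

For small eps the kernel K underflows and a log-domain implementation is needed in practice; the sketch keeps the plain scaling form for readability.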



Related Research

03/09/2020
When can Wasserstein GANs minimize Wasserstein Distance?
Generative Adversarial Networks (GANs) are widely used models to learn c...

03/29/2018
Generative Modeling using the Sliced Wasserstein Distance
Generative Adversarial Nets (GANs) are very successful at modeling distr...

12/19/2021
Wasserstein Generative Learning of Conditional Distribution
Conditional distribution is a fundamental quantity for describing the re...

12/17/2017
Wasserstein Distributional Robustness and Regularization in Statistical Learning
A central question in statistical learning is to design algorithms that ...

04/01/2019
Optimal Fusion of Elliptic Extended Target Estimates based on the Wasserstein Distance
This paper considers the fusion of multiple estimates of a spatially ext...

06/18/2018
Banach Wasserstein GAN
Wasserstein Generative Adversarial Networks (WGANs) can be used to gener...

12/16/2017
On reproduction of On the regularization of Wasserstein GANs
This report has several purposes. First, our report is written to invest...