Statistical guarantees for generative models without domination

10/19/2020
by Nicolas Schreuder, et al.

In this paper, we introduce a convenient framework for studying (adversarial) generative models from a statistical perspective. The framework models the generative device as a smooth transformation of a unit hypercube whose dimension is much smaller than that of the ambient space, and measures the quality of the generative model by means of an integral probability metric. In the particular case of an integral probability metric defined through a smoothness class, we establish a risk bound quantifying the role of the various parameters. In particular, it clearly shows the impact of dimension reduction on the error of the generative model.
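For readers unfamiliar with the terminology, here is a minimal sketch of the two objects the abstract refers to, written in standard notation rather than the paper's own: the integral probability metric attached to a function class F, and the generative model as a push-forward of the uniform distribution on a low-dimensional hypercube. The symbols g, d and D below are illustrative assumptions, not notation taken from the paper.

% Integral probability metric (IPM) over a function class \mathcal{F}
% (standard definition; \mathcal{F} is a generic class of test functions):
\[
  d_{\mathcal{F}}(\mu, \nu)
    \;=\; \sup_{f \in \mathcal{F}}
      \Bigl| \mathbb{E}_{X \sim \mu}[f(X)] - \mathbb{E}_{Y \sim \nu}[f(Y)] \Bigr| .
\]
% Generative device as a smooth transformation of the unit hypercube:
% g maps a d-dimensional cube into the D-dimensional ambient space, d << D,
% and the model distribution is the law of g(U) for U uniform on [0,1]^d.
\[
  \mu_{g} \;=\; g_{\#}\,\mathrm{Unif}\bigl([0,1]^{d}\bigr),
  \qquad g : [0,1]^{d} \to \mathbb{R}^{D}, \quad d \ll D .
\]

When \mathcal{F} is a smoothness class (for instance, a ball of Hölder-smooth functions), the supremum compares expectations of smooth test functions under the target and model distributions; this is the setting in which the paper's risk bound is stated.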


