Can Push-forward Generative Models Fit Multimodal Distributions?

06/29/2022
by Antoine Salmona, et al.

Many generative models synthesize data by transforming a standard Gaussian random variable with a deterministic neural network. Among these models are the Variational Autoencoder and the Generative Adversarial Network. In this work, we call them "push-forward" models and study their expressivity. We show that the Lipschitz constant of these generative networks has to be large in order to fit multimodal distributions. More precisely, we show that the total variation distance and the Kullback-Leibler divergence between the generated and the data distributions are bounded from below by a constant depending on the mode separation and the Lipschitz constant. Since constraining the Lipschitz constant of neural networks is a common way to stabilize generative models, there is a provable trade-off between the ability of push-forward models to approximate multimodal distributions and the stability of their training. We validate our findings on one-dimensional and image datasets, and empirically show that generative models consisting of stacked networks with stochastic input at each step, such as diffusion models, do not suffer from such limitations.
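The abstract's central claim can be illustrated numerically in one dimension. The sketch below is not from the paper: it uses a hypothetical Lipschitz-L generator g(z) = a·tanh(Lz/a), which pushes a standard Gaussian toward a bimodal distribution with modes near ±a, and measures how much probability mass remains stuck in the "valley" between the modes. A small Lipschitz constant forces a large valley mass, which is exactly the kind of quantity that lower-bounds the total variation distance to a well-separated bimodal target.

```python
import numpy as np

rng = np.random.default_rng(0)
z = rng.standard_normal(100_000)  # latent samples from N(0, 1)

def pushforward_valley_mass(L, a=2.0):
    """Push N(0,1) through g(z) = a*tanh(L*z/a), whose Lipschitz
    constant is exactly L, and return the fraction of generated
    samples left in the valley (|x| < a/2) between the two modes
    near -a and +a."""
    x = a * np.tanh(L * z / a)
    return np.mean(np.abs(x) < a / 2)

for L in [1.0, 5.0, 50.0]:
    print(f"L = {L:5.1f}  valley mass = {pushforward_valley_mass(L):.3f}")
```

Running this shows the valley mass shrinking roughly like 1/L: only a generator with a large Lipschitz constant can tear the unimodal Gaussian apart into two well-separated modes, consistent with the paper's lower bound.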


Related research

- Provable Lipschitz Certification for Generative Models (07/06/2021). We present a scalable technique for upper bounding the Lipschitz constan...
- f-GAN: Training Generative Neural Samplers using Variational Divergence Minimization (06/02/2016). Generative neural samplers are probabilistic models that implement sampl...
- Theoretical Insights into the Use of Structural Similarity Index In Generative Models and Inferential Autoencoders (04/04/2020). Generative models and inferential autoencoders mostly make use of ℓ_2 no...
- A Study on the Evaluation of Generative Models (06/22/2022). Implicit generative models, which do not return likelihood values, such ...
- Learning Generative Models across Incomparable Spaces (05/14/2019). Generative Adversarial Networks have shown remarkable success in learnin...
- Clustering Meets Implicit Generative Models (04/30/2018). Clustering is a cornerstone of unsupervised learning which can be though...
- On Kinetic Optimal Probability Paths for Generative Models (06/11/2023). Recent successful generative models are trained by fitting a neural netw...
