Intrinsic Multi-scale Evaluation of Generative Models

by Anton Tsitsulin et al.

Generative models are often used to sample high-dimensional data points from a manifold with small intrinsic dimension. Existing techniques for comparing generative models focus on global data properties such as mean and covariance; in that sense, they are extrinsic and uni-scale. We develop the first, to our knowledge, intrinsic and multi-scale method for characterizing and comparing underlying data manifolds, based on comparing all data moments by lower-bounding the spectral notion of the Gromov-Wasserstein distance between manifolds. In a thorough experimental study, we demonstrate that our method effectively evaluates the quality of generative models; further, we showcase its efficacy in discerning the disentanglement process in neural networks.
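The intrinsic, multi-scale comparison described above can be illustrated with a small sketch: build a k-nearest-neighbor graph over each sample, take the normalized graph Laplacian as a proxy for the manifold's Laplace–Beltrami operator, and compare heat-kernel traces h(t) = tr(exp(-tL)) across a range of diffusion times t. This is a minimal illustration under stated assumptions, not the authors' implementation; the function names (`heat_trace`, `msid_score`), the choice of k, and the grid of t values are all assumptions for the example.

```python
# Sketch of a multi-scale intrinsic comparison of two point clouds,
# in the spirit of the method described above (assumptions: kNN graph,
# normalized Laplacian, heat-trace descriptor; names are illustrative).
import numpy as np
from scipy.spatial.distance import cdist

def heat_trace(points, k=5, ts=np.logspace(-1, 1, 20)):
    """Normalized heat-kernel trace h(t) = tr(exp(-t L)) / n of a
    symmetric kNN-graph Laplacian, evaluated at each scale t in ts."""
    n = len(points)
    d = cdist(points, points)
    # symmetric kNN adjacency (column 0 of argsort is the point itself)
    idx = np.argsort(d, axis=1)[:, 1:k + 1]
    A = np.zeros((n, n))
    A[np.repeat(np.arange(n), k), idx.ravel()] = 1.0
    A = np.maximum(A, A.T)
    # normalized Laplacian L = I - D^{-1/2} A D^{-1/2}
    dinv = 1.0 / np.sqrt(np.maximum(A.sum(axis=1), 1e-12))
    L = np.eye(n) - dinv[:, None] * A * dinv[None, :]
    eigs = np.linalg.eigvalsh(L)
    # small t probes local structure, large t probes global structure
    return np.array([np.exp(-t * eigs).sum() / n for t in ts])

def msid_score(x, y, **kw):
    """Distance between the multi-scale heat-trace descriptors of x and y
    (0 means the descriptors coincide at every scale)."""
    return np.sum(np.abs(heat_trace(x, **kw) - heat_trace(y, **kw)))
```

Because the descriptor depends only on pairwise distances within each sample, the comparison is intrinsic (no alignment between the two spaces is needed), and sweeping t makes it multi-scale rather than a single global statistic.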


The Riemannian Geometry of Deep Generative Models

Deep generative models learn a mapping from a low dimensional latent spa...

Heuristic Framework for Multi-Scale Testing of the Multi-Manifold Hypothesis

When analyzing empirical data, we often find that global linear models o...

Manifold Topology Divergence: a Framework for Comparing Data Manifolds

We develop a framework for comparing data manifolds, aimed, in particula...

Learning Implicit Generative Models with the Method of Learned Moments

We propose a method of moments (MoM) algorithm for training large-scale ...

Learning Generative Models across Incomparable Spaces

Generative Adversarial Networks have shown remarkable success in learnin...

Curriculum Learning for Deep Generative Models with Clustering

Training generative models like generative adversarial networks (GANs) a...

Generative Models as Distributions of Functions

Generative models are typically trained on grid-like data such as images...

Code Repositories


Code for MSID, a Multi-Scale Intrinsic Distance for comparing generative models, studying neural networks, and more!