Manifold Topology Divergence: a Framework for Comparing Data Manifolds

06/08/2021
by Serguei Barannikov et al.

We develop a framework for comparing data manifolds, aimed in particular at the evaluation of deep generative models. We describe a novel tool, Cross-Barcode(P,Q), that, given a pair of distributions in a high-dimensional space, tracks multiscale topological spatial discrepancies between the manifolds on which the distributions are concentrated. Based on the Cross-Barcode, we introduce the Manifold Topology Divergence score (MTop-Divergence) and apply it to assess the performance of deep generative models across several domains (images, 3D shapes, time series) and datasets: MNIST, Fashion-MNIST, SVHN, CIFAR10, FFHQ, chest X-ray images, market stock data, and ShapeNet. We demonstrate that the MTop-Divergence accurately detects various degrees of mode-dropping, intra-mode collapse, mode invention, and image disturbance. Our algorithm scales essentially linearly with the dimension of the ambient high-dimensional space. It is one of the first TDA-based practical methodologies that can be applied universally to datasets of different sizes and dimensions, including those on which the most recent GANs in the visual domain are trained. The proposed method is domain-agnostic and does not rely on pre-trained networks.
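To make the construction concrete, here is a minimal, simplified sketch of the degree-0 case. In the Cross-Barcode(P,Q) filtration, all pairwise distances inside the sample Q are set to zero, which at scale 0 glues Q into a single component; the H0 bars then record how far points of P sit from Q's manifold, and a toy MTop-Divergence-style score can sum their lengths. The function names (`h0_cross_bars`, `mtop_div_h0`) are illustrative, not the authors' implementation, and the paper's actual score uses higher-degree homology as well.

```python
import numpy as np
from scipy.sparse.csgraph import minimum_spanning_tree
from scipy.spatial.distance import cdist

def h0_cross_bars(P, Q):
    """Degree-0 bars of a Cross-Barcode-style filtration (simplified sketch).

    Because all intra-Q distances are zeroed, Q merges into one vertex at
    scale 0, so we can equivalently build a graph on P plus one virtual
    vertex whose distance to each p in P is d(p, Q) = min_q d(p, q).
    The H0 death times are then the edge weights of that graph's MST
    (every H0 bar is born at 0).
    """
    d_PP = cdist(P, P)                          # distances within P
    d_PQ = cdist(P, Q).min(axis=1, keepdims=True)  # distance of each p to Q
    n = len(P)
    D = np.zeros((n + 1, n + 1))
    D[:n, :n] = d_PP
    D[:n, n:] = d_PQ
    D[n:, :n] = d_PQ.T
    mst = minimum_spanning_tree(D).toarray()
    return np.sort(mst[mst > 0])                # sorted H0 death times

def mtop_div_h0(P, Q):
    """Toy MTop-Divergence-style score: total length of the H0 bars."""
    return h0_cross_bars(P, Q).sum()
```

When P is concentrated near Q's manifold, every point of P dies quickly into the glued Q component and the score is small; mode-dropped or disturbed samples leave long bars and inflate it. Note the score is deliberately asymmetric in P and Q, matching the Cross-Barcode construction.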


