
Manifold Topology Divergence: a Framework for Comparing Data Manifolds

by Serguei Barannikov, et al.

We develop a framework for comparing data manifolds, aimed in particular at the evaluation of deep generative models. We describe a novel tool, Cross-Barcode(P,Q), that, given a pair of distributions in a high-dimensional space, tracks multiscale topological discrepancies between the manifolds on which the distributions are concentrated. Based on the Cross-Barcode, we introduce the Manifold Topology Divergence score (MTop-Divergence) and apply it to assess the performance of deep generative models in various domains: images, 3D shapes, time series, and on different datasets: MNIST, Fashion MNIST, SVHN, CIFAR10, FFHQ, chest X-ray images, market stock data, ShapeNet. We demonstrate that the MTop-Divergence accurately detects various degrees of mode-dropping, intra-mode collapse, mode invention, and image disturbance. Our algorithm scales essentially linearly with the dimension of the ambient high-dimensional space. It is one of the first TDA-based practical methodologies that can be applied universally to datasets of different sizes and dimensions, including the ones on which the most recent GANs in the visual domain are trained. The proposed method is domain agnostic and does not rely on pre-trained networks.
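To make the construction concrete, here is a minimal, H0-only sketch of the Cross-Barcode idea. In the filtration on the union of the two point clouds, all pairwise distances inside Q are set to zero, so Q forms a single connected component from the start; for 0-dimensional homology this is equivalent to contracting Q to one vertex and reading the component death times off a minimum spanning tree. The function names (`h0_cross_barcode`, `mtop_div_h0`) are illustrative, not the paper's API, and the actual method also uses higher-dimensional homology computed with persistent-homology software.

```python
import numpy as np
from scipy.spatial.distance import cdist


def mst_edge_lengths(D):
    """Sorted edge lengths of a minimum spanning tree (Prim's algorithm)
    over the complete graph with symmetric distance matrix D."""
    n = D.shape[0]
    in_tree = np.zeros(n, dtype=bool)
    in_tree[0] = True
    dist = D[0].copy()          # cheapest known edge from the tree to each vertex
    edges = []
    for _ in range(n - 1):
        dist[in_tree] = np.inf  # never re-add a tree vertex
        j = int(np.argmin(dist))
        edges.append(dist[j])
        in_tree[j] = True
        dist = np.minimum(dist, D[j])
    return np.sort(np.asarray(edges))


def h0_cross_barcode(P, Q):
    """Toy H0 part of Cross-Barcode(P, Q).

    Intra-Q distances are zero in the filtration, so Q is contracted to a
    single vertex whose distance to each p in P is min_q d(p, q).  Each MST
    edge length is then the death time of a component born at scale 0.
    """
    dPP = cdist(P, P)
    dPQ = cdist(P, Q).min(axis=1)   # distance from each p to the cloud Q
    n = len(P) + 1
    D = np.zeros((n, n))
    D[:-1, :-1] = dPP
    D[:-1, -1] = dPQ
    D[-1, :-1] = dPQ
    return mst_edge_lengths(D)


def mtop_div_h0(P, Q):
    """Total H0 bar length: a simplified, MTop-Divergence-style score.
    It vanishes when P coincides with Q and grows as the clouds diverge."""
    return float(h0_cross_barcode(P, Q).sum())
```

When the two clouds coincide, every bar has zero length and the score is 0; shifting Q away from P makes the bars (and the score) grow, which is the behavior the full MTop-Divergence exploits at all homology degrees.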




The Riemannian Geometry of Deep Generative Models

Deep generative models learn a mapping from a low dimensional latent spa...

Intrinsic Multi-scale Evaluation of Generative Models

Generative models are often used to sample high-dimensional data points ...

A likelihood approach to nonparametric estimation of a singular distribution using deep generative models

We investigate statistical properties of a likelihood approach to nonpar...

Diagnosing and Fixing Manifold Overfitting in Deep Generative Models

Likelihood-based, or explicit, deep generative models use neural network...

The Union of Manifolds Hypothesis and its Implications for Deep Generative Modelling

Deep learning has had tremendous success at learning low-dimensional rep...

Spread Divergences

For distributions p and q with different support, the divergence general...

DOI: Divergence-based Out-of-Distribution Indicators via Deep Generative Models

To ensure robust and reliable classification results, OoD (out-of-distri...