A Manifold Two-Sample Test Study: Integral Probability Metric with Neural Networks

by Jie Wang et al.

Two-sample testing is a fundamental problem: given two collections of observations, determine whether they follow the same distribution. We propose two-sample tests based on integral probability metrics (IPMs) for high-dimensional samples supported on a low-dimensional manifold. We characterize the properties of the proposed tests with respect to the number of samples n and the structure of the manifold with intrinsic dimension d. When an atlas is given, we propose a two-step test to identify differences between general distributions, which achieves a type-II risk of order n^(-1/max{d,2}). When an atlas is not given, we propose a Hölder IPM test that applies to data distributions with (s,β)-Hölder densities and achieves a type-II risk of order n^(-(s+β)/d). To mitigate the heavy computational burden of evaluating the Hölder IPM, we approximate the Hölder function class using neural networks. Based on the approximation theory of neural networks, we show that the neural network IPM test also achieves a type-II risk of order n^(-(s+β)/d), matching that of the Hölder IPM test. Our proposed tests are adaptive to low-dimensional geometric structure because their performance depends crucially on the intrinsic dimension rather than the ambient data dimension.
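The IPM between two distributions P and Q over a function class F is sup over f in F of |E_P f(X) - E_Q f(Y)|, estimated in practice by replacing expectations with sample means. As an illustration only, not the paper's actual test, the following pure-Python sketch computes an IPM-like statistic over a small, hypothetical finite class of random linear-tanh functions (a crude stand-in for a trained neural network critic); the function class, sample sizes, and seeds are all assumptions made for the example.

```python
import math
import random

def ipm_statistic(xs, ys, num_functions=64, dim=2, seed=0):
    """Empirical IPM over a finite class of random functions
    f(x) = tanh(w . x + b): returns max_f |mean_f(xs) - mean_f(ys)|.
    This finite random class is a toy proxy for the Hölder / neural
    network classes analyzed in the paper."""
    rng = random.Random(seed)
    best = 0.0
    for _ in range(num_functions):
        w = [rng.gauss(0.0, 1.0) for _ in range(dim)]
        b = rng.gauss(0.0, 1.0)

        def f(x, w=w, b=b):
            return math.tanh(sum(wi * xi for wi, xi in zip(w, x)) + b)

        gap = abs(sum(map(f, xs)) / len(xs) - sum(map(f, ys)) / len(ys))
        best = max(best, gap)
    return best

# Toy usage: two samples from the same 2-D Gaussian, and one shifted.
rng = random.Random(1)
same_a = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(500)]
same_b = [(rng.gauss(0, 1), rng.gauss(0, 1)) for _ in range(500)]
shifted = [(rng.gauss(2, 1), rng.gauss(0, 1)) for _ in range(500)]

stat_null = ipm_statistic(same_a, same_b)   # small: same distribution
stat_alt = ipm_statistic(same_a, shifted)   # large: distributions differ
```

A larger statistic indicates stronger evidence that the two samples come from different distributions; a calibrated test would compare it against a permutation or theoretical threshold.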



Related papers:

- Kernel MMD Two-Sample Tests for Manifold Data
- Shapley Homology: Topological Analysis of Sample Influence for Neural Networks
- Besov Function Approximation and Binary Classification on Low-Dimensional Manifolds Using Convolutional Residual Networks
- Classification Logit Two-sample Testing by Neural Networks
- On the capacity of deep generative networks for approximating distributions
- Tangent-based manifold approximation with locally linear models
- A Convergence Rate for Manifold Neural Networks
