Classification Logit Two-sample Testing by Neural Networks

09/25/2019
by   Xiuyuan Cheng, et al.

The recent success of generative adversarial networks and variational learning suggests that training a classifier network may work well for the classical two-sample problem. Network-based tests have the computational advantage that the algorithm scales to large samples. This paper proposes using the difference of the logits of a trained neural network classifier, evaluated on the two finite samples, as the test statistic. Theoretically, we prove the testing power to differentiate two smooth densities, given that the network is sufficiently parametrized, by comparing the learned logit function to the log ratio of the densities, which maximizes the population training objective. When the two densities lie on or near low-dimensional manifolds embedded in possibly high-dimensional space, the required network complexity depends only on the intrinsic manifold geometry. In experiments, the method demonstrates better performance than previous network-based tests that use classification accuracy as the test statistic, and compares favorably to certain kernel maximum mean discrepancy (MMD) tests on synthetic and handwritten-digit datasets.
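The classifier-logit statistic can be illustrated with a minimal sketch: train a classifier on half of each sample, then compare the mean logits of the held-out halves, calibrating the threshold with a permutation null. The sketch below substitutes plain logistic regression for a deep network and uses synthetic Gaussian data; all function names and hyperparameters are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def train_logistic(X, y, lr=0.1, epochs=200):
    # Logistic regression stands in for the "classifier network":
    # its logit is a linear function w.x + b.
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        p = 1.0 / (1.0 + np.exp(-(X @ w + b)))  # sigmoid probabilities
        g = p - y                               # cross-entropy gradient wrt logits
        w -= lr * X.T @ g / len(y)
        b -= lr * g.mean()
    return w, b

def logit_stat(Xp, Xq, w, b):
    # Test statistic: difference of mean logits on the two samples.
    return (Xp @ w + b).mean() - (Xq @ w + b).mean()

# Two samples: standard normal vs. mean-shifted normal (illustrative data).
n, d = 500, 5
Xp = rng.normal(0.0, 1.0, (n, d))
Xq = rng.normal(0.5, 1.0, (n, d))

# Split each sample into a training half and a held-out test half.
Xp_tr, Xp_te = Xp[: n // 2], Xp[n // 2 :]
Xq_tr, Xq_te = Xq[: n // 2], Xq[n // 2 :]

X_tr = np.vstack([Xp_tr, Xq_tr])
y_tr = np.concatenate([np.ones(n // 2), np.zeros(n // 2)])
w, b = train_logistic(X_tr, y_tr)

T = logit_stat(Xp_te, Xq_te, w, b)

# Permutation null: shuffle the held-out points between the two groups.
pooled = np.vstack([Xp_te, Xq_te])
null = []
for _ in range(200):
    idx = rng.permutation(len(pooled))
    null.append(logit_stat(pooled[idx[: n // 2]], pooled[idx[n // 2 :]], w, b))
p_value = (np.sum(np.array(null) >= T) + 1) / (len(null) + 1)
print(f"statistic={T:.3f}  p-value={p_value:.4f}")
```

Under the alternative shown (a mean shift), the learned logit separates the two samples, so the statistic is large relative to the permutation null and the null hypothesis is rejected at conventional levels.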


Related research:

- 06/06/2021 · Neural Tangent Kernel Maximum Mean Discrepancy: We present a novel neural network Maximum Mean Discrepancy (MMD) statist...
- 05/07/2021 · Kernel MMD Two-Sample Tests for Manifold Data: We present a study of kernel MMD two-sample test statistics in the manif...
- 02/10/2021 · An Optimal Witness Function for Two-Sample Testing: We propose data-dependent test statistics based on a one-dimensional wit...
- 05/04/2022 · A Manifold Two-Sample Test Study: Integral Probability Metric with Neural Networks: Two-sample tests are important areas aiming to determine whether two col...
- 09/17/2019 · Two-Sample Test Based on Classification Probability: Robust classification algorithms have been developed in recent years wit...
- 05/31/2018 · Ratio Matching MMD Nets: Low dimensional projections for effective deep generative models: Deep generative models can learn to generate realistic-looking images on...
- 09/05/2023 · Maximum Mean Discrepancy Meets Neural Networks: The Radon-Kolmogorov-Smirnov Test: Maximum mean discrepancy (MMD) refers to a general class of nonparametri...
