
Training generative neural networks via Maximum Mean Discrepancy optimization

We consider training a deep neural network to generate samples from an unknown distribution given i.i.d. data. We frame learning as an optimization minimizing a two-sample test statistic---informally speaking, a good generator network produces samples that cause a two-sample test to fail to reject the null hypothesis. As our two-sample test statistic, we use an unbiased estimate of the maximum mean discrepancy, which is the centerpiece of the nonparametric kernel two-sample test proposed by Gretton et al. (2012). We compare to the adversarial nets framework introduced by Goodfellow et al. (2014), in which learning is a two-player game between a generator network and an adversarial discriminator network, both trained to outwit the other. From this perspective, the MMD statistic plays the role of the discriminator. In addition to empirical comparisons, we prove bounds on the generalization error incurred by optimizing the empirical MMD.
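The loss described above is the unbiased estimate of the squared MMD between the data sample and the generator's sample. A minimal sketch of that estimator, using a Gaussian (RBF) kernel with an assumed bandwidth `sigma` (the paper's exact kernel choice and bandwidth selection are not specified here):

```python
import numpy as np

def gaussian_kernel(A, B, sigma=1.0):
    """RBF kernel matrix k(a_i, b_j) = exp(-||a_i - b_j||^2 / (2 sigma^2))."""
    sq_dists = (
        np.sum(A**2, axis=1)[:, None]
        + np.sum(B**2, axis=1)[None, :]
        - 2.0 * A @ B.T
    )
    return np.exp(-sq_dists / (2.0 * sigma**2))

def mmd2_unbiased(X, Y, sigma=1.0):
    """Unbiased estimate of squared MMD between samples X (m x d) and Y (n x d).

    Diagonal terms k(x_i, x_i) and k(y_j, y_j) are excluded, which is what
    makes the estimator unbiased (Gretton et al., 2012, Lemma 6).
    """
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2.0 * Kxy.sum() / (m * n)
```

Training the generator then amounts to minimizing `mmd2_unbiased(data_batch, generator_batch)` with respect to the generator's parameters; in practice this is done with a differentiable implementation (e.g. in an autodiff framework) rather than the NumPy version above. The estimate is near zero when the two samples come from the same distribution and grows as they diverge.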


k-Sample problem based on generalized maximum mean discrepancy

In this paper we deal with the problem of testing for the equality of k p...

Detecting Adversarial Data by Probing Multiple Perturbations Using Expected Perturbation Score

Adversarial detection aims to determine whether a given sample is an adv...

Nonparametric Two-Sample Hypothesis Testing for Random Graphs with Negative and Repeated Eigenvalues

We propose a nonparametric two-sample test statistic for low-rank, condi...

Neural Tangent Kernel Maximum Mean Discrepancy

We present a novel neural network Maximum Mean Discrepancy (MMD) statist...

Exemplar-based synthesis of geology using kernel discrepancies and generative neural networks

We propose a framework for synthesis of geological images based on an ex...

Generative Models and Model Criticism via Optimized Maximum Mean Discrepancy

We propose a method to optimize the representation and distinguishabilit...

AutoML Two-Sample Test

Two-sample tests are important in statistics and machine learning, both ...