Nonparametric Density Estimation under Adversarial Losses

05/22/2018
by Shashank Singh et al.

We study minimax convergence rates of nonparametric density estimation under a large class of loss functions called "adversarial losses", which, besides classical L^p losses, includes maximum mean discrepancy (MMD), Wasserstein distance, and total variation distance. These losses are closely related to the losses encoded by discriminator networks in generative adversarial networks (GANs). In a general framework, we study how the choice of loss and the assumed smoothness of the underlying density together determine the minimax rate. We also discuss implications for training GANs based on deep ReLU networks, and more general connections to learning implicit generative models in a minimax statistical sense.
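Among the adversarial losses mentioned above, maximum mean discrepancy (MMD) is the easiest to illustrate concretely. As a hedged sketch (not from the paper itself), the snippet below computes a biased empirical estimate of squared MMD between two samples; the Gaussian (RBF) kernel and the bandwidth value are illustrative assumptions, not choices made by the authors.

```python
import numpy as np

def gaussian_kernel(x, y, bandwidth=1.0):
    """Gaussian (RBF) kernel matrix between sample arrays x (n, d) and y (m, d).

    The kernel choice is an assumption for illustration; any characteristic
    kernel yields a valid MMD.
    """
    sq_dists = np.sum((x[:, None, :] - y[None, :, :]) ** 2, axis=-1)
    return np.exp(-sq_dists / (2.0 * bandwidth ** 2))

def mmd_squared(x, y, bandwidth=1.0):
    """Biased plug-in estimate of squared MMD between samples x and y."""
    kxx = gaussian_kernel(x, x, bandwidth).mean()
    kyy = gaussian_kernel(y, y, bandwidth).mean()
    kxy = gaussian_kernel(x, y, bandwidth).mean()
    return kxx + kyy - 2.0 * kxy

# Example: samples from two Gaussians with different means.
rng = np.random.default_rng(0)
x = rng.normal(0.0, 1.0, size=(200, 2))
y = rng.normal(2.0, 1.0, size=(200, 2))
print(mmd_squared(x, y))  # clearly positive: distributions differ
print(mmd_squared(x, x))  # ~0: identical samples
```

MMD is an integral probability metric whose discriminator class is the unit ball of an RKHS, which is what connects it to the GAN-style losses studied in the abstract.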


research
04/18/2020

Robust Density Estimation under Besov IPM Losses

We study minimax convergence rates of nonparametric density estimation i...
research
01/30/2021

Rates of convergence for density estimation with GANs

We undertake a precise study of the non-asymptotic properties of vanilla...
research
02/09/2019

Nonparametric Density Estimation under Besov IPM Losses

We study the problem of estimating a nonparametric probability distribut...
research
02/02/2022

Robust Estimation for Nonparametric Families via Generative Adversarial Networks

We provide a general framework for designing Generative Adversarial Netw...
research
10/04/2018

Robust Estimation and Generative Adversarial Nets

Robust estimation under Huber's ϵ-contamination model has become an impo...
research
12/21/2017

How Well Can Generative Adversarial Networks (GAN) Learn Densities: A Nonparametric View

We study in this paper the rate of convergence for learning densities un...
research
01/25/2019

Towards a Deeper Understanding of Adversarial Losses

Recent work has proposed various adversarial losses for training generat...
