Rates of convergence for density estimation with GANs
We undertake a precise study of the non-asymptotic properties of vanilla generative adversarial networks (GANs) and derive theoretical guarantees for the problem of estimating an unknown d-dimensional density p^* under a proper choice of the class of generators and discriminators. We prove that the resulting density estimate converges to p^* in terms of the Jensen-Shannon (JS) divergence at the rate (log n/n)^{2β/(2β+d)}, where n is the sample size and β determines the smoothness of p^*. This is the first result in the literature on density estimation with vanilla GANs that achieves JS rates faster than n^{-1/2} in the regime β > d/2.
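For concreteness, the stated rate can be written as the following bound; this is a sketch in standard minimax notation, where the estimator symbol \hat{p}_n and the constant hidden by \lesssim are illustrative assumptions not spelled out in this abstract:

\mathrm{JS}(\hat{p}_n, p^*) \;\lesssim\; \left( \frac{\log n}{n} \right)^{\frac{2\beta}{2\beta + d}}.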