Adaptation in multivariate log-concave density estimation

12/30/2018
by Oliver Y. Feng et al.

We study the adaptation properties of the multivariate log-concave maximum likelihood estimator over two subclasses of log-concave densities. The first consists of densities with polyhedral support whose logarithms are piecewise affine. The complexity of such a density f can be measured by the sum Γ(f) of the numbers of facets of the subdomains in the polyhedral subdivision of the support induced by f. Given n independent observations from a d-dimensional log-concave density with d ∈ {2, 3}, we prove a sharp oracle inequality, which in particular implies that the Kullback–Leibler risk of the log-concave maximum likelihood estimator for such densities is bounded above by Γ(f)/n, up to a polylogarithmic factor. Thus, the rate can be essentially parametric, even in this multivariate setting. The second type of subclass consists of densities whose contours are well-separated; these new classes are constructed to be affine invariant and turn out to contain a wide variety of densities, including those that satisfy Hölder regularity conditions. Here, we prove another sharp oracle inequality, which reveals in particular that the log-concave maximum likelihood estimator attains a Kullback–Leibler risk bound of order n^{−min((β+3)/(β+7), 4/7)} when d = 3 over the class of β-Hölder log-concave densities with β > 1, again up to a polylogarithmic factor.
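The two risk bounds stated in the abstract can be summarized in LaTeX notation as follows (this is a restatement of the rates above with polylogarithmic factors omitted, not the precise oracle inequalities from the paper):

```latex
% Piecewise-affine log-concave densities with polyhedral support, d \in \{2,3\}:
%   (essentially parametric rate in the complexity parameter \Gamma(f))
\mathbb{E}\, d_{\mathrm{KL}}\bigl(f, \hat{f}_n\bigr)
  \;\lesssim\; \frac{\Gamma(f)}{n}
  \quad \text{(up to polylogarithmic factors).}

% \beta-H\"older log-concave densities with \beta > 1, d = 3:
\mathbb{E}\, d_{\mathrm{KL}}\bigl(f, \hat{f}_n\bigr)
  \;\lesssim\; n^{-\min\left\{\frac{\beta+3}{\beta+7},\, \frac{4}{7}\right\}}
  \quad \text{(up to polylogarithmic factors).}
```

Here the symbol d_{KL} denotes Kullback–Leibler divergence and f̂_n the log-concave maximum likelihood estimator; the exact form of the risk functional is as defined in the paper.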
