On the minimax rate of the Gaussian sequence model under bounded convex constraints

by Matey Neykov, et al.

We determine the exact minimax rate of a Gaussian sequence model under bounded convex constraints, purely in terms of the local geometry of the given constraint set K. Our main result shows that the minimax risk under the squared L_2 loss is given (up to constant factors) by (ϵ^*)^2 ∧ diam(K)^2, where ϵ^* = sup{ϵ : ϵ^2/σ^2 ≤ log M^loc(ϵ)}, log M^loc(ϵ) denotes the local entropy of the set K, and σ^2 is the variance of the noise. We use our abstract result to re-derive known minimax rates for some special sets K, such as hyperrectangles, ellipses, and, more generally, quadratically convex orthosymmetric sets. Finally, we extend our results to the unbounded case with known σ^2, showing that the minimax rate in that case is (ϵ^*)^2.
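As a quick sanity check of how the formula is used (an illustrative example assumed here, not taken from the paper itself): for a Euclidean ball K = B_2(R) in ℝ^d, the standard volumetric packing estimate gives log M^loc(ϵ) ≍ d at every scale ϵ, and the recipe recovers the classical rate σ^2 d ∧ R^2:

```latex
% Illustration under an assumed standard fact: for K = B_2(R) \subset \mathbb{R}^d,
% packing a ball of radius 2\epsilon with \epsilon-separated points gives
% \log M^{\mathrm{loc}}(\epsilon) \asymp d at every scale \epsilon.
\[
  \epsilon^{*}
  \;=\; \sup\bigl\{\epsilon : \epsilon^{2}/\sigma^{2} \le \log M^{\mathrm{loc}}(\epsilon)\bigr\}
  \;\asymp\; \sigma\sqrt{d},
\]
\[
  (\epsilon^{*})^{2} \wedge \operatorname{diam}(K)^{2}
  \;\asymp\; \sigma^{2} d \,\wedge\, R^{2},
\]
% which matches the classical minimax risk for estimating a mean vector
% constrained to a ball of radius R under N(0, \sigma^2 I_d) noise.
```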





Minimax bounds for estimating multivariate Gaussian location mixtures

We prove minimax bounds for estimating Gaussian location mixtures on ℝ^d...

A Tight Excess Risk Bound via a Unified PAC-Bayesian-Rademacher-Shtarkov-MDL Complexity

We present a novel notion of complexity that interpolates between and ge...

Optimal estimation of variance in nonparametric regression with random design

Consider the heteroscedastic nonparametric regression model with random ...

Convex Regression in Multidimensions: Suboptimality of Least Squares Estimators

The least squares estimator (LSE) is shown to be suboptimal in squared e...

Minimax rates for sparse signal detection under correlation

We fully characterize the nonasymptotic minimax separation rate for spar...

Minimax Optimal Bayesian Aggregation

It is generally believed that ensemble approaches, which combine multipl...

Dominating Points of Gaussian Extremes

We quantify the large deviations of Gaussian extreme value statistics on...