Optimal robust mean and location estimation via convex programs with respect to any pseudo-norms
We consider the problem of robust mean and location estimation with respect to any pseudo-norm of the form x ∈ ℝ^d ↦ ||x||_S = sup_{v∈S} <v,x>, where S is any symmetric subset of ℝ^d. We show that the deviation-optimal minimax subgaussian rate at confidence level 1-δ is max( l^*(Σ^{1/2}S)/√N , sup_{v∈S} ||Σ^{1/2}v||_2 √(log(1/δ)/N) ), where l^*(Σ^{1/2}S) is the Gaussian mean width of Σ^{1/2}S and Σ is the covariance of the data (in the benchmark i.i.d. Gaussian case). This improves upon the entropic minimax lower bound of [Lugosi and Mendelson, 2019] and closes the gap, characterized by Sudakov's inequality, between the entropy and the Gaussian mean width for this problem. It shows that the right statistical complexity measure for the mean estimation problem is the Gaussian mean width. We also show that this rate can be achieved by a solution to a convex optimization problem in the adversarial-contamination and L_2 heavy-tailed setups, obtained by minimizing certain Fenchel-Legendre transforms constructed via the median-of-means principle. Finally, we show that this rate can also be achieved in situations where no first moment exists but a location parameter does.
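To make the median-of-means principle mentioned above concrete, the following is a minimal sketch of a coordinate-wise median-of-means mean estimator. It is only an illustration of the blocking-and-median idea: the paper's actual estimator solves a convex program built from Fenchel-Legendre transforms of block means, which this sketch does not implement, and the function name and block count are choices made here for the example.

```python
import numpy as np

def median_of_means(X, n_blocks, seed=0):
    """Coordinate-wise median-of-means estimate of the mean of X.

    Illustrative only: the paper's estimator is a convex program
    built from Fenchel-Legendre transforms of block means, not
    this simple coordinate-wise median.
    """
    N, d = X.shape
    rng = np.random.default_rng(seed)
    # Randomly partition the N samples into n_blocks blocks.
    idx = rng.permutation(N)
    blocks = np.array_split(idx, n_blocks)
    # Empirical mean on each block: adversarial or heavy-tailed
    # samples can corrupt only a minority of the blocks.
    block_means = np.stack([X[b].mean(axis=0) for b in blocks])
    # The median across blocks is insensitive to a minority of
    # corrupted block means, giving robustness.
    return np.median(block_means, axis=0)
```

The key design point is that each corrupted sample can spoil at most one block, so as long as fewer than half the blocks contain outliers, the median of the block means remains close to the true mean even under heavy tails or adversarial contamination.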