Lower bounds for the trade-off between bias and mean absolute deviation
It is a widely observed phenomenon in nonparametric statistics that rate-optimal estimators balance bias and stochastic error. Recent work on overparametrization raises the question of whether rate-optimal estimators exist that do not obey this trade-off. In this work we consider pointwise estimation in the Gaussian white noise model with a β-Hölder smooth regression function f. It is shown that an estimator with worst-case bias ≲ n^{-β/(2β+1)} =: ψ_n must also have a worst-case mean absolute deviation that is lower bounded by ≳ ψ_n. This proves that any estimator achieving the minimax optimal pointwise estimation rate ψ_n must balance worst-case bias and worst-case mean absolute deviation. To derive the result, we establish an abstract inequality relating the change of expectation under two probability measures to the mean absolute deviation.
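In symbols, a sketch of the claimed trade-off reads as follows; the Hölder ball \mathcal{H}(\beta, L), the evaluation point x_0, and the constants c, C > 0 are illustrative notation rather than the paper's exact statement.

% Hedged sketch: if the worst-case bias is at most a constant multiple of
% ψ_n = n^{-β/(2β+1)}, then the worst-case mean absolute deviation (here taken
% about the estimator's own mean) is at least a constant multiple of ψ_n.
\[
  \psi_n := n^{-\beta/(2\beta+1)}, \qquad
  \sup_{f \in \mathcal{H}(\beta,L)} \bigl|\mathbb{E}_f[\hat f(x_0)] - f(x_0)\bigr| \le C\,\psi_n
  \;\Longrightarrow\;
  \sup_{f \in \mathcal{H}(\beta,L)} \mathbb{E}_f\bigl|\hat f(x_0) - \mathbb{E}_f[\hat f(x_0)]\bigr| \ge c\,\psi_n .
\]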