Enhanced Balancing of Bias-Variance Tradeoff in Stochastic Estimation: A Minimax Perspective

02/12/2019
by Henry Lam, et al.

Biased stochastic estimators, such as finite differences for noisy gradient estimation, often contain tuning parameters that must be chosen to balance the impacts of bias and variance. While the optimal order of these parameters in terms of the simulation budget can be readily established, the precise best values depend on model characteristics that are typically unknown in advance. We introduce a framework to construct new classes of estimators, based on judicious combinations of simulation runs on sequences of tuning parameter values, such that the estimators consistently outperform a given tuning parameter choice in the conventional approach, regardless of the unknown model characteristics. We quantify this outperformance via what we call the asymptotic minimax risk ratio, obtained by minimizing the worst-case asymptotic ratio between the mean squared errors of our estimators and the conventional one, where the worst case is taken over all possible values of the unknown model constants. In particular, when the minimax ratio is less than 1, the calibrated estimator is guaranteed to perform better asymptotically. We identify this minimax ratio for general classes of weighted estimators, along with the regimes where the ratio is less than 1. Moreover, we show that the best weighting scheme is characterized by a sum of two components with distinct decay rates. We explain how this structure arises from bias-variance balancing against the adversarial selection of the model constants, which can be analyzed via a tractable reformulation of a non-convex optimization problem.
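To make the setup concrete, the following minimal Python sketch contrasts a conventional finite-difference gradient estimator, which uses a single perturbation size of the canonical order in the simulation budget, with an estimator that combines runs across a sequence of perturbation sizes. The toy objective noisy_f, the geometric sequence of perturbation sizes, and the uniform weights are illustrative assumptions only; the paper derives the weighting scheme that optimizes the asymptotic minimax risk ratio, not the simple average used here.

```python
import numpy as np

rng = np.random.default_rng(0)

def noisy_f(x, sigma=0.1):
    # Toy simulation oracle: a smooth function observed with additive noise.
    return np.sin(x) + sigma * rng.standard_normal()

def fd_gradient(x, h, n_runs):
    # Conventional forward finite-difference estimator with a fixed
    # perturbation size h, averaging n_runs pairs of simulation runs.
    samples = [(noisy_f(x + h) - noisy_f(x)) / h for _ in range(n_runs)]
    return float(np.mean(samples))

def weighted_fd_gradient(x, h_seq, weights, runs_per_h):
    # Illustrative weighted estimator: combine finite-difference estimators
    # computed at a sequence of perturbation sizes h_seq using the given
    # weights (summing to 1). This mimics the idea of mixing tuning-parameter
    # values, but is not the paper's optimal weighting scheme.
    estimates = np.array([fd_gradient(x, h, runs_per_h) for h in h_seq])
    return float(np.dot(weights, estimates))

if __name__ == "__main__":
    x0, budget = 1.0, 1200
    # Conventional choice: a single h of the canonical order budget**(-1/4),
    # up to an unknown constant that the minimax framework hedges against.
    h_star = budget ** (-0.25)
    print("single-h estimate: ", fd_gradient(x0, h_star, budget))

    # Weighted estimator over a geometric sequence of perturbation sizes,
    # with uniform weights purely for illustration.
    h_seq = h_star * np.array([0.5, 1.0, 2.0])
    weights = np.ones(3) / 3
    print("weighted estimate: ", weighted_fd_gradient(x0, h_seq, weights, budget // 3))
    print("true gradient:     ", np.cos(x0))
```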
