Bias-Aware Inference in Regularized Regression Models
We consider inference on a regression coefficient under a constraint on the magnitude of the control coefficients. We show that a class of estimators based on an auxiliary regularized regression of the regressor of interest on the control variables exactly solves a tradeoff between worst-case bias and variance. We derive "bias-aware" confidence intervals (CIs) based on these estimators, which account for possible bias when forming the critical value. We show that these estimators and CIs are near-optimal in finite samples for mean squared error and CI length. Our finite-sample results rest on an idealized setting with normal regression errors of known, homoskedastic variance, and we provide conditions for asymptotic validity when the error distribution is unknown and possibly heteroskedastic. Focusing on the case where the constraint on the magnitude of the control coefficients is an ℓ_p norm bound (p ≥ 1), we derive rates of convergence for optimal estimators and CIs under high-dimensional asymptotics that allow the number of regressors to grow more quickly than the number of observations.
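To make the bias-aware critical value concrete, the sketch below computes the construction standard in the bias-aware CI literature: the 1 − α quantile of |N(b, 1)|, where b bounds the worst-case bias in standard-error units, so the interval estimate ± cv_α(b) · se covers the parameter at level 1 − α under any bias up to the bound. The function names and interface (cv_alpha, bias_aware_ci) are illustrative assumptions, not taken from the paper, and the sketch does not implement the paper's auxiliary regularized regression step.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq


def cv_alpha(b, alpha=0.05):
    """1 - alpha quantile of |N(b, 1)|, the bias-aware critical value
    when the worst-case bias is b standard errors (b >= 0).

    Uses P(|N(b,1)| <= c) = Phi(c - b) - Phi(-c - b) and solves for c.
    """
    z = norm.ppf(1 - alpha / 2)
    f = lambda c: (norm.cdf(c - b) - norm.cdf(-c - b)) - (1 - alpha)
    # f(0) = -(1 - alpha) < 0 and f(z + b + 1) > 0, so the root is bracketed.
    return brentq(f, 0.0, z + b + 1.0)


def bias_aware_ci(estimate, se, worst_case_bias, alpha=0.05):
    """CI that widens the critical value to absorb the worst-case bias
    instead of re-centering the estimate (illustrative helper)."""
    c = cv_alpha(worst_case_bias / se, alpha) * se
    return estimate - c, estimate + c


# Example: with no bias the critical value is the usual 1.96; with a
# worst-case bias of one standard error the interval widens accordingly.
print(round(cv_alpha(0.0), 3))             # ~1.960
print(bias_aware_ci(1.2, 0.4, 0.4))        # wider than 1.2 +/- 1.96 * 0.4
```

Note that cv_α(b) grows roughly like z_{1−α/2} + b for large b, so the interval never pays more than the bias bound itself in added half-length.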