A New Perspective on Debiasing Linear Regressions

04/08/2021
by Yufei Yi, et al.

In this paper, we propose an abstract procedure for debiasing constrained or regularized, potentially high-dimensional linear models. It is elementary to show that the proposed procedure can produce confidence intervals of length O(1/√n) for individual coordinates (or even bounded contrasts) in models with unknown covariance, provided that the covariance has bounded spectrum. While the proof of the statistical guarantees of our procedure is simple, its implementation requires more care due to the complexity of the optimization programs we need to solve. We spend the bulk of this paper giving examples in which the proposed algorithm can be implemented in practice. One fairly general class of instances which is amenable to applications of our procedure is convex constrained least squares. We are able to translate the procedure to an abstract algorithm over this class of models, and we give concrete examples where efficient polynomial-time methods for debiasing exist. These include the constrained version of LASSO, regression under monotone constraints, regression with positive monotone constraints, and non-negative least squares. In addition, we show that our abstract procedure can be applied to efficiently debias SLOPE and square-root SLOPE, among other popular regularized procedures, under certain assumptions. We provide thorough simulation results in support of our theoretical findings.
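To make the setting concrete, the sketch below fits one of the constrained estimators named in the abstract, non-negative least squares, and then applies the classical one-step debiasing correction (in the style of Javanmard and Montanari), not the paper's own procedure, which is only specified abstractly here. All data, dimensions, and the choice of precision-matrix estimate `M` are illustrative assumptions; in a genuinely high-dimensional regime `M` would itself need to be estimated with regularization.

```python
import numpy as np
from scipy.optimize import nnls

# Synthetic low-dimensional example (p << n), so the empirical
# precision matrix can be inverted exactly.
rng = np.random.default_rng(0)
n, p = 200, 5
X = rng.standard_normal((n, p))
beta_true = np.array([1.0, 0.5, 0.0, 0.0, 2.0])  # non-negative truth
y = X @ beta_true + 0.1 * rng.standard_normal(n)

# Constrained estimate: non-negative least squares.
beta_nnls, _ = nnls(X, y)

# One-step debiasing correction:
#   beta_deb = beta_hat + M X^T (y - X beta_hat) / n,
# where M estimates the inverse covariance (precision) of the design.
M = np.linalg.inv(X.T @ X / n)
beta_deb = beta_nnls + M @ X.T @ (y - X @ beta_nnls) / n

# Plug-in standard errors give confidence intervals of width O(1/sqrt(n)).
sigma_hat = np.linalg.norm(y - X @ beta_deb) / np.sqrt(n - p)
ci_halfwidth = 1.96 * sigma_hat * np.sqrt(np.diag(M) / n)
```

Note that with the exact empirical precision matrix, the correction algebraically recovers ordinary least squares; the interesting (high-dimensional, constrained) cases treated in the paper are precisely those where `M` must be approximated and the constraint set matters.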


