Debiasing the Debiased Lasso with Bootstrap

11/09/2017
by Sai Li, et al.

In this paper, we prove that, under proper conditions, the bootstrap can further debias the debiased Lasso estimator for statistical inference on low-dimensional parameters in high-dimensional linear regression. We show that the sample size required for inference with the bootstrapped debiased Lasso, which depends on the number of small coefficients, can be of smaller order than existing requirements for the debiased Lasso. Our results therefore reveal the benefits of having strong signals. The theory is supported by simulation experiments comparing the coverage probabilities and lengths of confidence intervals with and without the bootstrap, and with and without debiasing.
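As a concrete illustration of the kind of procedure the abstract describes, the sketch below combines a debiased Lasso estimate of a single coordinate (in the Zhang–Zhang / van de Geer style, using a nodewise Lasso to form the score direction) with a residual-bootstrap confidence interval. This is not the authors' implementation: the coordinate-descent solver, tuning parameters, and function names are all illustrative assumptions.

```python
import numpy as np

def soft(x, t):
    """Soft-thresholding operator."""
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=100):
    """Lasso via coordinate descent for (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n
    r = y.copy()  # running residual y - Xb
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]            # partial residual excluding column j
            b[j] = soft(X[:, j] @ r / n, lam) / col_ss[j]
            r -= X[:, j] * b[j]
    return b

def debiased_lasso_coord(X, y, j, lam, lam_node):
    """Debiased Lasso estimate of coordinate j via a nodewise-Lasso score."""
    p = X.shape[1]
    b = lasso_cd(X, y, lam)
    idx = [k for k in range(p) if k != j]
    # Regress X_j on the remaining columns to decorrelate the score direction.
    gamma = lasso_cd(X[:, idx], X[:, j], lam_node)
    z = X[:, j] - X[:, idx] @ gamma
    resid = y - X @ b
    return b[j] + z @ resid / (z @ X[:, j]), b

def bootstrap_ci(X, y, j, lam, lam_node, B=100, alpha=0.05, rng=None):
    """Residual-bootstrap (basic) confidence interval for beta_j."""
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    bj, b = debiased_lasso_coord(X, y, j, lam, lam_node)
    resid = y - X @ b
    resid -= resid.mean()  # centre residuals before resampling
    stats = []
    for _ in range(B):
        y_star = X @ b + rng.choice(resid, size=n, replace=True)
        bj_star, _ = debiased_lasso_coord(X, y_star, j, lam, lam_node)
        stats.append(bj_star)
    lo, hi = np.quantile(stats, [alpha / 2, 1 - alpha / 2])
    # Basic bootstrap interval, recentred at the original debiased estimate.
    return bj, (2 * bj - hi, 2 * bj - lo)
```

On a toy design with a strong first coefficient (the "strong signal" setting the abstract highlights), `bootstrap_ci(X, y, 0, lam=0.1, lam_node=0.1)` returns the debiased point estimate together with the interval endpoints; the paper's comparison of coverage and interval length would repeat such a simulation across many replications.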
