Small Tuning Parameter Selection for the Debiased Lasso

08/18/2022
by Akira Shinkyu, et al.

In this study, we investigate the bias and variance properties of the debiased Lasso in linear regression when the tuning parameter of the node-wise Lasso is selected to be smaller than in previous studies. We consider the case where the number of covariates p is bounded by a constant multiple of the sample size n. First, we show that the bias of the debiased Lasso can be reduced without inflating the asymptotic variance by setting the order of the tuning parameter to 1/√(n). This implies that the debiased Lasso is asymptotically normal provided that the number of nonzero coefficients s_0 satisfies s_0 = o(√(n/log p)), whereas previous studies require s_0 = o(√(n)/log p) if no sparsity assumption is imposed on the precision matrix. Second, we propose a data-driven tuning parameter selection procedure for the node-wise Lasso that is consistent with our theoretical results. Simulation studies show that our procedure yields confidence intervals with good coverage properties in various settings. We also present a real economic data example to demonstrate the efficacy of our selection procedure.
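The construction the abstract refers to can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' code: a coordinate-descent Lasso, a node-wise Lasso estimate Θ of the precision matrix (van de Geer et al.-style), and the standard debiasing step b̂_deb = b̂ + Θ Xᵀ(y − X b̂)/n, with the node-wise tuning parameter taken of order 1/√(n) as the paper advocates. All function names and tuning values here are illustrative choices.

```python
import numpy as np

def soft_threshold(z, t):
    """Soft-thresholding operator S(z, t) = sign(z) * max(|z| - t, 0)."""
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso for (1/2n)||y - Xb||^2 + lam * ||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            # Partial residual excluding coordinate j, then univariate update.
            r = y - X @ b + X[:, j] * b[j]
            b[j] = soft_threshold(X[:, j] @ r / n, lam) / col_sq[j]
    return b

def nodewise_theta(X, lam):
    """Approximate precision matrix via node-wise Lasso regressions:
    regress each column X_j on the others with tuning parameter lam."""
    n, p = X.shape
    Theta = np.zeros((p, p))
    for j in range(p):
        idx = [k for k in range(p) if k != j]
        gamma = lasso_cd(X[:, idx], X[:, j], lam)
        resid = X[:, j] - X[:, idx] @ gamma
        tau2 = resid @ resid / n + lam * np.abs(gamma).sum()
        row = np.zeros(p)
        row[j] = 1.0
        row[idx] = -gamma
        Theta[j] = row / tau2
    return Theta

def debiased_lasso(X, y, lam_lasso, lam_node):
    """Debiased Lasso: initial Lasso fit plus a one-step bias correction."""
    n = X.shape[0]
    b = lasso_cd(X, y, lam_lasso)
    Theta = nodewise_theta(X, lam_node)
    return b + Theta @ X.T @ (y - X @ b) / n

# Example: the node-wise tuning parameter is set to order 1/sqrt(n),
# the small choice studied in the paper (the constant 1.0 is arbitrary).
rng = np.random.default_rng(0)
n, p = 200, 10
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[0], beta[1] = 2.0, 1.0          # s_0 = 2 nonzero coefficients
y = X @ beta + 0.1 * rng.standard_normal(n)
b_deb = debiased_lasso(X, y, lam_lasso=0.1, lam_node=1.0 / np.sqrt(n))
```

With well-conditioned Gaussian designs like this one, the correction step removes most of the shrinkage bias of the initial Lasso fit, which is what makes the coordinate-wise estimates usable for normal-approximation confidence intervals.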

