High Dimensional Generalised Penalised Least Squares

07/14/2022
by Ilias Chronopoulos, et al.

In this paper we develop inference for high-dimensional linear models with serially correlated errors. We examine the Lasso under the assumption of strong mixing in the covariates and the error process, allowing for fatter tails in their distributions. Because the Lasso estimator performs poorly in such circumstances, we estimate the parameters of interest via a GLS Lasso and extend the asymptotic properties of the Lasso to these more general conditions. Our theoretical results indicate that the non-asymptotic bounds for stationary dependent processes are sharper, while the rate of the Lasso under general conditions appears slower as T, p → ∞. Further, we employ the debiased Lasso to perform inference uniformly on the parameters of interest. Monte Carlo results support the proposed estimator, which shows significant efficiency gains over traditional methods.
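As a rough illustration of the GLS Lasso idea described in the abstract, the sketch below simulates a sparse regression with serially correlated errors, runs a preliminary Lasso, estimates the error autocorrelation from the residuals, and refits the Lasso on quasi-differenced (whitened) data. The AR(1) error model, the penalty level, and all variable names are illustrative assumptions for this sketch, not the paper's exact procedure.

```python
# Minimal sketch of a feasible GLS Lasso under an assumed AR(1) error
# structure; hypothetical setup, not the authors' exact algorithm.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)

# Simulate a sparse high-dimensional regression with AR(1) errors.
T, p = 200, 50
beta = np.zeros(p)
beta[:3] = [1.0, -0.5, 0.8]          # sparse signal
X = rng.standard_normal((T, p))
rho_true = 0.7
u = np.zeros(T)
for t in range(1, T):                 # serially correlated errors
    u[t] = rho_true * u[t - 1] + rng.standard_normal()
y = X @ beta + u

# Step 1: preliminary Lasso on the raw data.
lasso1 = Lasso(alpha=0.1, fit_intercept=False).fit(X, y)
resid = y - X @ lasso1.coef_

# Step 2: estimate the AR(1) coefficient of the errors from the residuals.
rho_hat = (resid[1:] @ resid[:-1]) / (resid[:-1] @ resid[:-1])

# Step 3: quasi-difference (Cochrane-Orcutt style) and re-run the Lasso
# on the whitened data -- the "GLS Lasso" step.
y_tilde = y[1:] - rho_hat * y[:-1]
X_tilde = X[1:] - rho_hat * X[:-1]
lasso_gls = Lasso(alpha=0.1, fit_intercept=False).fit(X_tilde, y_tilde)

print("estimated rho:", round(float(rho_hat), 3))
print("nonzero GLS Lasso coefficients:", np.nonzero(lasso_gls.coef_)[0])
```

In this sketch the whitening step removes the first-order serial correlation before penalised estimation; the paper's theory concerns the properties of such a two-step estimator when both T and p grow.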


