
Asymptotic Confidence Regions for High-dimensional Structured Sparsity

by Benjamin Stucky, et al.

In the setting of high-dimensional linear regression models, we propose two frameworks for constructing pointwise and group confidence sets for penalized estimators that incorporate prior knowledge about the organization of the non-zero coefficients. This is done by desparsifying the estimator as in van de Geer et al. [18] and van de Geer and Stucky [17], and then applying an appropriate estimator of the precision matrix Θ. To estimate the precision matrix, a corresponding structured matrix-norm penalty is introduced. After normalization, the result is an asymptotic pivot. The asymptotic behavior is studied, and simulations are presented to compare the two schemes.
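The desparsification step described above can be sketched numerically. The following is a minimal illustration, not the paper's method: it uses a plain nodewise Lasso for the precision matrix Θ rather than the structured matrix-norm estimator the paper proposes, a simple coordinate-descent Lasso, and hypothetical tuning parameters `lam` and `lam_node`.

```python
import numpy as np

def soft_threshold(z, t):
    return np.sign(z) * np.maximum(np.abs(z) - t, 0.0)

def lasso_cd(X, y, lam, n_iter=200):
    """Coordinate-descent Lasso: argmin (1/2n)||y - Xb||^2 + lam*||b||_1."""
    n, p = X.shape
    b = np.zeros(p)
    col_sq = (X ** 2).sum(axis=0) / n
    r = y.copy()  # running residual y - X b
    for _ in range(n_iter):
        for j in range(p):
            r += X[:, j] * b[j]                 # partial residual without j
            rho = X[:, j] @ r / n
            b[j] = soft_threshold(rho, lam) / col_sq[j]
            r -= X[:, j] * b[j]
    return b

def desparsified_ci(X, y, lam, lam_node, z=1.96):
    """Pointwise 95% intervals from the de-sparsified Lasso."""
    n, p = X.shape
    beta = lasso_cd(X, y, lam)
    Sigma = X.T @ X / n
    # Nodewise Lasso approximation of the rows of Theta (simplified:
    # tau^2 omits the usual l1 correction term).
    Theta = np.zeros((p, p))
    for j in range(p):
        idx = [k for k in range(p) if k != j]
        gamma = lasso_cd(X[:, idx], X[:, j], lam_node)
        tau2 = Sigma[j, j] - Sigma[j, idx] @ gamma
        row = np.zeros(p)
        row[j] = 1.0
        row[idx] = -gamma
        Theta[j] = row / tau2
    # De-sparsified estimator: b = beta + Theta X^T (y - X beta) / n.
    resid = y - X @ beta
    b = beta + Theta @ X.T @ resid / n
    sigma = np.sqrt(resid @ resid / n)          # crude noise-level estimate
    se = sigma * np.sqrt(np.diag(Theta @ Sigma @ Theta.T) / n)
    return b - z * se, b + z * se

# Toy example with 3 active coefficients.
rng = np.random.default_rng(0)
n, p = 100, 20
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)
lo, hi = desparsified_ci(X, y, lam=0.1, lam_node=0.1)
```

After the bias correction, each normalized coordinate is asymptotically Gaussian, which is what makes the intervals above valid pointwise; the paper's structured penalty on Θ would replace the nodewise step when group structure is known.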



High-dimensional Precision Matrix Estimation with a Known Graphical Structure

A precision matrix is the inverse of a covariance matrix. In this paper,...

High dimensional precision matrix estimation under weak sparsity

In this paper, we estimate the high dimensional precision matrix under t...

Asymptotic Risk of Least Squares Minimum Norm Estimator under the Spike Covariance Model

One of the recent approaches to explain good performance of neural netwo...

Statistical inference for high dimensional regression via Constrained Lasso

In this paper, we propose a new method for estimation and constructing c...

Bootstrap Confidence Regions for Learned Feature Embeddings

Algorithmic feature learners provide high-dimensional vector representat...

Inference for High-dimensional Maximin Effects in Heterogeneous Regression Models Using a Sampling Approach

Heterogeneity is an important feature of modern data sets and a central ...

A Note on High-Dimensional Confidence Regions

Recent advances in statistics introduced versions of the central limit t...