Asymptotic Confidence Regions for High-dimensional Structured Sparsity

06/28/2017
by Benjamin Stucky, et al.

In the setting of high-dimensional linear regression models, we propose two frameworks for constructing pointwise and group confidence sets for penalized estimators that incorporate prior knowledge about the organization of the non-zero coefficients. This is done by desparsifying the estimator as in van de Geer et al. [18] and van de Geer and Stucky [17], and then using an appropriate estimator of the precision matrix Θ. Estimating the precision matrix requires introducing a corresponding structured matrix-norm penalty. After normalization, the resulting quantity is an asymptotic pivot. We study its asymptotic behavior and provide simulations that illustrate the differences between the two schemes.
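
As a rough illustration of the de-sparsifying step the abstract refers to (van de Geer et al. [18]), the sketch below constructs pointwise confidence intervals from a debiased Lasso in Python. It is a minimal sketch under simplifying assumptions: the precision matrix Θ is estimated with a plain nodewise Lasso rather than the structured matrix-norm penalty proposed here, and the simulated data, the tuning parameter lam, and the naive noise estimate sigma_hat are illustrative choices, not the authors'.

```python
# Minimal sketch of the de-sparsified Lasso pivot (van de Geer et al. [18]).
# Assumption: Theta is estimated by a plain nodewise Lasso, not by the
# structured matrix-norm penalty introduced in this paper.
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p, s = 100, 200, 5
X = rng.standard_normal((n, p))
beta0 = np.zeros(p)
beta0[:s] = 2.0
y = X @ beta0 + rng.standard_normal(n)

# Step 1: initial Lasso estimate of beta.
lam = 2.0 * np.sqrt(np.log(p) / n)            # illustrative tuning choice
beta_hat = Lasso(alpha=lam, fit_intercept=False).fit(X, y).coef_

# Step 2: nodewise Lasso; row j of Theta_hat approximates row j of Sigma^{-1}.
Sigma_hat = X.T @ X / n
Theta_hat = np.zeros((p, p))
for j in range(p):
    idx = np.delete(np.arange(p), j)
    gamma = Lasso(alpha=lam, fit_intercept=False).fit(X[:, idx], X[:, j]).coef_
    tau2 = Sigma_hat[j, j] - Sigma_hat[j, idx] @ gamma
    row = np.zeros(p)
    row[j] = 1.0
    row[idx] = -gamma
    Theta_hat[j] = row / tau2

# Step 3: de-sparsify, normalize, and read off pointwise confidence intervals.
b_hat = beta_hat + Theta_hat @ X.T @ (y - X @ beta_hat) / n
resid = y - X @ beta_hat
sigma_hat = np.sqrt(resid @ resid / max(n - np.count_nonzero(beta_hat), 1))
se = sigma_hat * np.sqrt(np.diag(Theta_hat @ Sigma_hat @ Theta_hat.T) / n)
z = norm.ppf(0.975)
ci_lower, ci_upper = b_hat - z * se, b_hat + z * se
print("empirical coverage of beta0:", np.mean((beta0 >= ci_lower) & (beta0 <= ci_upper)))
```

In the group framework described above, Θ would instead be estimated with a structured matrix-norm penalty matching the assumed organization of the non-zero coefficients, and the pivot would be formed jointly for a group of coefficients rather than one coordinate at a time.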

Related research:

06/28/2021
High-dimensional Precision Matrix Estimation with a Known Graphical Structure
A precision matrix is the inverse of a covariance matrix. In this paper,...

07/07/2021
High dimensional precision matrix estimation under weak sparsity
In this paper, we estimate the high dimensional precision matrix under t...

12/31/2019
Asymptotic Risk of Least Squares Minimum Norm Estimator under the Spike Covariance Model
One of the recent approaches to explain good performance of neural netwo...

04/17/2017
Statistical inference for high dimensional regression via Constrained Lasso
In this paper, we propose a new method for estimation and constructing c...

02/01/2022
Bootstrap Confidence Regions for Learned Feature Embeddings
Algorithmic feature learners provide high-dimensional vector representat...

11/15/2020
Inference for High-dimensional Maximin Effects in Heterogeneous Regression Models Using a Sampling Approach
Heterogeneity is an important feature of modern data sets and a central ...

05/19/2021
A Note on High-Dimensional Confidence Regions
Recent advances in statistics introduced versions of the central limit t...