Method of Contraction-Expansion (MOCE) for Simultaneous Inference in Linear Models

08/04/2019
by   Fei Wang, et al.

Simultaneous inference after model selection is of critical importance for addressing scientific hypotheses that involve a set of parameters. In this paper, we consider a high-dimensional linear regression model in which a regularization procedure such as the LASSO is applied to yield a sparse model. To establish simultaneous post-model-selection inference, we propose a method of contraction and expansion (MOCE) along the lines of debiased estimation, which allows us to balance the bias-variance trade-off so that the super-sparsity assumption may be relaxed. We establish key theoretical results for the proposed MOCE procedure, from which the expanded model can be selected with theoretical guarantees and simultaneous confidence regions can be constructed from the joint asymptotic normal distribution. In comparison with existing methods, the proposed method exhibits stable and reliable coverage at the nominal significance level with substantially less computational burden, making it trustworthy for application to real-world problems.
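
To give a concrete sense of the general post-LASSO debiasing workflow that MOCE builds on, the sketch below fits a LASSO estimate ("contraction" to a sparse model) and applies a one-step debiasing correction to obtain coordinate-wise 95% confidence intervals. This is a generic debiased-LASSO illustration under an assumed Gaussian design, not the MOCE contraction-expansion procedure described in the paper; the ridge-regularized precision-matrix surrogate and all tuning constants are simplifying assumptions chosen for brevity.

```python
# Generic debiased-LASSO sketch (illustrative only; NOT the MOCE procedure).
import numpy as np
from scipy.stats import norm
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, p = 200, 500
X = rng.standard_normal((n, p))          # assumed Gaussian design
beta_true = np.zeros(p)
beta_true[:5] = 1.0                      # sparse signal in the first 5 coordinates
y = X @ beta_true + rng.standard_normal(n)

# Step 1: LASSO "contraction" to a sparse initial estimate (tuning assumed).
beta_hat = Lasso(alpha=0.1, fit_intercept=False).fit(X, y).coef_

# Step 2: crude surrogate for the precision matrix via a ridge-regularized
# inverse of the sample covariance; a full treatment would use nodewise LASSO.
Sigma_hat = X.T @ X / n
Theta_hat = np.linalg.inv(Sigma_hat + 0.1 * np.eye(p))

# Step 3: one-step debiasing correction and per-coordinate 95% intervals
# based on the asymptotic normal approximation.
resid = y - X @ beta_hat
beta_deb = beta_hat + Theta_hat @ X.T @ resid / n
sigma2_hat = resid @ resid / n
se = np.sqrt(sigma2_hat * np.diag(Theta_hat @ Sigma_hat @ Theta_hat.T) / n)
z = norm.ppf(0.975)
ci = np.column_stack([beta_deb - z * se, beta_deb + z * se])
print(ci[:5])                            # intervals for the first 5 coefficients
```

Simultaneous (rather than coordinate-wise) regions, as constructed by MOCE, would instead calibrate the critical value against the joint asymptotic distribution of the debiased coordinates.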
