Inference in High-dimensional Linear Regression
We develop an approach to inference in a linear regression model when the number of potential explanatory variables is larger than the sample size. Our approach treats each regression coefficient in turn as the interest parameter, the remaining coefficients being nuisance parameters, and seeks an optimal interest-respecting transformation. The role of this transformation is to allow a marginal least squares analysis for each variable, as in a factorial experiment. One parameterization of the problem is found to be particularly convenient, both computationally and mathematically. In particular, it permits an analytic solution to the optimal transformation problem, facilitating comparison with other work. In contrast to regularized regression such as the lasso (Tibshirani, 1996) and its extensions, neither adjustment for selection nor rescaling of the explanatory variables is needed, ensuring that the physical interpretation of regression coefficients is retained. We discuss the use of such confidence intervals as part of a broader set of inferential statements, so as to reflect uncertainty over the model as well as over the parameters. The considerations involved in extending the work to other regression models are briefly discussed.
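The abstract leaves the form of the interest-respecting transformation to the paper itself, but the mechanism by which a linear transformation enables a marginal least squares analysis of a single coefficient can be sketched in standard notation. The symbols below (the model y = Xβ + ε and the direction z_j) are illustrative assumptions, not notation taken from the abstract; this is a minimal sketch of the general idea, not the paper's derivation.

```latex
% Hedged sketch: marginal least squares for the interest parameter beta_j
% via a linear transformation z_j; notation is assumed, not the paper's.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}
With $y = X\beta + \varepsilon$, $X \in \mathbb{R}^{n \times p}$ and $p > n$,
projecting along a direction $z_j$ chosen for the interest parameter
$\beta_j$ gives the marginal estimate
\[
  \hat{\beta}_j \;=\; \frac{z_j^{\top} y}{z_j^{\top} x_j}
  \;=\; \beta_j
  \;+\; \underbrace{\sum_{k \neq j} \beta_k\,
        \frac{z_j^{\top} x_k}{z_j^{\top} x_j}}_{\text{nuisance contribution}}
  \;+\; \underbrace{\frac{z_j^{\top} \varepsilon}
        {z_j^{\top} x_j}}_{\text{noise}} .
\]
\end{document}
```

The nuisance contribution vanishes exactly when z_j is orthogonal to every other column x_k, which is generally unattainable when p > n; hence the need for an optimal compromise, which is what the optimal transformation problem mentioned above formalizes. To the extent that the orthogonality holds approximately, the noise term supplies an approximately normal pivot from which a confidence interval for β_j follows, without rescaling the explanatory variables.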