Ridge Regression Revisited: Debiasing, Thresholding and Bootstrap

09/17/2020
by Yunyi Zhang, et al.

In the high dimensional setting, the classical ridge regression method cannot perform model selection on its own and introduces a large bias, which makes it an unsatisfactory tool for analyzing high dimensional linear models. In this paper, we propose a debiased and thresholded ridge regression method that resolves these drawbacks. Moreover, focusing on statistical inference and prediction for linear combinations of the parameters, we provide a normal approximation theorem for the estimator and propose two bootstrap algorithms that construct joint confidence regions and prediction regions for the linear combinations. In the statistical inference part, we allow not only the dimension of the parameters but also the number of linear combinations to grow as the sample size increases. Numerical experiments show that the proposed regression method is robust to fluctuations in the ridge parameter and reduces estimation error compared to the classical and threshold ridge regression methods. Beyond its theoretical interest, the proposed algorithms can be applied in disciplines such as econometrics and biology.
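To make the idea concrete, the sketch below shows a generic debiased and thresholded ridge fit: an ordinary ridge estimate, a one-step correction that adds back part of the shrinkage bias, and a hard threshold that zeroes out small coefficients so the estimator also selects a model. This is a minimal illustration assuming standard choices of the penalty `lam` and threshold `tau`; it is not the paper's exact estimator or tuning rule.

```python
# Minimal sketch (not the paper's exact method): ridge fit, one-step bias
# correction, and hard-thresholding of small coefficients.
import numpy as np

def ridge_debiased_thresholded(X, y, lam=1.0, tau=0.1):
    """Illustrative estimator; lam (ridge penalty) and tau (threshold) are ad hoc."""
    n, p = X.shape
    G = X.T @ X + lam * np.eye(p)               # regularized Gram matrix
    beta_ridge = np.linalg.solve(G, X.T @ y)    # classical ridge estimate
    # One-step correction that adds back part of the ridge shrinkage bias.
    beta_debiased = beta_ridge + np.linalg.solve(G, X.T @ (y - X @ beta_ridge))
    # Hard-threshold small coefficients to obtain a sparse, model-selecting estimate.
    return np.where(np.abs(beta_debiased) > tau, beta_debiased, 0.0)

# Toy usage with a sparse true coefficient vector.
rng = np.random.default_rng(0)
n, p = 100, 50
X = rng.standard_normal((n, p))
beta_true = np.zeros(p)
beta_true[:3] = [2.0, -1.5, 1.0]
y = X @ beta_true + 0.5 * rng.standard_normal(n)
print(ridge_debiased_thresholded(X, y, lam=1.0, tau=0.3)[:5])
```

The bootstrap algorithms for joint confidence and prediction regions would resample residuals around such a fit; their exact form is given in the full paper.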
