Optimal Subsampling for Large Sample Ridge Regression

04/10/2022
by Yunlu Chen, et al.

Subsampling is a popular approach to alleviating the computational burden of analyzing massive datasets. Recent efforts have been devoted to various statistical models without explicit regularization. In this paper, we develop an efficient subsampling procedure for large-sample linear ridge regression. In contrast to the ordinary least squares estimator, the ridge penalty introduces a subtle trade-off between bias and variance. We first investigate the asymptotic properties of the subsampling estimator and then derive subsampling probabilities that minimize an asymptotic mean-squared-error criterion. The resulting optimal probabilities involve both the ridge leverage score and the L2 norm of each predictor. To further reduce the cost of computing exact ridge leverage scores, we propose an algorithm based on an efficient approximation. Experiments on synthetic and real datasets show that the algorithm is both statistically accurate and computationally efficient compared with existing subsampling-based methods.
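The ingredients of such a procedure can be sketched in a few lines of NumPy. The ridge leverage score of observation i is the standard quantity h_i(λ) = x_iᵀ(XᵀX + λI)⁻¹x_i; the sketch below uses these scores alone as illustrative sampling probabilities, whereas the paper's optimal probabilities additionally involve the L2 norms of the predictors (and `subsample_ridge` is a hypothetical helper, not the authors' implementation):

```python
import numpy as np

def ridge_leverage_scores(X, lam):
    """Exact ridge leverage scores h_i = x_i^T (X^T X + lam*I)^{-1} x_i."""
    G = X.T @ X + lam * np.eye(X.shape[1])
    # Solve a linear system rather than forming the inverse explicitly.
    return np.einsum('ij,ji->i', X, np.linalg.solve(G, X.T))

def subsample_ridge(X, y, lam, r, rng=None):
    """Weighted ridge fit on r rows drawn with leverage-based probabilities.

    Illustrative sketch only: probabilities proportional to ridge leverage
    scores, with inverse-probability weights to correct the induced bias.
    """
    rng = np.random.default_rng(rng)
    n = X.shape[0]
    h = ridge_leverage_scores(X, lam)
    p = h / h.sum()
    idx = rng.choice(n, size=r, replace=True, p=p)
    # Reweight each sampled row by 1/(r * p_i), folded in as sqrt factors.
    w = np.sqrt(1.0 / (r * p[idx]))
    Xs, ys = X[idx] * w[:, None], y[idx] * w
    G = Xs.T @ Xs + lam * np.eye(X.shape[1])
    return np.linalg.solve(G, Xs.T @ ys)
```

Ridge leverage scores lie in (0, 1) and sum to the trace of the ridge hat matrix, which is at most the number of predictors; this is what makes them usable, after normalization, as sampling probabilities.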
