
Fast Penalized Regression and Cross Validation for Tall Data with the oem Package

by Jared D. Huling, et al.

A large body of research has focused on theory and computation for variable selection techniques for high-dimensional data. There has been substantially less work in the big "tall" data paradigm, where the number of variables may be large but the number of observations is much larger. The orthogonalizing expectation maximization (OEM) algorithm is one approach for the computation of penalized models that excels in the big tall data regime. The oem package is an efficient implementation of the OEM algorithm that provides a multitude of computation routines with a focus on big tall data, including a function for out-of-memory computation and routines for large-scale parallel computation of penalized regression models. Furthermore, in this paper we propose a specialized implementation of the OEM algorithm for cross validation, dramatically reducing the computing time for cross validation over a naive implementation.
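Based on the abstract's description, the sketch below illustrates how such a package might be driven from R. The function names `oem()` and `xval.oem()` and their arguments are assumptions drawn from the package's documented interface, not from this excerpt, and the data are simulated purely for illustration:

```r
# Minimal sketch of tall-data penalized regression with the oem package.
# Assumes install.packages("oem") has been run; function names and
# arguments are taken from the package interface, not verified here.
library(oem)

set.seed(123)
n <- 10000   # "tall" data: many observations
p <- 50      # moderate number of variables
x <- matrix(rnorm(n * p), n, p)
beta <- c(rep(1, 5), rep(0, p - 5))
y <- drop(x %*% beta) + rnorm(n)

# Fit several penalties in one call; OEM can reuse the one-time
# orthogonalization step across penalties and tuning parameters.
fit <- oem(x = x, y = y, penalty = c("lasso", "mcp"))

# Cross validation via the specialized implementation the paper proposes,
# which avoids refitting from scratch for every fold.
cvfit <- xval.oem(x = x, y = y, penalty = "lasso", nfolds = 10)
```

For data too large to hold in memory, the abstract's out-of-memory routine would instead be pointed at a file-backed matrix (e.g. a `bigmemory::big.matrix`), which is the scenario the package targets with its big tall data focus.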

