Fast Penalized Regression and Cross Validation for Tall Data with the oem Package

01/29/2018
by Jared D. Huling, et al.

A large body of research has focused on theory and computation for variable selection techniques for high-dimensional data. There has been substantially less work in the big tall data paradigm, where the number of variables may be large but the number of observations is much larger. The orthogonalizing expectation maximization (OEM) algorithm is one approach for the computation of penalized models that excels in the big tall data regime. The oem package is an efficient implementation of the OEM algorithm that provides a multitude of computation routines with a focus on big tall data, such as functions for out-of-memory computation and for large-scale parallel computation of penalized regression models. Furthermore, in this paper we propose a specialized implementation of the OEM algorithm for cross validation, dramatically reducing the computing time for cross validation over a naive implementation.
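As a concrete illustration of the workflow the abstract describes, the sketch below uses the CRAN `oem` package's fitting function `oem()` and its fast cross-validation routine `xval.oem()`. This is a minimal usage sketch, not code from the paper; the simulated data dimensions and penalty choices are assumptions made for illustration.

```r
## Minimal sketch (assumed example data, not from the paper):
## oem() fits multiple penalties in one pass over tall data;
## xval.oem() is the package's specialized fast cross-validation routine.
library(oem)

set.seed(123)
n <- 100000   # many observations (the "big tall" regime)
p <- 50       # comparatively few variables
x <- matrix(rnorm(n * p), n, p)
beta <- c(rep(1, 5), rep(0, p - 5))
y <- drop(x %*% beta) + rnorm(n)

# Fit lasso and MCP penalized models simultaneously
fit <- oem(x = x, y = y, penalty = c("lasso", "mcp"))

# Fast cross validation via the specialized OEM implementation
xval <- xval.oem(x = x, y = y, penalty = "lasso", nfolds = 10)
```

For data too large to hold in memory, the package also provides `big.oem()`, which accepts a file-backed `big.matrix` from the bigmemory package in place of an ordinary matrix.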


