Variable selection with multiply-imputed datasets: choosing between stacked and grouped methods

03/16/2020
by Jiacong Du, et al.

Penalized regression methods, such as lasso and elastic net, are used in many biomedical applications when simultaneous regression coefficient estimation and variable selection are desired. However, missing data complicates the implementation of these methods, particularly when missingness is handled using multiple imputation. Applying a variable selection algorithm to each imputed dataset will likely lead to different sets of selected predictors, making it difficult to ascertain a final active set without resorting to ad hoc combination rules. In this paper, we consider a general class of penalized objective functions which, by construction, force selection of the same variables across multiply-imputed datasets. By pooling objective functions across imputations, optimization is then performed jointly over all imputed datasets rather than separately for each dataset. We consider two objective function formulations that exist in the literature, which we will refer to as "stacked" and "grouped" objective functions. Building on existing work, we (a) derive and implement efficient cyclic coordinate descent and majorization-minimization optimization algorithms for both continuous and binary outcome data, (b) incorporate adaptive shrinkage penalties, (c) compare these methods through simulation, and (d) develop the R package miselect for easy implementation. Simulations demonstrate that the "stacked" objective function approaches tend to be more computationally efficient and have better estimation and selection properties. We apply these methods to data from the University of Michigan ALS Patients Repository (UMAPR), which aims to identify associations between persistent organic pollutants and ALS risk.
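To make the pooled-objective idea concrete, below is a minimal sketch of the stacked formulation for a continuous outcome: the D imputed design matrices are stacked row-wise, each copy of observation i receives a weight (here, an observation-level weight divided by D), and a single lasso fit produces one coefficient vector, so the same variables are selected for every imputation. The function name, the weighting scheme, and the use of scikit-learn's Lasso are illustrative assumptions for this sketch, not the miselect implementation.

```python
# Sketch of the "stacked" approach: stack the D imputed datasets, weight each
# copy of an observation, and fit one lasso so a single coefficient vector
# (and hence a single active set) is shared across imputations.
# Names and the weighting scheme are illustrative assumptions.
import numpy as np
from sklearn.linear_model import Lasso

def stacked_lasso(imputed_X, y, obs_weights, alpha=0.1):
    """Fit one lasso on the row-wise stack of D imputed design matrices.

    imputed_X   : list of D (n x p) arrays, one per imputation
    y           : (n,) outcome vector (fully observed)
    obs_weights : (n,) observation-level weights, e.g. fraction of observed entries per row
    alpha       : lasso penalty strength
    """
    D = len(imputed_X)
    X_stack = np.vstack(imputed_X)           # (n*D) x p stacked design matrix
    y_stack = np.tile(y, D)                  # outcome repeated for each imputation
    w_stack = np.tile(obs_weights / D, D)    # each copy of observation i gets w_i / D

    model = Lasso(alpha=alpha, fit_intercept=True)
    model.fit(X_stack, y_stack, sample_weight=w_stack)
    return model.coef_                       # one shared coefficient vector

# Toy usage: 3 imputations of a 50 x 10 design matrix
rng = np.random.default_rng(0)
y = rng.normal(size=50)
imputations = [rng.normal(size=(50, 10)) for _ in range(3)]
weights = np.ones(50)
coef = stacked_lasso(imputations, y, weights)
print("selected variables:", np.flatnonzero(coef))
```

The grouped formulation instead gives each imputation its own coefficient vector and applies a group-lasso-type penalty to each coefficient across imputations, which likewise forces a single shared active set.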
