Group selection and shrinkage with application to sparse semiparametric modeling

05/25/2021
by Ryan Thompson, et al.

Sparse regression and classification estimators capable of group selection have application to an assortment of statistical problems, from multitask learning to sparse additive modeling to hierarchical selection. This work introduces a class of group-sparse estimators that combine group subset selection with group lasso or ridge shrinkage. We develop an optimization framework for fitting the nonconvex regularization surface and present finite-sample error bounds for estimation of the regression function. Our methods and analyses accommodate the general setting where groups overlap. As an application of group selection, we study sparse semiparametric modeling, a procedure that allows the effect of each predictor to be zero, linear, or nonlinear. For this task, the new estimators improve across several metrics on synthetic data compared to alternatives. Finally, we demonstrate their efficacy in modeling supermarket foot traffic and economic recessions using many predictors. All of our proposals are made available in the scalable implementation grpsel.
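To fix ideas, the display below sketches the general form of a least-squares criterion that couples group subset selection with group lasso or ridge shrinkage. It is an illustrative template only: the paper's exact objective, penalty weights, and treatment of overlapping groups may differ, and the symbols λ, γ, and G_k are generic notation rather than the authors' own.

\[
\hat{\beta} \in \operatorname*{arg\,min}_{\beta}\;
\frac{1}{2}\,\lVert y - X\beta \rVert_2^2
\;+\; \lambda \sum_{k=1}^{g} \mathbf{1}\{\beta_{G_k} \neq 0\}
\;+\; \gamma \sum_{k=1}^{g} \lVert \beta_{G_k} \rVert_2^{q},
\]

where \(\beta_{G_k}\) collects the coefficients in group \(G_k\), the indicator term counts the selected groups (group subset selection), and the final term applies group lasso shrinkage when \(q = 1\) or ridge-type shrinkage when \(q = 2\). Loosely, the sparse semiparametric application can be pictured by assigning each predictor one group for its linear term and another for its nonlinear basis expansion, so that selecting neither, one, or both groups yields a zero, linear, or nonlinear effect; the paper's precise construction should be consulted for details.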

Related research

09/10/2012
Group descent algorithms for nonconvex penalized linear and logistic regression models with grouped predictors
Penalized regression is an attractive framework for variable selection p...

07/10/2021
Cluster Regularization via a Hierarchical Feature Regression
Prediction tasks with high-dimensional nonorthogonal predictor sets pose...

12/13/2018
Split regression modeling
In this note we study the benefits of splitting variables for ...

02/18/2014
Classification with Sparse Overlapping Groups
Classification with a sparsity constraint on the solution plays a centra...

08/10/2017
Subset Selection with Shrinkage: Sparse Linear Modeling when the SNR is low
We study the behavior of a fundamental tool in sparse statistical modeli...

11/20/2013
Sparse Overlapping Sets Lasso for Multitask Learning and its Application to fMRI Analysis
Multitask learning can be effective when features useful in one task are...

12/05/2015
Hierarchical Sparse Modeling: A Choice of Two Group Lasso Formulations
Demanding sparsity in estimated models has become a routine practice in ...
