Hierarchical Sparse Modeling: A Choice of Two Group Lasso Formulations

12/05/2015
by Xiaohan Yan, et al.

Demanding sparsity in estimated models has become a routine practice in statistics. In many situations, we wish to require that the sparsity patterns attained honor certain problem-specific constraints. Hierarchical sparse modeling (HSM) refers to situations in which these constraints specify that one set of parameters be set to zero whenever another is set to zero. In recent years, numerous papers have developed convex regularizers for this form of sparsity structure, which arises in many areas of statistics including interaction modeling, time series analysis, and covariance estimation. In this paper, we observe that these methods fall into two frameworks, the group lasso (GL) and the latent overlapping group lasso (LOG), which have not been systematically compared in the context of HSM. The purpose of this paper is to provide a side-by-side comparison of these two frameworks for HSM in terms of their statistical properties and computational efficiency. We call special attention to GL's more aggressive shrinkage of parameters deep in the hierarchy, a property not shared by LOG. In terms of computation, we introduce a finite-step algorithm that exactly computes the proximal operator of LOG for a certain simple HSM structure; we later exploit this to develop a novel path-based block coordinate descent scheme for general HSM structures. Both algorithms greatly improve the computational performance of LOG. Finally, we compare the two methods in the context of covariance estimation, where we introduce a new sparsely banded estimator using LOG, which we show achieves the statistical advantages of an existing GL-based method but is simpler to express and more efficient to compute.
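As a rough illustration of the GL construction (a minimal sketch, not the authors' code), consider the simplest hierarchy: a parent coefficient and a child coefficient, with the constraint that the child is zero whenever the parent is zero. Using nested groups {parent, child} and {child}, the proximal operator of the resulting GL penalty can be evaluated by block soft-thresholding applied from the leaf group to the root group, a known property of tree-structured group norms. The numpy sketch below assumes this two-parameter chain; names such as prox_gl_chain are illustrative only.

    import numpy as np

    def group_soft_threshold(beta, idx, lam):
        # Block soft-thresholding of the sub-vector beta[idx]:
        # shrink its Euclidean norm by lam, setting it to zero if the norm <= lam.
        out = beta.copy()
        norm = np.linalg.norm(out[idx])
        if norm <= lam:
            out[idx] = 0.0
        else:
            out[idx] *= 1.0 - lam / norm
        return out

    def prox_gl_chain(beta, lam):
        # Proximal operator of lam * ( |beta[1]| + ||beta[[0, 1]]||_2 ) for a
        # two-level chain (index 0 = parent, index 1 = child).  For nested,
        # tree-structured groups, the prox is the composition of per-group
        # prox operators applied from the leaf group up to the root group.
        out = group_soft_threshold(beta, [1], lam)      # leaf group {child}
        out = group_soft_threshold(out, [0, 1], lam)    # root group {parent, child}
        return out

    # The child can be set exactly to zero while the parent is merely shrunk,
    # whereas zeroing the root group removes both coefficients at once.
    print(prox_gl_chain(np.array([3.0, 0.5]), 1.0))   # [2.0, 0.0]: child zeroed, parent kept
    print(prox_gl_chain(np.array([0.3, 0.2]), 1.0))   # [0.0, 0.0]: both zeroed

Note that the child coefficient appears in both groups, which is the source of the more aggressive shrinkage of deep parameters under GL mentioned above. The LOG formulation instead builds each group from a node and its ancestors and decomposes the coefficient vector into latent vectors supported on those groups, so the selected support is a union of ancestor-closed sets; its proximal operator lacks such a simple leaf-to-root recursion in general, which is what the paper's finite-step and path-based algorithms address.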
