Structured Sparsity and Generalization

08/17/2011
by Andreas Maurer, et al.

We present a data-dependent generalization bound for a large class of regularized algorithms which implement structured sparsity constraints. The bound can be applied to standard squared-norm regularization, the Lasso, the group Lasso, some versions of the group Lasso with overlapping groups, multiple kernel learning, and other regularization schemes. In all these cases competitive results are obtained. A novel feature of our bound is that it can be applied in an infinite-dimensional setting, such as the Lasso in a separable Hilbert space or multiple kernel learning with a countable number of kernels.
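To make the regularizers named above concrete, here is a minimal sketch of the group Lasso penalty (the sum of Euclidean norms of coefficient sub-vectors, one per group of indices). The function name and the example weights and groups are illustrative, not from the paper; with singleton groups the penalty reduces to the ordinary Lasso's ℓ_1 norm.

```python
import numpy as np

def group_lasso_penalty(w, groups):
    """Sum of Euclidean norms of the sub-vectors w[g],
    one per (possibly overlapping) group g of indices."""
    return sum(np.linalg.norm(w[g]) for g in groups)

w = np.array([3.0, 4.0, 0.0, 0.0, 5.0])

# Disjoint groups: norm([3,4]) + norm([0,0]) + norm([5]) = 5 + 0 + 5
print(group_lasso_penalty(w, [[0, 1], [2, 3], [4]]))   # 10.0

# Singleton groups recover the Lasso (l1) penalty: 3 + 4 + 0 + 0 + 5
print(group_lasso_penalty(w, [[i] for i in range(5)])) # 12.0
```

Because entire groups are driven to zero together, this penalty encodes the structured sparsity pattern the bound covers.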

Related research

- The non-overlapping statistical approximation to overlapping group lasso (11/16/2022): Group lasso is a commonly used regularization method in statistical lear...
- On the Closed-form Proximal Mapping and Efficient Algorithms for Exclusive Lasso Models (02/01/2019): The exclusive lasso regularization based on the ℓ_1,2 norm has become po...
- Online Multiple Kernel Learning for Structured Prediction (10/13/2010): Despite the recent progress towards efficient multiple kernel learning (...
- Error Bounds for Generalized Group Sparsity (08/08/2020): In high-dimensional statistical inference, sparsity regularizations have...
- Structured Sparsity via Alternating Direction Methods (05/04/2011): We consider a class of sparse learning problems in high dimensional feat...
- Infinite-Dimensional Sparse Learning in Linear System Identification (03/28/2022): Regularized methods have been widely applied to system identification pr...
- Efficient First Order Methods for Linear Composite Regularizers (04/07/2011): A wide class of regularization problems in machine learning and statisti...
