Support union recovery in high-dimensional multivariate regression

08/05/2008
by Guillaume Obozinski, et al.

In multivariate regression, a K-dimensional response vector is regressed upon a common set of p covariates, with a matrix B^*∈R^{p×K} of regression coefficients. We study the behavior of the multivariate group Lasso, in which block regularization based on the ℓ_1/ℓ_2 norm is used for support union recovery, i.e., recovery of the set of s rows for which B^* is nonzero. Under high-dimensional scaling, we show that the multivariate group Lasso exhibits a threshold for recovery of the exact row pattern with high probability over the random design and noise, specified by the sample complexity parameter θ(n,p,s) := n/[2ψ(B^*) log(p−s)]. Here n is the sample size, and ψ(B^*) is a sparsity-overlap function measuring a combination of the sparsities and overlaps of the K regression coefficient vectors that constitute the model. We prove that the multivariate group Lasso succeeds for problem sequences (n,p,s) such that θ(n,p,s) exceeds a critical level θ_u, and fails for sequences such that θ(n,p,s) lies below a critical level θ_ℓ. For the special case of the standard Gaussian ensemble, we show that θ_ℓ = θ_u, so the characterization is sharp. The sparsity-overlap function ψ(B^*) reveals that, if the design is uncorrelated on the active rows, ℓ_1/ℓ_2 regularization for multivariate regression never harms performance relative to an ordinary Lasso approach and can yield substantial improvements in sample complexity (up to a factor of K) when the coefficient vectors are suitably orthogonal. For more general designs, it is possible for the ordinary Lasso to outperform the multivariate group Lasso. We complement our analysis with simulations that demonstrate the sharpness of our theoretical results, even for relatively small problems.
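As a rough illustration (not the authors' code), the ℓ_1/ℓ_2-regularized estimator described above can be sketched with a plain proximal-gradient (ISTA) loop in NumPy: the proximal step soft-thresholds each row of B by its ℓ_2 norm, which zeroes out inactive rows exactly and so recovers the support union. The problem sizes, noise level, step size, and regularization level below are illustrative choices, not values from the paper.

```python
import numpy as np

def multitask_group_lasso(X, Y, lam, n_iter=500):
    """Minimize (1/2n)||Y - XB||_F^2 + lam * sum_j ||B_j||_2
    by proximal gradient descent (ISTA) with a fixed step 1/L."""
    n, p = X.shape
    K = Y.shape[1]
    B = np.zeros((p, K))
    # Lipschitz constant of the smooth part's gradient: sigma_max(X)^2 / n.
    L = np.linalg.norm(X, 2) ** 2 / n
    for _ in range(n_iter):
        G = X.T @ (X @ B - Y) / n          # gradient of the squared loss
        Z = B - G / L                       # gradient step
        norms = np.linalg.norm(Z, axis=1, keepdims=True)
        # Row-wise soft thresholding: the prox of lam * ||.||_2 per row.
        shrink = np.maximum(1.0 - lam / (L * np.maximum(norms, 1e-12)), 0.0)
        B = shrink * Z
    return B

# Toy instance with s = 5 active rows out of p = 50 (hypothetical sizes).
rng = np.random.default_rng(0)
n, p, s, K = 200, 50, 5, 3
B_star = np.zeros((p, K))
B_star[:s] = rng.normal(size=(s, K))
X = rng.normal(size=(n, p))
Y = X @ B_star + 0.1 * rng.normal(size=(n, K))

B_hat = multitask_group_lasso(X, Y, lam=0.1)
support = np.where(np.linalg.norm(B_hat, axis=1) > 1e-6)[0]
```

Because the row-wise prox produces exact zeros, `support` is read off directly from the nonzero rows of `B_hat`; at this well-conditioned sample size the estimated support typically matches the first s rows.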

