Sharp Threshold for Multivariate Multi-Response Linear Regression via Block Regularized Lasso

07/30/2013
by   Weiguang Wang, et al.

In this paper, we investigate a multivariate multi-response (MVMR) linear regression problem, which consists of multiple linear regression models with differently distributed design matrices and different regression and output vectors. The goal is to recover the support union of all regression vectors using ℓ_1/ℓ_2-regularized Lasso. We characterize necessary and sufficient conditions on the sample complexity as a sharp threshold that determines successful recovery of the support union: if the sample size is above the threshold, then ℓ_1/ℓ_2-regularized Lasso correctly recovers the support union; if the sample size is below the threshold, then ℓ_1/ℓ_2-regularized Lasso fails to recover it. In particular, the threshold precisely captures how the sparsity of the regression vectors and the statistical properties of the design matrices affect the sample complexity. The threshold function therefore also quantifies the advantage of joint support-union recovery via multi-task Lasso over individual support recovery via single-task Lasso.
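As a concrete illustration of the support-union recovery task described in the abstract, the sketch below simulates K regression tasks sharing a sparse support and fits scikit-learn's MultiTaskLasso, which applies the ℓ_1/ℓ_2 block penalty across tasks. Note one simplification: this uses a single shared Gaussian design matrix for all tasks, whereas the paper allows differently distributed design matrices per task. All dimensions and parameter values here are illustrative, not taken from the paper.

```python
import numpy as np
from sklearn.linear_model import MultiTaskLasso

rng = np.random.default_rng(0)
n, p, K, s = 200, 50, 3, 5  # samples, features, tasks, sparsity

# Shared support {0, ..., s-1}; per-task coefficients differ on that support.
support = np.arange(s)
B = np.zeros((p, K))
B[support, :] = rng.normal(1.0, 0.2, size=(s, K))

X = rng.normal(size=(n, p))              # shared Gaussian design (simplification)
Y = X @ B + 0.1 * rng.normal(size=(n, K))

# MultiTaskLasso minimizes (1/2n)||Y - XW||_F^2 + alpha * sum_j ||W_j||_2,
# i.e. an l_1 norm over the per-feature l_2 norms of the coefficient rows.
model = MultiTaskLasso(alpha=0.1).fit(X, Y)

# Estimated support union: features whose coefficient block is nonzero.
est_support = np.where(np.abs(model.coef_).sum(axis=0) > 1e-6)[0]
print(sorted(est_support))
```

Because the block penalty zeroes out entire feature rows at once, a single fit yields one estimated support union for all tasks, which is the joint-recovery advantage the threshold analysis quantifies.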


