Insights and algorithms for the multivariate square-root lasso

09/11/2019
by Aaron J. Molstad, et al.

We study the multivariate square-root lasso, a method for fitting the multivariate response linear regression model with dependent errors. This estimator minimizes the nuclear norm of the residual matrix plus a convex penalty. Unlike some existing methods for multivariate response linear regression, which require explicit estimates of the error covariance matrix or its inverse, the multivariate square-root lasso criterion implicitly adapts to dependent errors and is convex. To justify the use of this estimator, we establish an error bound which illustrates that, like the univariate square-root lasso (Belloni et al., 2011), the multivariate square-root lasso is pivotal with respect to the unknown error covariance matrix. Based on our theory, we propose a simple tuning approach which requires fitting the model for only a single value of the tuning parameter, i.e., it does not require cross-validation. We propose two algorithms to compute the estimator: a prox-linear alternating direction method of multipliers algorithm, and a fast first-order algorithm which can be applied in special cases. In both simulation studies and a real data application, we show that the multivariate square-root lasso can outperform more computationally intensive methods which estimate both the regression coefficient matrix and the error precision matrix.
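To make the criterion concrete, the following is a minimal NumPy sketch of evaluating an objective of the form described in the abstract: the nuclear norm of the residual matrix Y - XB plus a convex penalty on B. The function name `msqrt_lasso_objective` and the choice of an elementwise L1 penalty are illustrative assumptions, not the paper's exact specification (the paper says only "a convex penalty").

```python
import numpy as np

def msqrt_lasso_objective(Y, X, B, lam):
    """Evaluate a multivariate square-root-lasso-style objective (illustrative):
    nuclear norm of the residual matrix plus lam times an elementwise
    L1 penalty on the coefficient matrix B (the L1 choice is an assumption).
    """
    residual = Y - X @ B
    nuclear = np.linalg.norm(residual, ord="nuc")  # sum of singular values
    penalty = lam * np.abs(B).sum()                # convex penalty (example: L1)
    return nuclear + penalty

# Toy usage on simulated data
rng = np.random.default_rng(0)
X = rng.standard_normal((50, 10))
B_true = np.zeros((10, 3))
B_true[:2, :] = 1.0                                # sparse coefficient matrix
Y = X @ B_true + 0.1 * rng.standard_normal((50, 3))
print(msqrt_lasso_objective(Y, X, B_true, lam=0.5))
```

Using the nuclear norm of the residual matrix, rather than a squared-error loss weighted by an estimated precision matrix, is what lets the criterion adapt to dependent errors without ever estimating the error covariance explicitly.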


