Sparse Matrix Inversion with Scaled Lasso

02/13/2012
by Tingni Sun, et al.

We propose a new method of learning a sparse nonnegative-definite target matrix. Our primary example of the target matrix is the inverse of a population covariance or correlation matrix. The algorithm first estimates each column of the target matrix by the scaled Lasso and then adjusts the matrix estimator to be symmetric. The penalty level of the scaled Lasso for each column is determined entirely by the data via convex minimization, without cross-validation. We prove that this scaled Lasso method guarantees the fastest proven rate of convergence in the spectrum norm under weaker conditions than those in existing analyses of other ℓ_1-regularized algorithms, and achieves a faster guaranteed rate of convergence when the ratio of the ℓ_1 and spectrum norms of the target inverse matrix diverges to infinity. A simulation study demonstrates the computational feasibility and superb performance of the proposed method. Our analysis also provides new performance bounds for the Lasso and scaled Lasso that guarantee higher concentration of the error at a smaller threshold level than previous analyses, and that allow the use of the union bound in column-by-column applications of the scaled Lasso without adjusting the penalty level. In addition, least squares estimation after scaled Lasso selection is considered and proven to guarantee performance bounds similar to those of the scaled Lasso.
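The column-by-column procedure described in the abstract can be sketched in code. The sketch below is illustrative only and is not the authors' implementation: it alternates a Lasso fit with a residual-based noise-level update (a common way to realize the scaled Lasso), uses the universal penalty level `sqrt(2 log p / n)` as an assumed default, substitutes `sklearn.linear_model.Lasso` for the paper's convex-minimization solver, and replaces the paper's symmetrization adjustment with simple averaging. The function names `scaled_lasso` and `precision_estimate` are hypothetical.

```python
import numpy as np
from sklearn.linear_model import Lasso

def scaled_lasso(X, y, lam0, max_iter=20, tol=1e-4):
    """Sketch of the scaled Lasso: alternate between a Lasso fit for the
    coefficients and a residual-based estimate of the noise level sigma,
    so that the effective penalty lam0 * sigma is set by the data."""
    n = X.shape[0]
    sigma = max(np.std(y), 1e-12)  # initial noise-level guess
    coef = np.zeros(X.shape[1])
    for _ in range(max_iter):
        model = Lasso(alpha=lam0 * sigma, fit_intercept=False)
        model.fit(X, y)
        coef = model.coef_
        resid = y - X @ coef
        sigma_new = max(np.linalg.norm(resid) / np.sqrt(n), 1e-12)
        if abs(sigma_new - sigma) < tol * sigma:
            sigma = sigma_new
            break
        sigma = sigma_new
    return coef, sigma

def precision_estimate(X, lam0=None):
    """Estimate a sparse precision matrix column by column via the scaled
    Lasso, then symmetrize (here by averaging, a simplification)."""
    n, p = X.shape
    if lam0 is None:
        lam0 = np.sqrt(2 * np.log(p) / n)  # assumed universal penalty level
    Theta = np.zeros((p, p))
    for j in range(p):
        idx = [k for k in range(p) if k != j]
        # Regress column j on the remaining columns.
        beta, sigma = scaled_lasso(X[:, idx], X[:, j], lam0)
        Theta[j, j] = 1.0 / sigma**2
        Theta[idx, j] = -beta / sigma**2
    return (Theta + Theta.T) / 2.0
```

Each regression yields one column of the precision matrix: the diagonal entry is the inverse residual variance, and the off-diagonal entries are the (negated, rescaled) regression coefficients, after which the estimator is made symmetric.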

