Scaled Sparse Linear Regression

04/24/2011
by Tingni Sun, et al.

Scaled sparse linear regression jointly estimates the regression coefficients and noise level in a linear model. It chooses an equilibrium with a sparse regression method by iteratively estimating the noise level via the mean residual square and scaling the penalty in proportion to the estimated noise level. The iterative algorithm costs little beyond the computation of a path or grid of the sparse regression estimator for penalty levels above a proper threshold. For the scaled lasso, the algorithm is a gradient descent in a convex minimization of a penalized joint loss function for the regression coefficients and noise level. Under mild regularity conditions, we prove that the scaled lasso simultaneously yields an estimator for the noise level and an estimated coefficient vector satisfying certain oracle inequalities for prediction, the estimation of the noise level and the regression coefficients. These inequalities provide sufficient conditions for the consistency and asymptotic normality of the noise level estimator, including certain cases where the number of variables is of greater order than the sample size. Parallel results are provided for the least squares estimation after model selection by the scaled lasso. Numerical results demonstrate the superior performance of the proposed methods over an earlier proposal of joint convex minimization.
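
The alternating procedure described above (update the noise level from the mean residual square, then refit with a penalty scaled in proportion to that estimate) can be sketched in a few lines. The sketch below assumes scikit-learn's Lasso as the sparse-regression subroutine and a universal base penalty level lambda0 = sqrt(2 log(p)/n); the function name, initialization, and stopping rule are illustrative choices, not taken from the paper.

```python
# Minimal sketch of a scaled-lasso-style iteration, under the assumptions
# stated above. A joint objective commonly associated with the scaled lasso
# is ||y - X b||^2 / (2 n sigma) + sigma / 2 + lambda0 * ||b||_1; alternating
# minimization over sigma and b gives the two updates below.
import numpy as np
from sklearn.linear_model import Lasso

def scaled_lasso(X, y, lambda0=None, n_iter=50, tol=1e-6):
    n, p = X.shape
    if lambda0 is None:
        lambda0 = np.sqrt(2.0 * np.log(p) / n)   # common universal penalty level
    sigma_hat = np.std(y)                        # crude initial noise estimate
    beta_hat = np.zeros(p)
    for _ in range(n_iter):
        # Scale the lasso penalty in proportion to the current noise estimate;
        # sklearn's Lasso minimizes ||y - X b||^2 / (2 n) + alpha * ||b||_1.
        fit = Lasso(alpha=sigma_hat * lambda0, fit_intercept=False,
                    max_iter=10000).fit(X, y)
        beta_hat = fit.coef_
        # Update the noise level via the root of the mean residual square.
        sigma_new = np.sqrt(np.mean((y - X @ beta_hat) ** 2))
        if abs(sigma_new - sigma_hat) <= tol * max(sigma_hat, 1e-12):
            sigma_hat = sigma_new
            break
        sigma_hat = sigma_new
    return beta_hat, sigma_hat
```

In practice the updates can also be read off a precomputed lasso path, which is why, as the abstract notes, the iteration costs little beyond computing the path or grid of the sparse regression estimator for penalty levels above a proper threshold.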

Related research

05/27/2017  Generalized Concomitant Multi-Task Lasso for sparse multimodal regression
02/13/2012  Sparse Matrix Inversion with Scaled Lasso
10/16/2010  Exact block-wise optimization in group lasso and sparse group lasso for linear regression
03/18/2021  Robust-to-outliers square-root LASSO, simultaneous inference with a MOM approach
04/09/2020  Sparse recovery of noisy data using the Lasso method
11/12/2021  Distributed Sparse Regression via Penalization
11/03/2010  The Lasso under Heteroscedasticity
