Sparse recovery with unknown variance: a LASSO-type approach
We address the problem of estimating the regression vector β in the generic s-sparse linear model y = Xβ + z, with β ∈ ℝ^p, y ∈ ℝ^n, z ∼ N(0, σ^2 I) and p > n, when the variance σ^2 is unknown. We study two LASSO-type methods that jointly estimate β and the variance. These estimators are minimizers of the ℓ_1-penalized least-squares functional, where the relaxation parameter is tuned according to two different strategies. In the first strategy, the relaxation parameter is of the order σ̂√(log p), where σ̂^2 is the empirical variance, which can be computed by running only a few successive LASSO instances. In the second strategy, the relaxation parameter is chosen so as to enforce a trade-off between the fidelity and the penalty terms at optimality. For both estimators, our assumptions are similar to those proposed by Candès and Plan in Ann. Stat. (2009) for the case where σ^2 is known. We prove that our estimators ensure exact recovery of the support and sign pattern of β with high probability. We present simulation results showing that the first estimator enjoys nearly the same performance in practice as the standard LASSO (known-variance case) for a wide range of signal-to-noise ratios. Our second estimator is shown to outperform both in terms of false detection when the signal-to-noise ratio is low.