Dictionary LASSO: Guaranteed Sparse Recovery under Linear Transformation
We consider the following signal recovery problem: given a measurement matrix Φ ∈ R^{n×p} and a noisy observation vector c ∈ R^n generated as c = Φθ* + ε, where ε ∈ R^n is a noise vector whose entries are i.i.d. centered sub-Gaussian, how can we recover the signal θ* when Dθ* is sparse under a linear transformation D ∈ R^{m×p}? A natural convex-optimization approach is to solve

min_θ (1/2)‖Φθ − c‖² + λ‖Dθ‖₁.

This paper provides an upper bound on the estimation error and establishes the consistency of this method under the assumption that the design matrix Φ is a Gaussian random matrix. Specifically, we show that 1) in the noiseless case, if the condition number of D is bounded and the number of measurements satisfies n ≥ Ω(s log p), where s is the sparsity level, then the true solution is recovered with high probability; and 2) in the noisy case, if the condition number of D is bounded and the number of measurements grows faster than s log p, that is, s log p = o(n), then the estimation error converges to zero with probability 1 as p and s go to infinity. Our results are consistent with those for the special case D = I_{p×p} (equivalently, standard LASSO) and improve on the existing analysis. The condition number of D plays a critical role in our analysis. We consider the condition number in two cases, the fused LASSO and the random graph: in the fused LASSO case the condition number is bounded by a constant, while in the random graph case it is bounded with high probability if m/p (i.e., the ratio #edge/#vertex) is larger than a certain constant. Numerical simulations are consistent with our theoretical results.
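The objective min_θ (1/2)‖Φθ − c‖² + λ‖Dθ‖₁ can be solved with standard convex-optimization machinery. Below is a minimal sketch using ADMM (splitting z = Dθ), not the authors' own algorithm; the fused-LASSO instance at the end, where D is the first-order difference operator, and all variable names are illustrative assumptions.

```python
import numpy as np

def soft_threshold(x, t):
    # elementwise soft-thresholding: the proximal operator of t * ||.||_1
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def dictionary_lasso(Phi, c, D, lam, rho=1.0, n_iter=500):
    """ADMM sketch for  min_theta 0.5*||Phi@theta - c||^2 + lam*||D@theta||_1,
    introducing z = D@theta with scaled dual variable u."""
    p, m = Phi.shape[1], D.shape[0]
    A = Phi.T @ Phi + rho * D.T @ D   # theta-update normal-equation matrix
    b0 = Phi.T @ c
    z = np.zeros(m)
    u = np.zeros(m)
    for _ in range(n_iter):
        theta = np.linalg.solve(A, b0 + rho * D.T @ (z - u))
        z = soft_threshold(D @ theta + u, lam / rho)
        u = u + D @ theta - z
    return theta

# Fused-LASSO example: theta* is piecewise constant, so D@theta* is sparse.
rng = np.random.default_rng(0)
p, n = 20, 40
theta_star = np.zeros(p)
theta_star[5:12] = 3.0
D = np.eye(p, k=1)[:-1] - np.eye(p)[:-1]   # (p-1) x p difference operator
Phi = rng.standard_normal((n, p))          # Gaussian design, as in the analysis
c = Phi @ theta_star + 0.01 * rng.standard_normal(n)
theta_hat = dictionary_lasso(Phi, c, D, lam=0.5)
print(np.round(theta_hat, 1))
```

With a Gaussian design and mild noise, the recovered theta_hat is close to the piecewise-constant theta_star; shrinking lam toward zero reduces the bias at the jump locations.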