Guarantees of Total Variation Minimization for Signal Recovery

01/28/2013
by Jian-Feng Cai, et al.

In this paper, we consider using total variation minimization to recover signals whose gradients have sparse support from a small number of measurements. We establish a performance guarantee for total variation (TV) minimization in recovering one-dimensional signals with sparse gradient support. This partially answers the open problem of proving the fidelity of total variation minimization in such a setting [TVMulti]. In particular, we show that the recoverable gradient sparsity can grow linearly with the signal dimension when TV minimization is used. Recoverable sparsity thresholds of TV minimization are explicitly computed for 1-dimensional signals using the Grassmann angle framework. We also extend our results to TV minimization for multidimensional signals. Stability of recovering the signal itself using 1-D TV minimization is also established through a property we call the "almost Euclidean property" of the 1-dimensional TV norm. We further give a lower bound on the number of random Gaussian measurements needed to recover 1-dimensional signal vectors with N elements and K-sparse gradients. Interestingly, the number of needed measurements is lower bounded by Ω(√(NK)), rather than the O(K log(N/K)) bound that frequently appears in recovering K-sparse signal vectors.
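
To illustrate the setup described above, the following is a minimal sketch (not from the paper) of recovering a piecewise-constant 1-D signal from random Gaussian measurements by TV minimization. It assumes the numpy and cvxpy packages; the dimensions N (signal length), K (gradient sparsity), and M (number of measurements) are illustrative choices only.

    # Minimal sketch: 1-D TV minimization recovery from Gaussian measurements.
    # Assumes numpy and cvxpy are installed; N, K, M are illustrative values.
    import numpy as np
    import cvxpy as cp

    rng = np.random.default_rng(0)
    N, K, M = 200, 5, 80  # signal length, gradient sparsity, number of measurements

    # Build a piecewise-constant signal: its finite-difference gradient has K nonzeros.
    x_true = np.zeros(N)
    jumps = rng.choice(np.arange(1, N), size=K, replace=False)
    for j in jumps:
        x_true[j:] += rng.standard_normal()

    A = rng.standard_normal((M, N)) / np.sqrt(M)  # random Gaussian measurement matrix
    y = A @ x_true                                # noiseless measurements

    # TV minimization: minimize ||Dx||_1 subject to Ax = y,
    # where D is the 1-D finite-difference operator.
    x = cp.Variable(N)
    problem = cp.Problem(cp.Minimize(cp.sum(cp.abs(cp.diff(x)))), [A @ x == y])
    problem.solve()

    print("relative recovery error:",
          np.linalg.norm(x.value - x_true) / np.linalg.norm(x_true))

With M large enough relative to K (the paper's guarantees quantify this), the relative error in this kind of experiment is typically near zero.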
