Theoretical links between universal and Bayesian compressed sensing algorithms
Quantized maximum a posteriori (Q-MAP) is a recently proposed Bayesian compressed sensing algorithm that, given the source distribution, recovers X^n from its linear measurements Y^m = AX^n, where A ∈ R^{m×n} denotes the known measurement matrix. Lagrangian minimum entropy pursuit (L-MEP), on the other hand, is a universal compressed sensing algorithm that aims to recover X^n from its linear measurements Y^m = AX^n without access to the source distribution. Both Q-MAP and L-MEP provably achieve the minimum required sampling rate in noiseless settings where such fundamental limits are known. L-MEP minimizes a cost function consisting of a linear combination of the conditional empirical entropy of a potential reconstruction vector and its corresponding measurement error. In this paper, using a first-order linear approximation of the conditional empirical entropy function, L-MEP is connected with Q-MAP. The established connection between L-MEP and Q-MAP leads to variants of Q-MAP that achieve the same asymptotic performance as Q-MAP in terms of their required sampling rates. Moreover, these variants suggest that Q-MAP is robust to small errors in estimating the source distribution. This robustness is proven theoretically, and the effect of a non-vanishing estimation error on the required sampling rate is characterized.
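The L-MEP objective described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it estimates the conditional empirical entropy of a quantized sequence from order-k context counts and combines it with the squared measurement error via a Lagrangian weight `lam` (the symbol names `k` and `lam` are illustrative assumptions).

```python
import numpy as np
from collections import Counter

def conditional_empirical_entropy(x, k=1):
    # H_k(x): empirical entropy (bits per symbol) of x[i] given the
    # preceding k symbols, estimated from counts along the sequence.
    x = list(x)
    n = len(x)
    ctx_counts = Counter()
    pair_counts = Counter()
    for i in range(k, n):
        ctx = tuple(x[i - k:i])
        ctx_counts[ctx] += 1
        pair_counts[(ctx, x[i])] += 1
    h = 0.0
    for (ctx, _), c in pair_counts.items():
        p = c / ctx_counts[ctx]          # empirical conditional probability
        h -= (c / (n - k)) * np.log2(p)  # average code length contribution
    return h

def lmep_cost(x_quantized, A, y, lam, k=1):
    # L-MEP cost: conditional empirical entropy of the candidate
    # reconstruction plus a weighted measurement-error term.
    residual = A @ np.asarray(x_quantized, dtype=float) - y
    return conditional_empirical_entropy(x_quantized, k) + lam * residual @ residual
```

For instance, a perfectly periodic candidate has zero conditional empirical entropy for k ≥ 1, so its cost reduces to the weighted measurement error alone; less structured candidates pay an entropy penalty, which is what drives L-MEP toward "simple" reconstructions consistent with the measurements.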