Extreme Compressive Sampling for Covariance Estimation
This paper studies the problem of estimating the covariance of a collection of vectors from extremely compressed measurements of each vector. An estimator based on back-projections of these compressive samples is proposed and analyzed. A distribution-free analysis shows that, by observing just a single compressive measurement of each vector, one can consistently estimate the covariance matrix in both infinity and spectral norm, and the same analysis yields precise rates of convergence in both norms. Via information-theoretic techniques, lower bounds are established showing that this estimator is minimax-optimal for both the infinity- and spectral-norm estimation problems. These results are also specialized to give matching upper and lower bounds for estimating the population covariance of a collection of Gaussian vectors, again in the compressive measurement model. The analysis shows that the effective sample complexity for this problem is scaled by a factor of m^2/d^2, where m is the compression dimension and d is the ambient dimension. Applications to subspace learning (Principal Component Analysis) and learning over distributed sensor networks are also discussed.
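The following is a minimal numerical sketch of one back-projection construction of the kind described in the abstract, not necessarily the paper's exact estimator: each vector is observed through a single compressive measurement y_i = Phi_i x_i with an i.i.d. Gaussian sensing matrix, back-projected to the ambient dimension, and debiased so that the averaged rank-one terms estimate the (uncentered) covariance. The function name, the specific rescaling constants, and the Gaussian choice of sensing matrix are assumptions made for illustration.

```python
import numpy as np

def compressive_covariance(X, m, rng=None):
    """Estimate the (uncentered) covariance of the rows of X from a single
    m-dimensional compressive measurement of each row.

    X : (n, d) array of data vectors (used here only to simulate the
        measurements y_i = Phi_i x_i; the x_i are never observed directly).
    m : compression dimension, m << d.
    """
    rng = np.random.default_rng(rng)
    n, d = X.shape
    S = np.zeros((d, d))
    for x in X:
        Phi = rng.standard_normal((m, d))   # sensing matrix for this vector
        y = Phi @ x                         # compressive measurement (all we keep)
        x_bp = Phi.T @ y / m                # back-projection to R^d
        # Debias: for i.i.d. N(0,1) entries of Phi,
        #   E[x_bp x_bp^T] = ((m+1)/m) x x^T + (||x||^2 / m) I  and
        #   E[||y||^2]     = m ||x||^2,
        # so the correction below is unbiased for x x^T.
        S += (m / (m + 1.0)) * (np.outer(x_bp, x_bp) - (y @ y) / m**2 * np.eye(d))
    return S / n


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    d, n, m = 50, 20000, 5                      # extreme compression: m << d
    Sigma = np.diag(np.linspace(1.0, 3.0, d))   # ground-truth covariance
    X = rng.multivariate_normal(np.zeros(d), Sigma, size=n)
    Sigma_hat = compressive_covariance(X, m, rng=rng)
    err = np.linalg.norm(Sigma_hat - Sigma, 2) / np.linalg.norm(Sigma, 2)
    print(f"relative spectral-norm error: {err:.3f}")
```

Consistent with the m^2/d^2 effective sample complexity mentioned above, shrinking m or growing d in this sketch requires correspondingly more vectors n to reach the same error.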