Bayesian inference for spectral projectors of covariance matrix

11/30/2017
by Igor Silin, et al.

Let X_1, ..., X_n be an i.i.d. sample in R^p with zero mean and covariance matrix Σ^*. Classical principal component analysis estimates the projector P^*_J onto the direct sum of some eigenspaces of Σ^* by its empirical counterpart P_J. Recent papers [Koltchinskii, Lounici (2017)] and [Naumov et al. (2017)] investigate the asymptotic distribution of the Frobenius distance between the projectors, ‖P_J - P^*_J‖_2. The problem arises when one tries to build an effective confidence set for the true projector. We consider the problem from a Bayesian perspective and derive an approximation for the posterior distribution of the Frobenius distance between the projectors. The derived theorems hold for non-Gaussian data: the only assumption we impose is concentration of the sample covariance Σ̂ in a vicinity of Σ^*. The obtained results are applied to the construction of sharp confidence sets for the true projector. Numerical simulations illustrate the good performance of the proposed procedure even on non-Gaussian data in a quite challenging regime.
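The empirical projector P_J described above can be sketched in a few lines of numpy: compute the sample covariance, take its eigendecomposition, and project onto the span of the eigenvectors indexed by J. The helper name `empirical_projector` and the synthetic diagonal example are illustrative assumptions, not code from the paper.

```python
import numpy as np

def empirical_projector(X, J):
    """Empirical spectral projector P_J: projector onto the span of the
    eigenvectors of the sample covariance with indices in J
    (indices refer to eigenvalues sorted in descending order)."""
    n, p = X.shape
    Sigma_hat = X.T @ X / n                      # sample covariance (zero-mean data)
    eigvals, eigvecs = np.linalg.eigh(Sigma_hat) # eigh: ascending eigenvalues
    order = np.argsort(eigvals)[::-1]            # reorder to descending
    U = eigvecs[:, order][:, list(J)]
    return U @ U.T

# Illustrative synthetic data: true covariance diag(9, 4, 1, 0.25, 0.0625),
# so the top-2 eigenspace of Σ^* is spanned by the first two coordinates.
rng = np.random.default_rng(0)
p, n = 5, 2000
X = rng.standard_normal((n, p)) @ np.diag([3.0, 2.0, 1.0, 0.5, 0.25])

P = empirical_projector(X, J=[0, 1])             # P_J for the top-2 eigenspace
P_true = np.diag([1.0, 1.0, 0.0, 0.0, 0.0])      # P^*_J in this example
dist = np.linalg.norm(P - P_true)                # Frobenius distance ‖P_J - P^*_J‖_2
```

With a large eigengap between the second and third eigenvalues, the empirical projector concentrates around the true one, so `dist` is small at this sample size.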

