
A nonparametric empirical Bayes approach to covariance matrix estimation

by Huiqin Xin et al.

We propose an empirical Bayes method to estimate high-dimensional covariance matrices. Our procedure centers on vectorizing the covariance matrix and treating matrix estimation as a vector estimation problem. Drawing from the compound decision theory literature, we introduce a new class of decision rules that generalizes several existing procedures. We then use a nonparametric empirical Bayes g-modeling approach to estimate the oracle optimal rule in that class. This lets the data itself determine how best to shrink the estimator, rather than shrinking in a pre-determined direction such as toward a diagonal matrix. Simulation results and a gene expression network analysis show that our approach can outperform a number of state-of-the-art proposals in a wide range of settings, sometimes substantially.
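The abstract's pipeline (vectorize the sample covariance, then shrink the entries nonparametrically) can be illustrated with a minimal sketch. This is not the authors' decision-rule class: it substitutes a simple Kiefer–Wolfowitz-style NPMLE, fit by EM over a discrete grid, applied to the vectorized entries with plug-in sampling variances. The function name `npmle_shrink` and the variance approximation are illustrative assumptions.

```python
import numpy as np

def npmle_shrink(z, s2, grid_size=50, n_iter=200):
    """Illustrative g-modeling: fit a discrete prior g on a grid by EM
    for z_i ~ N(theta_i, s2_i), then return posterior means of theta_i."""
    z, s2 = np.asarray(z, float), np.asarray(s2, float)
    grid = np.linspace(z.min(), z.max(), grid_size)   # support points of g
    w = np.full(grid_size, 1.0 / grid_size)           # prior weights on grid
    # likelihood matrix L[i, k] = normal density of z_i at mean grid[k]
    L = np.exp(-(z[:, None] - grid[None, :]) ** 2 / (2 * s2[:, None]))
    L /= np.sqrt(2 * np.pi * s2[:, None])
    for _ in range(n_iter):                           # EM updates for w
        P = L * w
        P /= P.sum(axis=1, keepdims=True)             # posterior over grid
        w = P.mean(axis=0)
    P = L * w
    P /= P.sum(axis=1, keepdims=True)
    return P @ grid                                   # posterior mean estimates

rng = np.random.default_rng(0)
p, n = 10, 40
X = rng.standard_normal((n, p))              # data with identity covariance
S = np.cov(X, rowvar=False)                  # sample covariance
iu = np.triu_indices(p)                      # vectorize the upper triangle
z = S[iu]
# plug-in variance of each entry under normality: (S_ii S_jj + S_ij^2) / n
s2 = (S[iu[0], iu[0]] * S[iu[1], iu[1]] + S[iu] ** 2) / n
theta = npmle_shrink(z, s2)
Sigma_hat = np.zeros((p, p))                 # fold the vector back to a matrix
Sigma_hat[iu] = theta
Sigma_hat = Sigma_hat + Sigma_hat.T - np.diag(np.diag(Sigma_hat))
err_raw = np.linalg.norm(S - np.eye(p))      # Frobenius error, sample cov
err_eb = np.linalg.norm(Sigma_hat - np.eye(p))  # Frobenius error, shrunk
```

Because the prior is estimated from the entries themselves, the shrinkage direction is data-driven: here the fitted prior concentrates near 0 (off-diagonals) and near 1 (diagonals) rather than being fixed in advance.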



