Covariance Estimation in High Dimensions via Kronecker Product Expansions

02/12/2013
by Theodoros Tsiligkaridis, et al.

This paper presents a new method for estimating high-dimensional covariance matrices. The method, permuted rank-penalized least-squares (PRLS), is based on a Kronecker product series expansion of the true covariance matrix. Assuming an i.i.d. Gaussian random sample, we establish high-dimensional rates of convergence to the true covariance as both the number of samples and the number of variables go to infinity. For covariance matrices of low separation rank, our results establish that PRLS converges significantly faster than the standard sample covariance matrix (SCM) estimator. The convergence rate captures a fundamental tradeoff between estimation error and approximation error, thus providing a covariance estimation framework that scales with the separation rank, analogous to low-rank approximation of covariance matrices. The MSE convergence rates generalize the high-dimensional rates recently obtained for the ML flip-flop algorithm for Kronecker product covariance estimation. We show that a class of block Toeplitz covariance matrices can be approximated with low separation rank, and we give bounds on the minimal separation rank r that ensures a given level of bias. Simulations validate the theoretical bounds. As a real-world application, we illustrate the utility of the proposed Kronecker covariance estimator for spatio-temporal linear least-squares prediction of multivariate wind speed measurements.
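The core construction behind a Kronecker product series estimator can be illustrated with a short sketch: rearranging a (pq) x (pq) covariance matrix in the Van Loan-Pitsianis sense turns Kronecker structure into low matrix rank, so a soft-thresholded SVD of the rearranged sample covariance yields an estimate of low separation rank. The Python sketch below is a minimal illustration of that idea under stated assumptions, not the authors' PRLS implementation; the function names, the simulated example, and the threshold value `thresh` are placeholders chosen for illustration.

```python
# Illustrative sketch (not the paper's PRLS code): approximate a (p*q) x (p*q)
# covariance matrix by a Kronecker product series of low separation rank via
# the Van Loan-Pitsianis rearrangement and singular value soft-thresholding.
import numpy as np

def rearrange(S, p, q):
    """Map the (p*q) x (p*q) matrix S to its p^2 x q^2 rearrangement R(S);
    the rank of R(S) equals the separation rank of S."""
    R = np.empty((p * p, q * q))
    for i in range(p):
        for j in range(p):
            block = S[i * q:(i + 1) * q, j * q:(j + 1) * q]
            R[i * p + j, :] = block.reshape(-1)   # vectorized q x q block
    return R

def inverse_rearrange(R, p, q):
    """Inverse of `rearrange`: rebuild the (p*q) x (p*q) matrix from R."""
    S = np.empty((p * q, p * q))
    for i in range(p):
        for j in range(p):
            S[i * q:(i + 1) * q, j * q:(j + 1) * q] = R[i * p + j].reshape(q, q)
    return S

def kronecker_series_estimate(S, p, q, thresh):
    """Soft-threshold the singular values of R(S) and map back, giving an
    estimate of the form sum_k A_k kron B_k (separation rank = number of
    singular values surviving the threshold)."""
    U, s, Vt = np.linalg.svd(rearrange(S, p, q), full_matrices=False)
    s_shrunk = np.maximum(s - thresh, 0.0)        # rank penalty on R(S)
    return inverse_rearrange((U * s_shrunk) @ Vt, p, q)

# Toy usage: p temporal and q spatial variables, separation-rank-1 truth.
rng = np.random.default_rng(0)
p, q, n = 5, 4, 200
A = np.cov(rng.standard_normal((p, 3 * p)))       # temporal factor (illustrative)
B = np.cov(rng.standard_normal((q, 3 * q)))       # spatial factor (illustrative)
Sigma = np.kron(A, B)
X = rng.multivariate_normal(np.zeros(p * q), Sigma, size=n)
SCM = np.cov(X, rowvar=False)
Sigma_hat = kronecker_series_estimate(SCM, p, q, thresh=0.5)  # placeholder threshold
print("SCM error:          ", np.linalg.norm(SCM - Sigma))
print("Kronecker-series err:", np.linalg.norm(Sigma_hat - Sigma))
```

The sketch only conveys the rearrangement-plus-thresholding structure; the choice of regularization level and the resulting MSE convergence rates are what the paper's high-dimensional analysis supplies.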


