Improving the condition number of estimated covariance matrices

10/25/2018
by Jemima M. Tabeart, et al.

High dimensional error covariance matrices are used to weight the contribution of observation and background terms in data assimilation procedures. Because error covariance matrices are often obtained by sampling methods, the resulting matrices are frequently degenerate or ill-conditioned, making them too expensive to use in practice. Reconditioning methods are used to combat these problems. In this paper we present new theory for two existing methods that can be used to reduce the condition number of (or 'recondition') any covariance matrix: ridge regression, and the minimum eigenvalue method. These methods are used in practice at numerical weather prediction centres, but their theoretical impact on the covariance matrix itself is not well understood. Here we address this by investigating the impact of reconditioning on the variances and covariances of a general covariance matrix in both theoretical and practical settings. The improved theoretical understanding provides guidance to users on both method selection and the choice of target condition number. The new theory shows that, for the same target condition number, both methods increase the variances relative to the original matrix, and that the ridge regression method increases them more than the minimum eigenvalue method for any covariance matrix. We also prove that the ridge regression method strictly decreases the absolute value of off-diagonal correlations. We apply the reconditioning methods to two examples: a simple general correlation function, and an error covariance matrix arising from interchannel correlations. The minimum eigenvalue method results in smaller overall changes to the correlation matrix than the ridge regression method, but, in contrast, can increase off-diagonal correlations.
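The abstract names the two reconditioning methods but does not spell out their formulas. The minimal Python/NumPy sketch below assumes the standard formulations usually associated with these names: the ridge regression method adds a scalar delta = (lambda_max - kappa*lambda_min) / (kappa - 1) to every diagonal element so that the result has condition number kappa, while the minimum eigenvalue method raises all eigenvalues below lambda_max / kappa to that threshold before reconstructing the matrix. The function names and the example matrix are illustrative only and are not taken from the paper.

    import numpy as np

    def recondition_ridge(S, kappa_req):
        # Ridge regression (RR) method: add a constant to every diagonal
        # element so the condition number of the result equals kappa_req.
        lam = np.linalg.eigvalsh(S)  # eigenvalues in ascending order
        delta = (lam[-1] - kappa_req * lam[0]) / (kappa_req - 1.0)
        return S + delta * np.eye(S.shape[0])

    def recondition_min_eig(S, kappa_req):
        # Minimum eigenvalue (ME) method: raise all eigenvalues below
        # lambda_max / kappa_req to that threshold, then reconstruct.
        lam, V = np.linalg.eigh(S)
        lam_new = np.maximum(lam, lam[-1] / kappa_req)
        return (V * lam_new) @ V.T

    # Illustrative ill-conditioned 3x3 covariance matrix (not from the paper).
    S = np.array([[1.00, 0.99, 0.98],
                  [0.99, 1.00, 0.99],
                  [0.98, 0.99, 1.00]])
    for method in (recondition_ridge, recondition_min_eig):
        S_new = method(S, kappa_req=100.0)
        print(method.__name__, np.linalg.cond(S_new))

Both calls return a matrix whose condition number equals the requested value of 100; note that the RR variant changes every diagonal entry by the same amount, whereas the ME variant only modifies the eigenvalues below the threshold, which is consistent with the comparison of the two methods made in the abstract.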

