Monotonicity of the Trace-Inverse of Covariance Submatrices and Two-Sided Prediction

by   Anatoly Khina, et al.

It is common to assess the "memory strength" of a stationary process by looking at how fast the normalized log-determinant of its covariance submatrices (i.e., the entropy rate) decreases. In this work, we propose an alternative characterization in terms of the normalized trace-inverse of the covariance submatrices. We show that this sequence is monotonically non-decreasing, and that it is constant if and only if the process is white. Furthermore, while the entropy rate is associated with one-sided prediction errors (present from past), the new measure is associated with two-sided prediction errors (present from past and future). This measure can serve as an alternative to Burg's maximum-entropy principle for spectral estimation. We also propose a counterpart for non-stationary processes by looking at the average trace-inverse over subsets.
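The monotonicity claim is easy to probe numerically. The sketch below, a minimal illustration and not the paper's derivation, computes the normalized trace-inverse tr(Σ_n⁻¹)/n of the n×n covariance submatrices for an AR(1)-style process with autocovariance ρ^|i−j| (the function names and the choice ρ = 0.8 are illustrative assumptions):

```python
import numpy as np

def normalized_trace_inverse(acf, n_max):
    """Return [tr(Sigma_n^{-1}) / n for n = 1..n_max], where Sigma_n is the
    n x n Toeplitz covariance matrix built from the autocovariance acf(k)."""
    vals = []
    for n in range(1, n_max + 1):
        idx = np.arange(n)
        # Covariance submatrix: entry (i, j) is acf(|i - j|).
        Sigma = acf(np.abs(idx[:, None] - idx[None, :]))
        vals.append(np.trace(np.linalg.inv(Sigma)) / n)
    return vals

# AR(1)-style autocovariance with correlation coefficient rho (illustrative).
rho = 0.8
seq = normalized_trace_inverse(lambda k: rho ** k, 8)

# Per the paper's result, the sequence is monotonically non-decreasing...
assert all(b >= a - 1e-12 for a, b in zip(seq, seq[1:]))

# ...and it is constant if and only if the process is white:
white = normalized_trace_inverse(lambda k: (k == 0).astype(float), 8)
assert max(white) - min(white) < 1e-12
```

For n = 2 this can be checked by hand: the inverse of [[1, ρ], [ρ, 1]] has diagonal entries 1/(1 − ρ²), so the normalized trace-inverse jumps from 1 to 1/(1 − ρ²) > 1, consistent with the non-decreasing behavior.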

