Monotonicity of the Trace-Inverse of Covariance Submatrices and Two-Sided Prediction
It is common to assess the "memory strength" of a stationary process by looking at how fast the normalized log-determinant of its covariance submatrices (i.e., the entropy rate) decreases. In this work, we propose an alternative characterization in terms of the normalized trace-inverse of the covariance submatrices. We show that this sequence is monotonically non-decreasing and is constant if and only if the process is white. Furthermore, while the entropy rate is associated with one-sided prediction errors (present from past), the new measure is associated with two-sided prediction errors (present from past and future). This measure can be used as an alternative to Burg's maximum-entropy principle for spectral estimation. We also propose a counterpart for non-stationary processes, obtained by looking at the average trace-inverse of subsets.
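The following is a minimal numerical sketch (not from the paper) of the quantity the abstract describes: for an assumed stationary AR(1)-type autocovariance r[k] = rho^|k|, it forms the leading n x n Toeplitz covariance submatrices K_n and prints the normalized trace-inverse tr(K_n^{-1})/n, which the abstract states is monotonically non-decreasing and constant exactly for a white process.

```python
import numpy as np

rho = 0.7   # assumed example correlation coefficient (hypothetical choice)
max_n = 10  # number of leading submatrices to examine

for n in range(1, max_n + 1):
    # Toeplitz covariance submatrix K_n with entries rho^|i-j|
    K = rho ** np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    # normalized trace-inverse of the n x n covariance submatrix
    val = np.trace(np.linalg.inv(K)) / n
    print(f"n={n:2d}  tr(K_n^-1)/n = {val:.6f}")
```

Setting rho = 0 (a white process) makes the printed sequence constant at 1, while any nonzero rho yields a strictly increasing sequence, consistent with the stated monotonicity.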