Generic Variance Bounds on Estimation and Prediction Errors in Time Series Analysis: An Entropy Perspective

04/09/2019
by   Song Fang, et al.

In this paper, we obtain generic bounds on the variances of estimation and prediction errors in time series analysis via an information-theoretic approach. In general, the error bounds are determined by the conditional entropy of the data point to be estimated or predicted, given the side information or past observations. Additionally, we show that the prediction error bounds are achieved asymptotically if and only if the "innovation" process is asymptotically white Gaussian. When restricted to Gaussian processes and 1-step prediction, our bounds reduce to the Kolmogorov-Szegő formula and the Wiener-Masani formula known from linear prediction theory.
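As a concrete illustration of the classical result the bounds reduce to (a sketch, not code from the paper): the Kolmogorov-Szegő formula states that for a stationary process with spectral density S(ω), the minimal 1-step prediction error variance is exp((1/2π)∫ log S(ω) dω). The snippet below checks this numerically for an AR(1) process, where the optimal predictor is a·X_{t-1} and the minimal error variance therefore equals the driving-noise variance; the parameter values are illustrative choices.

```python
import numpy as np

# Illustrative numerical check of the Kolmogorov-Szego formula
# (not code from the paper): the minimal 1-step prediction error
# variance equals exp( (1/(2*pi)) * integral_{-pi}^{pi} log S(w) dw ).
a = 0.7          # AR(1) coefficient, |a| < 1 (illustrative)
sigma2 = 2.0     # variance of the white driving noise W_t (illustrative)

# Spectral density of X_t = a*X_{t-1} + W_t:
#   S(w) = sigma2 / |1 - a * e^{-i w}|^2
w = np.linspace(-np.pi, np.pi, 200000, endpoint=False)
S = sigma2 / np.abs(1.0 - a * np.exp(-1j * w)) ** 2

# (1/(2*pi)) * integral of log S over [-pi, pi], approximated by the
# mean of log S on a uniform grid (the integrand is periodic).
err_var = np.exp(np.log(S).mean())

# For AR(1), the optimal 1-step predictor is a*X_{t-1}, so the minimal
# prediction error variance is exactly sigma2.
print(err_var)   # close to 2.0
```

The agreement follows from Jensen's formula: the integral of log|1 - a·e^{-iω}|² over [-π, π] vanishes for |a| < 1, leaving exp(log σ²) = σ².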
