Generic Bounds on the Maximum Deviations in Sequential Prediction: An Information-Theoretic Analysis

10/11/2019
by   Song Fang, et al.
In this paper, we derive generic bounds on the maximum deviations in prediction errors for sequential prediction via an information-theoretic approach. The fundamental bounds are shown to depend only on the conditional entropy of the data point to be predicted given the previous data points. In the asymptotic case, the bounds are achieved if and only if the prediction error is white and uniformly distributed.
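The tightness condition in the abstract can be illustrated with a small numerical check (our own sketch, not taken from the paper, and using simplified notation): among zero-mean errors supported on [-a, a], the uniform distribution maximizes differential entropy h, and the entropy-based quantity exp(h)/2 then recovers the maximum deviation a exactly, while any other bounded distribution gives a strictly smaller value.

```python
import math

# Illustration (our sketch): entropy-based lower estimates of the
# maximum deviation for two zero-mean errors supported on [-a, a].
a = 1.0

# Uniform on [-a, a]: differential entropy h = ln(2a),
# so exp(h)/2 equals the true maximum deviation a.
h_uniform = math.log(2 * a)
bound_uniform = math.exp(h_uniform) / 2   # -> 1.0, matches a

# Triangular on [-a, a] with peak at 0: h = ln(a) + 1/2,
# strictly smaller than the uniform's entropy,
# so exp(h)/2 falls strictly below the maximum deviation a.
h_triangular = math.log(a) + 0.5
bound_triangular = math.exp(h_triangular) / 2   # ~0.824 < a

print(bound_uniform)
print(bound_triangular)
```

This mirrors the abstract's claim that the bounds are achieved only by a uniformly distributed prediction error: equality holds for the maximum-entropy (uniform) case, and strict inequality otherwise.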
