Generic Bounds on the Maximum Deviations in Sequential Prediction: An Information-Theoretic Analysis

10/11/2019
by Song Fang, et al.

In this paper, we derive generic bounds on the maximum deviations of the prediction errors in sequential prediction via an information-theoretic approach. The fundamental bounds are shown to depend only on the conditional entropy of the data point to be predicted given the previous data points. In the asymptotic case, the bounds are achieved if and only if the prediction error is white and uniformly distributed.
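
To make the flavor of the result concrete, here is a minimal sketch of the type of bound described; the symbols below (e_k, x_k, \hat{x}_k) are assumptions for illustration and are not stated verbatim in the abstract. Let $e_k = x_k - \hat{x}_k$ denote the prediction error, where the predictor $\hat{x}_k$ may be any causal function of the previous data points $x_1, \dots, x_{k-1}$, and let $h(\cdot \mid \cdot)$ denote conditional differential entropy in bits. Since among all random variables supported on $[-M, M]$ the uniform distribution maximizes differential entropy ($h \le \log_2 2M$), and since conditioning cannot increase entropy, one obtains

\[
\operatorname{ess\,sup} |e_k| \;\ge\; \frac{1}{2}\, 2^{\,h(e_k)} \;\ge\; \frac{1}{2}\, 2^{\,h(x_k \mid x_1, \dots, x_{k-1})},
\]

where the second inequality uses $h(e_k) \ge h(e_k \mid x_1, \dots, x_{k-1}) = h(x_k \mid x_1, \dots, x_{k-1})$, the equality holding because $\hat{x}_k$ is determined by the past data. Both inequalities are tight exactly when $e_k$ is uniformly distributed and independent of the past, which is consistent with the abstract's characterization that the bounds are achieved if and only if the prediction error is white and uniform.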

Related research

04/09/2019
Generic Variance Bounds on Estimation and Prediction Errors in Time Series Analysis: An Entropy Perspective
In this paper, we obtain generic bounds on the variances of estimation a...

12/03/2019
Fundamental Limitations in Sequential Prediction and Recursive Algorithms: L_p Bounds via an Entropic Analysis
In this paper, we obtain fundamental L_p bounds in sequential prediction...

12/25/2018
Second-Order Converses via Reverse Hypercontractivity
A strong converse shows that no procedure can beat the asymptotic (as bl...

12/22/2020
Fundamental Limits on the Maximum Deviations in Control Systems: How Short Can Distribution Tails be Made by Feedback?
In this paper, we adopt an information-theoretic approach to investigate...

01/12/2020
Fundamental Limits of Online Learning: An Entropic-Innovations Viewpoint
In this paper, we examine the fundamental performance limitations of onl...

12/07/2017
How consistent is my model with the data? Information-Theoretic Model Check
The choice of model class is fundamental in statistical learning and sys...

12/07/2017
Is My Model Flexible Enough? Information-Theoretic Model Check
The choice of model class is fundamental in statistical learning and sys...
