Algorithmic Complexity Bounds on Future Prediction Errors

01/19/2007
by A. Chernov, et al.

We bound the future loss when predicting any (computably) stochastic sequence online. Solomonoff finitely bounded the total deviation of his universal predictor M from the true distribution mu by the algorithmic complexity of mu. Here we assume that we are at a time t>1 and have already observed x = x_1...x_t. We bound the future prediction performance on x_{t+1} x_{t+2} ... by a new variant of the algorithmic complexity of mu given x, plus the complexity of the randomness deficiency of x. The new complexity is monotone in its condition in the sense that it can only decrease if the condition is prolonged. We also briefly discuss potential generalizations to Bayesian model classes and to classification problems.
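For context, the classical total bound of Solomonoff that the abstract refers to can be sketched as follows. This is a hedged reconstruction in standard notation (M the universal a-priori semimeasure, mu a computable measure, K(mu) the prefix complexity of a program computing mu); the exact constant should be checked against the paper:

```latex
% Solomonoff's total prediction-error bound (sketch):
% the mu-expected sum of squared deviations of M from mu over all time steps
% is finite and bounded by the complexity of mu.
\sum_{t=1}^{\infty} \mathbf{E}_\mu\!\left[ \sum_{a} \bigl( M(a \mid x_{<t}) - \mu(a \mid x_{<t}) \bigr)^{2} \right]
\;\le\; \tfrac{\ln 2}{2}\, K(\mu) \;<\; \infty
```

The contribution described in the abstract is to replace the unconditional K(mu) on the right-hand side, for predictions made after observing a prefix x, by a monotone conditional complexity of mu given x plus the complexity of the randomness deficiency of x, so that the bound can only decrease as the observed condition is prolonged.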


