Concentration and Confidence for Discrete Bayesian Sequence Predictors

06/29/2013
by Tor Lattimore, et al.

Bayesian sequence prediction is a simple technique for predicting future symbols sampled from an unknown measure on infinite sequences over a countable alphabet. While strong bounds on the expected cumulative error are known, there are only limited results on the distribution of this error. We prove tight high-probability bounds on the cumulative error, measured in terms of the Kullback-Leibler (KL) divergence. We also consider the problem of constructing upper confidence bounds on the KL and Hellinger errors, analogous to those obtained from Hoeffding-like bounds in the i.i.d. case. The new results are applied to show that Bayesian sequence prediction can be used in the Knows What It Knows (KWIK) framework with bounds that match the state-of-the-art.
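For intuition, the following is a minimal sketch (not taken from the paper) of Bayesian sequence prediction over a finite class of Bernoulli environments, tracking the cumulative KL error the abstract refers to. The class, prior, true parameter, and horizon are all illustrative assumptions standing in for the paper's countable class.

```python
# A minimal sketch, assuming a finite class of Bernoulli environments as a stand-in
# for the countable class in the paper; the class, prior, true parameter and horizon
# below are illustrative choices, not taken from the paper.
import numpy as np

rng = np.random.default_rng(0)

# Hypothesis class: Bernoulli(theta) measures over the binary alphabet {0, 1}.
thetas = np.array([0.1, 0.3, 0.5, 0.7, 0.9])
prior = np.full(len(thetas), 1.0 / len(thetas))  # uniform prior w

true_theta = 0.7   # the unknown data-generating measure mu (assumed to lie in the class)
T = 200            # prediction horizon

def bernoulli_kl(p, q):
    """KL divergence between Bernoulli(p) and Bernoulli(q), for 0 < p, q < 1."""
    return p * np.log(p / q) + (1.0 - p) * np.log((1.0 - p) / (1.0 - q))

posterior = prior.copy()
cumulative_kl = 0.0

for t in range(T):
    # Bayes-mixture probability that the next symbol is 1: rho(1 | x_{<t}).
    rho_one = float(posterior @ thetas)

    # Instantaneous error: KL between the true and predicted next-symbol distributions.
    cumulative_kl += bernoulli_kl(true_theta, rho_one)

    # Sample the next symbol from the true measure and update the posterior by Bayes' rule.
    x = rng.binomial(1, true_theta)
    likelihood = thetas if x == 1 else 1.0 - thetas
    posterior = posterior * likelihood
    posterior /= posterior.sum()

# In expectation the cumulative KL error is at most ln(1/w(mu)) = ln(5) here; the
# paper's contribution is high-probability (not just expected) bounds of this kind.
print(f"cumulative KL error after {T} steps: {cumulative_kl:.3f}, "
      f"ln(1/w(mu)) = {np.log(len(thetas)):.3f}")
```

The same loop structure extends to any countable class by replacing the Bernoulli likelihoods with the class's conditional probabilities for the observed symbol.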
