Sequential prediction under log-loss with side information

02/13/2021
by Alankrita Bhatt, et al.

The problem of online prediction with sequential side information under logarithmic loss is studied, and general upper and lower bounds on the minimax regret incurred by the predictor are established. The upper bounds are obtained by proposing and analyzing a probability assignment inspired by mixture probability assignments in universal compression, while the lower bounds are obtained by way of a redundancy-capacity theorem. A tight characterization of the regret is provided in some special settings.
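
To give a concrete feel for the mixture idea from universal compression, here is a minimal sketch, not the paper's construction: a Krichevsky-Trofimov (add-1/2) mixture estimator maintained separately for each value of a finite side-information alphabet, used as a sequential probability assignment under log-loss. The class name KTSidePredictor and its methods are hypothetical names introduced only for this illustration.

```python
# Illustrative sketch of a mixture probability assignment with side information:
# one Krichevsky-Trofimov (Dirichlet(1/2,1/2) mixture) estimator per side-information value.
import math
from collections import defaultdict


class KTSidePredictor:
    """Sequential probability assignment P(x_t = 1 | side) via per-context KT mixtures."""

    def __init__(self):
        # counts[side] = [# zeros, # ones] observed so far with that side information
        self.counts = defaultdict(lambda: [0, 0])

    def predict(self, side):
        """Return the assigned probability that the next symbol is 1 given `side`."""
        n0, n1 = self.counts[side]
        return (n1 + 0.5) / (n0 + n1 + 1.0)

    def update(self, side, x):
        """Record the realized symbol x (0 or 1) observed with side information `side`."""
        self.counts[side][x] += 1


# Usage: accumulate the log-loss incurred by the sequential predictions.
if __name__ == "__main__":
    predictor = KTSidePredictor()
    data = [(0, 1), (0, 1), (1, 0), (0, 1), (1, 0)]  # (side, symbol) pairs
    log_loss = 0.0
    for side, x in data:
        p1 = predictor.predict(side)
        p = p1 if x == 1 else 1.0 - p1
        log_loss += -math.log(p)
        predictor.update(side, x)
    print(f"cumulative log-loss: {log_loss:.4f}")
```

The per-context KT estimator is a standard mixture predictor whose cumulative log-loss exceeds that of the best constant predictor per context by only a logarithmic amount; the paper's setting is more general, and its bounds do not follow from this toy example.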


Related research

research · 05/07/2022
Precise Regret Bounds for Log-loss via a Truncated Bayesian Algorithm
We study the sequential general online regression, known also as the seq...

research · 09/09/2022
Expected Worst Case Regret via Stochastic Sequential Covering
We study the problem of sequential prediction and online minimax regret ...

research · 01/29/2015
Sequential Probability Assignment with Binary Alphabets and Large Classes of Experts
We analyze the problem of sequential probability assignment for binary o...

research · 02/07/2020
Logistic Regression Regret: What's the Catch?
We address the problem of the achievable regret rates with online logist...

research · 10/27/2015
Online Learning with Gaussian Payoffs and Side Observations
We consider a sequential learning problem with Gaussian payoffs and side...

research · 10/09/2018
Adaptive Minimax Regret against Smooth Logarithmic Losses over High-Dimensional ℓ_1-Balls via Envelope Complexity
We develop a new theoretical framework, the envelope complexity, to anal...

research · 03/08/2023
Smoothed Analysis of Sequential Probability Assignment
We initiate the study of smoothed analysis for the sequential probabilit...