Exp-Concavity of Proper Composite Losses

05/20/2018
by Parameswaran Kamalaruban, et al.

The goal of online prediction with expert advice is to find a decision strategy that performs almost as well as the best expert in a given pool, on any sequence of outcomes. This problem has been widely studied, and O(√T) and O(log T) regret bounds can be achieved for convex losses (zinkevich2003online) and for strictly convex losses with bounded first and second derivatives (hazan2007logarithmic), respectively. In special cases, such as the Aggregating Algorithm (vovk1995game) with mixable losses and the Weighted Average Algorithm (kivinen1999averaging) with exp-concave losses, it is possible to achieve O(1) regret bounds. van2012exp argued that mixability and exp-concavity are roughly equivalent under certain conditions. Thus, by understanding the underlying relationship between these two notions, we can obtain the best of both algorithms: the strong theoretical performance guarantees of the Aggregating Algorithm and the computational efficiency of the Weighted Average Algorithm. In this paper we provide a complete characterization of the exp-concavity of any proper composite loss. Using this characterization and the mixability condition of proper losses (van2012mixability), we show that it is possible to transform (re-parameterize) any β-mixable binary proper loss into a β-exp-concave composite loss with the same β. In the multi-class case, we propose an approximation approach for this transformation.
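To make the central definition concrete: a loss ℓ is β-exp-concave when p ↦ exp(−β ℓ(p)) is concave in the prediction p. The sketch below (an illustrative check, not code from the paper) verifies this numerically for the log loss ℓ(p) = −ln p, for which exp(−β ℓ(p)) = p^β is concave exactly when β ≤ 1, so log loss is 1-exp-concave.

```python
import numpy as np

# A loss l is beta-exp-concave iff p -> exp(-beta * l(p)) is concave.
# For the log loss l(p) = -ln(p) (outcome 1, predicted probability p),
# exp(-beta * l(p)) = p**beta, which is concave precisely when beta <= 1.

def is_midpoint_concave(f, xs, ys):
    """Midpoint-concavity check on sample pairs (a necessary condition
    for concavity of a continuous function)."""
    return bool(np.all(f((xs + ys) / 2) >= (f(xs) + f(ys)) / 2 - 1e-12))

rng = np.random.default_rng(0)
xs = rng.uniform(0.01, 0.99, 1000)
ys = rng.uniform(0.01, 0.99, 1000)

log_loss = lambda p: -np.log(p)

for beta in (0.5, 1.0, 2.0):
    g = lambda p, b=beta: np.exp(-b * log_loss(p))  # equals p**b
    print(beta, is_midpoint_concave(g, xs, ys))
# beta = 0.5 and 1.0 pass; beta = 2.0 fails, since p**2 is convex
```

The same numerical test can be applied to any candidate (loss, β) pair; for the paper's binary proper losses the transformation it constructs is exactly what restores this concavity at the mixability constant β.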


Related research

- On loss functions and regret bounds for multi-category classification (05/17/2020)
- Composite Binary Losses (12/17/2009)
- Fast rates in statistical and online learning (07/09/2015)
- Generalized Mixability Constant Regret, Generalized Mixability, and Mirror Descent (02/20/2018)
- Generalised Mixability, Constant Regret, and Bayesian Updating (03/10/2014)
- The Convexity and Design of Composite Multiclass Losses (06/18/2012)
- On the error of best polynomial approximation of composite functions (08/11/2023)
