Mixability made efficient: Fast online multiclass logistic regression

10/08/2021
by Rémi Jézéquel et al.

Mixability has been shown to be a powerful tool to obtain algorithms with optimal regret. However, the resulting methods often suffer from high computational complexity, which has reduced their practical applicability. For example, in the case of multiclass logistic regression, the aggregating forecaster (Foster et al., 2018) achieves a regret of O(log(Bn)), whereas Online Newton Step achieves O(e^B log(n)); the aggregating forecaster thus obtains a doubly exponential gain in B (a bound on the norm of the comparator functions). This high statistical performance, however, comes at the price of a prohibitive computational complexity of O(n^37).
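
To make the setting concrete, here is a minimal sketch of online multiclass logistic regression using projected online gradient descent as the learner. This is a simple baseline for illustration only, not the paper's aggregating forecaster or Online Newton Step; the class name OGDMulticlassLogistic, the learning rate lr, and the use of B as a Frobenius-norm bound on the weights are assumptions made for this example.

```python
# Minimal sketch (illustrative baseline, not the paper's method):
# online multiclass logistic regression via projected online gradient descent.
import numpy as np

def softmax(z):
    z = z - z.max()          # shift for numerical stability
    e = np.exp(z)
    return e / e.sum()

class OGDMulticlassLogistic:
    def __init__(self, d, k, B=1.0, lr=0.1):
        self.W = np.zeros((k, d))   # one weight row per class
        self.B = B                  # norm bound on the comparator (assumed)
        self.lr = lr                # step size (assumed)

    def predict_proba(self, x):
        return softmax(self.W @ x)

    def update(self, x, y):
        """One online round: predict on x, then suffer log loss on label y."""
        p = self.predict_proba(x)
        loss = -np.log(p[y])
        grad = np.outer(p, x)       # gradient of the multiclass log loss:
        grad[y] -= x                # (p - e_y) x^T
        self.W -= self.lr * grad
        # project back onto the Frobenius-norm ball of radius B
        norm = np.linalg.norm(self.W)
        if norm > self.B:
            self.W *= self.B / norm
        return loss

# Usage: feed one (x, y) pair per round; the cumulative returned loss,
# minus the loss of the best fixed W with ||W|| <= B, is the regret
# that the bounds quoted in the abstract refer to.
learner = OGDMulticlassLogistic(d=5, k=3)
x, y = np.random.randn(5), 1
print(learner.update(x, y))   # per-round log loss
```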

Related research

03/18/2020 · Efficient improper learning for online logistic regression
We consider the setting of online logistic regression and consider the r...

02/26/2019 · Logarithmic Regret for parameter-free Online Logistic Regression
We consider online optimization procedures in the context of logistic re...

10/06/2021 · Efficient Methods for Online Multiclass Logistic Regression
Multiclass logistic regression is a fundamental task in machine learning...

01/06/2022 · Jointly Efficient and Optimal Algorithms for Logistic Bandits
Logistic Bandits have recently undergone careful scrutiny by virtue of t...

02/11/2022 · Scale-free Unconstrained Online Learning for Curved Losses
A sequence of works in unconstrained online convex optimisation have inv...

06/14/2023 · Nearly Optimal Algorithms with Sublinear Computational Complexity for Online Kernel Regression
The trade-off between regret and computational cost is a fundamental pro...

03/25/2018 · Logistic Regression: The Importance of Being Improper
Learning linear predictors with the logistic loss---both in stochastic a...
