Open Problem: Is There an Online Learning Algorithm That Learns Whenever Online Learning Is Possible?

07/20/2021
by Steve Hanneke et al.

This open problem asks whether there exists an online learning algorithm for binary classification that is guaranteed to make a sublinear number of mistakes for every target concept, under the sole assumption that the (possibly random) sequence of points X admits the existence of such a learning algorithm for that sequence. As a secondary question, it asks whether a specific concise condition completely determines which (possibly random) sequences of points X admit an online learning algorithm guaranteeing a sublinear number of mistakes for all target concepts.
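To make the setting concrete, here is a minimal sketch (not from the paper) of the online binary classification protocol the problem refers to: at each round the learner observes a point x_t, predicts a label, then sees the true label assigned by the target concept, and a mistake is counted whenever the two disagree. "Sublinear mistakes" means the cumulative mistake count grows as o(T). The memorize-and-guess learner and the function names below are hypothetical illustrations only, not the algorithm the open problem asks for.

```python
# Sketch of the online binary classification protocol (assumed setup, not the
# paper's algorithm): predict, then observe the true label, then count mistakes.

from typing import Callable, Hashable, Iterable


def run_online_protocol(
    points: Iterable[Hashable],
    target: Callable[[Hashable], int],
) -> list[int]:
    """Run the protocol with a naive memorizing learner; return cumulative mistakes."""
    seen: dict[Hashable, int] = {}   # labels revealed on earlier rounds
    mistakes = 0
    cumulative = []
    for x in points:
        prediction = seen.get(x, 0)  # default guess for points never seen before
        truth = target(x)            # true label c(x), revealed only after predicting
        if prediction != truth:
            mistakes += 1
        seen[x] = truth
        cumulative.append(mistakes)
    return cumulative


if __name__ == "__main__":
    # Toy sequence that keeps revisiting the same 10 points.
    sequence = [t % 10 for t in range(1000)]
    counts = run_online_protocol(sequence, target=lambda x: x % 2)
    T = len(counts)
    print(f"mistakes after T={T} rounds: {counts[-1]} (rate {counts[-1] / T:.3f})")
```

On a sequence that revisits only finitely many distinct points, as in this toy example, even the naive memorizing learner makes a bounded, hence sublinear, number of mistakes for any target concept. The open problem asks for a single algorithm that achieves sublinear mistakes on every sequence for which some algorithm can.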


Related research

- Simple online learning with consistency oracle (08/15/2023): We consider online learning in the model where a learning algorithm can ...
- Online-to-PAC Conversions: Generalization Bounds via Regret Analysis (05/31/2023): We present a new framework for deriving bounds on the generalization bou...
- Anytime Online-to-Batch Conversions, Optimism, and Acceleration (03/03/2019): A standard way to obtain convergence guarantees in stochastic convex opt...
- Online Passive-Aggressive Total-Error-Rate Minimization (02/05/2020): We provide a new online learning algorithm which utilizes online passive...
- On the Intrinsic Limits to Representationally-Adaptive Machine-Learning (03/09/2015): Online learning is a familiar problem setting within Machine-Learning in...
- SpCoSLAM 2.0: An Improved and Scalable Online Learning of Spatial Concepts and Language Models with Mapping (03/09/2018): In this paper, we propose a novel online learning algorithm, SpCoSLAM 2....
- Rapid Learning with Stochastic Focus of Attention (05/02/2011): We present a method to stop the evaluation of a decision making process ...
