Exponential Savings in Agnostic Active Learning through Abstention

01/31/2021
by Nikita Puchkin, et al.

We show that in pool-based active classification without assumptions on the underlying distribution, if the learner is given the power to abstain from some predictions at a price marginally smaller than the expected loss of 1/2 incurred by a random guess, then exponential savings in the number of label requests are possible whenever they are possible in the corresponding realizable problem. We extend this result to provide a necessary and sufficient condition for exponential savings in pool-based active classification under model misspecification.
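
As a point of reference for the statement above, the abstention option with a price marginally below 1/2 can be written as a Chow-style loss. The following is a minimal sketch under assumed notation (binary label Y in {0,1}, a small margin gamma > 0); it is an illustration, not a verbatim definition from the paper:

\[
\ell\bigl(\hat{y}(X), Y\bigr) =
\begin{cases}
\mathbf{1}\{\hat{y}(X) \neq Y\}, & \hat{y}(X) \in \{0, 1\}, \\
\tfrac{1}{2} - \gamma, & \hat{y}(X) = \text{abstain},
\end{cases}
\qquad \gamma > 0 \text{ small}.
\]

Under such a loss, abstaining is only marginally cheaper than the expected loss of 1/2 achieved by predicting a label uniformly at random, which is the sense in which the abstract's condition should be read.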
