
Exponential Savings in Agnostic Active Learning through Abstention

by Nikita Puchkin et al.

We show that in pool-based active classification without assumptions on the underlying distribution, if the learner is allowed to abstain from some predictions at a price marginally smaller than 1/2, the expected loss of a random guess, then exponential savings in the number of label requests are possible whenever they are possible in the corresponding realizable problem. We extend this result to a necessary and sufficient condition for exponential savings in pool-based active classification under model misspecification.
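To make the setting concrete, here is a minimal sketch, not the paper's algorithm, of a Chow-style abstention rule in the regime the abstract describes: the learner predicts when its estimate of P(Y=1|x) is confident and otherwise abstains, paying 1/2 - epsilon, marginally below the expected loss of a random guess. All function names, thresholds, and parameter values below are illustrative assumptions.

```python
# Illustrative sketch of prediction-with-abstention (not the paper's method).
# The learner pays 1/2 - epsilon for abstaining, which caps its loss in the
# uncertain region just below the 1/2 expected loss of a random guess.

def abstaining_risk(eta_hat, eta_true, epsilon=0.01, margin=0.1):
    """Expected 0-1 loss of an abstaining classifier at one point.

    eta_hat:  the learner's estimate of P(Y = 1 | x)
    eta_true: the true conditional probability P(Y = 1 | x)
    epsilon:  discount below 1/2 granted for abstaining
    margin:   confidence band around 1/2 inside which the learner abstains
    """
    if abs(eta_hat - 0.5) <= margin:
        return 0.5 - epsilon      # abstain: pay marginally less than 1/2
    if eta_hat > 0.5:
        return 1.0 - eta_true     # predict 1: wrong with prob. 1 - eta_true
    return eta_true               # predict 0: wrong with prob. eta_true

# Confident and correct estimate: small risk.
low = abstaining_risk(eta_hat=0.9, eta_true=0.9)
# Near the decision boundary: abstention bounds the risk by 1/2 - epsilon.
mid = abstaining_risk(eta_hat=0.52, eta_true=0.5)
```

The point of the discount is visible in `mid`: near the boundary a random guess would incur expected loss exactly 1/2, while abstaining costs strictly less, which is what allows the exponential savings in label requests.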



