Activized Learning: Transforming Passive to Active with Improved Label Complexity

08/08/2011
by Steve Hanneke, et al.

We study the theoretical advantages of active learning over passive learning. Specifically, we prove that, in noise-free classifier learning for VC classes, any passive learning algorithm can be transformed into an active learning algorithm with asymptotically strictly superior label complexity for all nontrivial target functions and distributions. We further provide a general characterization of the magnitudes of these improvements in terms of a novel generalization of the disagreement coefficient. We also extend these results to active learning in the presence of label noise, and find that even under broad classes of noise distributions, we can typically guarantee strict improvements over the known results for passive learning.
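The label-complexity improvements described above are typically obtained by disagreement-based querying: the learner maintains a version space of classifiers consistent with the labels seen so far and requests a label only when the consistent classifiers disagree on a point. The following is a minimal sketch of that idea (not the paper's activizer itself), assuming the toy setting of 1-D threshold classifiers h_t(x) = 1[x >= t], where the version space is an interval and the region of disagreement is exactly that interval; the names `active_thresholds`, `stream`, and `oracle` are illustrative, not from the paper.

```python
import random

# Sketch of disagreement-based active learning for 1-D thresholds.
# Version space after consistent labels: thresholds t in (lo, hi].
# A label is requested only for points inside the region of
# disagreement (lo, hi]; outside it, all consistent classifiers
# agree, so the label is inferred for free.

def active_thresholds(stream, oracle):
    """Process unlabeled points, querying the labeling oracle only
    when the version space disagrees.
    Returns (estimated threshold, number of labels requested)."""
    lo, hi = 0.0, 1.0          # initial version space: t in (0, 1]
    queries = 0
    for x in stream:
        if lo < x <= hi:       # x lies in the region of disagreement
            queries += 1
            if oracle(x) == 1: # label 1  =>  true threshold <= x
                hi = x
            else:              # label 0  =>  true threshold > x
                lo = x
        # else: label of x is determined by the version space; no query
    return (lo + hi) / 2, queries

# Noise-free demonstration with a hypothetical target threshold.
random.seed(0)
true_t = 0.37
oracle = lambda x: 1 if x >= true_t else 0
stream = [random.random() for _ in range(1000)]
t_hat, n_labels = active_thresholds(stream, oracle)
print(t_hat, n_labels)
```

In this noise-free setting the number of queries grows only logarithmically in the stream length, whereas a passive learner would need labels for every point to reach the same accuracy; the abstract's result is that improvements of this flavor can be extracted from any passive algorithm for any VC class, with magnitudes governed by a generalization of the disagreement coefficient.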


Related research

02/10/2021: Improved Algorithms for Efficient Active Learning Halfspaces with Massart and Tsybakov noise
We develop a computationally-efficient PAC active learning algorithm for...

03/31/2022: Efficient Active Learning with Abstention
The goal of active learning is to achieve the same accuracy achievable b...

10/03/2014: Minimax Analysis of Active Learning
This work establishes distribution-free upper and lower bounds on the mi...

07/16/2012: Surrogate Losses in Passive and Active Learning
Active learning is a type of sequential design for supervised machine le...

10/22/2014: Active Regression by Stratification
We propose a new active learning algorithm for parametric linear regress...

05/15/2015: An Analysis of Active Learning With Uniform Feature Noise
In active learning, the user sequentially chooses values for feature X a...

03/03/2017: Active Learning for Cost-Sensitive Classification
We design an active learning algorithm for cost-sensitive multiclass cla...
