Active Learning under Label Shift
Distribution shift poses a challenge for active data collection in the real world. We address active learning under label shift and propose ALLS, the first framework for this setting. ALLS builds on label shift estimation techniques, correcting for the shift with a balance of importance weighting and class-balanced sampling. We show a bias-variance trade-off between these two techniques and prove error and sample complexity bounds for a disagreement-based algorithm under ALLS. Experiments across a range of label shift settings demonstrate that ALLS consistently improves performance, often reducing sample complexity by more than half an order of magnitude. Ablation studies corroborate the bias-variance trade-off revealed by our theory.
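The abstract does not specify which label shift estimator ALLS uses, but a standard choice in this literature is black-box shift estimation (BBSE), which recovers the per-class importance weights p_target(y)/p_source(y) by inverting a confusion matrix. The sketch below illustrates that idea only; the function name and interface are hypothetical, not taken from the paper.

```python
import numpy as np

def estimate_label_shift_weights(y_val, y_val_pred, y_target_pred, n_classes):
    """BBSE-style estimate of importance weights w[y] ~ p_target(y) / p_source(y).

    Solves C w = mu_t, where C is the classifier's joint confusion matrix on
    labeled source data and mu_t is the predicted label distribution on the
    unlabeled target pool. (Illustrative sketch, not the ALLS implementation.)
    """
    # C[i, j] = P(predict i, true label j), estimated on source validation data.
    C = np.zeros((n_classes, n_classes))
    for yp, yt in zip(y_val_pred, y_val):
        C[yp, yt] += 1.0
    C /= len(y_val)
    # Predicted label distribution on the target pool.
    mu_t = np.bincount(y_target_pred, minlength=n_classes) / len(y_target_pred)
    # Under label shift and an invertible C, the true weights satisfy C w = mu_t.
    w = np.linalg.solve(C, mu_t)
    return np.clip(w, 0.0, None)  # importance weights must be non-negative
```

With a perfect classifier, C is diagonal with the source class marginals, so the recovered weights are exactly the ratio of target to source class frequencies; the importance-weighted loss (or a class-balanced sampler biased by these weights) then corrects for the shift.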