Active Learning Through a Covering Lens

05/23/2022
by Ofer Yehuda, et al.

Deep active learning aims to reduce the annotation cost for deep neural networks, which are notoriously data-hungry. Until recently, deep active learning methods struggled in the low-budget regime, where only a small number of samples are annotated. The situation has been alleviated by recent advances in self-supervised representation learning methods, which impart the geometry of the data representation with rich information about the points. Taking advantage of this progress, we study the problem of subset selection for annotation through a "covering" lens, proposing ProbCover, a new active learning algorithm for the low-budget regime that seeks to maximize Probability Coverage. We describe a dual way to view our formulation, from which one can derive strategies suitable for the high-budget regime of active learning, related to existing methods like Coreset. We conclude with extensive experiments evaluating ProbCover in the low-budget regime. We show that our principled active learning strategy improves the state of the art in the low-budget regime on several image recognition benchmarks. This method is especially beneficial in semi-supervised settings, allowing state-of-the-art semi-supervised methods to achieve high accuracy with only a few labels.
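
The abstract does not spell out the selection rule, but a probability-coverage objective of this kind is naturally maximized greedily. The sketch below illustrates one such greedy max-coverage selection over δ-balls in a self-supervised embedding space; the function name `greedy_prob_cover`, the radius `delta`, and the ball-counting rule are illustrative assumptions, not the paper's exact algorithm.

```python
import numpy as np

def greedy_prob_cover(embeddings, budget, delta, labeled_idx=()):
    """Greedily pick points whose delta-balls cover the most
    still-uncovered points in the embedding space.

    Minimal sketch of a max-coverage selection rule; `delta` and the
    greedy order are assumptions for illustration only.
    """
    # Normalize embeddings (e.g., from a self-supervised encoder).
    x = embeddings / np.linalg.norm(embeddings, axis=1, keepdims=True)

    # Pairwise "covers" relation: point j is covered by point i
    # if it lies inside the delta-ball around i.
    dists = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)
    covers = dists <= delta

    covered = np.zeros(len(x), dtype=bool)
    for i in labeled_idx:          # already-labeled points cover their own balls
        covered |= covers[i]

    selected = []
    for _ in range(budget):
        # Number of still-uncovered points each candidate would newly cover.
        gains = (covers & ~covered).sum(axis=1)
        best = int(np.argmax(gains))
        selected.append(best)
        covered |= covers[best]
    return selected
```

In practice the embeddings would come from a pretrained self-supervised model (e.g., SimCLR-style features), and the selected indices are the points sent for annotation; the O(n²) distance matrix here is only for clarity and would be replaced by a nearest-neighbor structure at scale.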
