Optimizing Active Learning for Low Annotation Budgets

01/18/2022
by Umang Aggarwal, et al.

When a large amount of annotated data cannot be assumed, active learning is a good strategy. It consists in training a model on a small amount of annotated data (the annotation budget) and choosing the best set of points to annotate in order to improve the previous model and generalize better. In deep learning, active learning is usually implemented as an iterative process in which successive deep models are updated via fine-tuning, but this still poses some issues. First, the initial batch of annotated images has to be large enough to train a deep model. This assumption is strong, especially when the total annotation budget is small. We tackle this issue with an approach inspired by transfer learning: a pre-trained model is used as a feature extractor, and only shallow classifiers are learned during the active learning iterations. The second issue is the reliability of the probability or feature estimates produced by early models for the active learning task. Samples are generally selected for annotation using acquisition functions based only on the last learned model. We introduce a novel acquisition function that exploits the iterative nature of the active learning process to select samples in a more robust fashion: samples whose predictions shift most towards uncertainty between the last two learned models are favored. A diversification step then selects samples from different regions of the classification space, introducing a representativeness component into our approach. Our method is evaluated against competitive baselines on three balanced and imbalanced datasets and outperforms them.
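The selection criterion described above can be sketched in a few lines. The snippet below is an illustrative reading of the abstract, not the authors' implementation: uncertainty is taken as one minus the top class probability, samples are ranked by how much their uncertainty increased between the previous and current model, and a simple distance-based filter (with a hypothetical threshold) stands in for the diversification step.

```python
import numpy as np

def uncertainty(probs):
    """Uncertainty as 1 minus the top predicted class probability."""
    return 1.0 - probs.max(axis=1)

def uncertainty_shift_scores(probs_prev, probs_curr):
    """Score each unlabeled sample by how much its uncertainty increased
    between the previous and the current model (larger shift = preferred)."""
    return uncertainty(probs_curr) - uncertainty(probs_prev)

def select_batch(probs_prev, probs_curr, features, budget, min_dist=0.1):
    """Pick `budget` samples: rank by uncertainty shift, then diversify by
    skipping candidates too close (in feature space) to an already chosen one.
    `min_dist` is a hypothetical similarity threshold for illustration."""
    scores = uncertainty_shift_scores(probs_prev, probs_curr)
    order = np.argsort(-scores)  # largest shift towards uncertainty first
    chosen = []
    for idx in order:
        if len(chosen) == budget:
            break
        if chosen:
            d = np.linalg.norm(features[chosen] - features[idx], axis=1)
            if d.min() < min_dist:
                continue  # diversification: region already represented
        chosen.append(idx)
    return np.array(chosen)
```

For example, a sample whose top probability drops from 0.6 to 0.5 between iterations gains an uncertainty shift of 0.1 and is preferred over samples whose predictions did not move.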
