
SIMILAR: Submodular Information Measures Based Active Learning In Realistic Scenarios

by Suraj Kothawade, et al.

Active learning has proven to be useful for minimizing labeling costs by selecting the most informative samples. However, existing active learning methods do not work well in realistic scenarios such as imbalanced or rare classes, out-of-distribution data in the unlabeled set, and redundancy. In this work, we propose SIMILAR (Submodular Information Measures based actIve LeARning), a unified active learning framework using recently proposed submodular information measures (SIM) as acquisition functions. We argue that SIMILAR not only works in standard active learning, but also easily extends to the realistic settings considered above and acts as a one-stop solution for active learning that is scalable to large real-world datasets. Empirically, we show that SIMILAR significantly outperforms existing active learning algorithms by as much as ~5% in the case of out-of-distribution data on several image classification tasks like CIFAR-10, MNIST, and ImageNet.
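To make the acquisition-function idea concrete, here is a minimal sketch of greedy batch selection with a facility-location function, one of the submodular functions underlying the submodular information measures the abstract refers to. The function name, similarity construction, and budget are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch: greedy maximization of the facility-location
# function f(S) = sum_i max_{j in S} sim[i, j] over an unlabeled pool.
# This is an illustrative stand-in for a SIM acquisition function.
import numpy as np

def facility_location_greedy(similarity, budget):
    """Greedily pick `budget` pool indices maximizing facility location.

    similarity: (n, n) pairwise similarity matrix over the unlabeled pool.
    """
    n = similarity.shape[0]
    selected = []
    # best similarity of each point to the current selection (starts at 0)
    best_sim = np.zeros(n)
    for _ in range(budget):
        # marginal gain of adding candidate j:
        # sum_i max(sim[i, j], best_sim[i]) - sum_i best_sim[i]
        gains = np.maximum(similarity, best_sim[:, None]).sum(axis=0) - best_sim.sum()
        gains[selected] = -np.inf  # never re-pick a selected point
        j = int(np.argmax(gains))
        selected.append(j)
        best_sim = np.maximum(best_sim, similarity[:, j])
    return selected

# Toy usage: cosine similarities from random feature embeddings
rng = np.random.default_rng(0)
feats = rng.normal(size=(5, 8))
feats /= np.linalg.norm(feats, axis=1, keepdims=True)
sim = feats @ feats.T
print(facility_location_greedy(sim, budget=2))
```

Because facility location is monotone submodular, this simple greedy loop enjoys the standard (1 - 1/e) approximation guarantee, which is what makes such measures attractive as scalable acquisition functions.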


Active Data Discovery: Mining Unknown Data using Submodular Information Measures

Active Learning is a very common yet powerful framework for iteratively ...

Class-Balanced Active Learning for Image Classification

Active learning aims to reduce the labeling effort that is required to t...

CLINICAL: Targeted Active Learning for Imbalanced Medical Image Classification

Training deep learning models on medical datasets that perform well for ...

Single Shot Active Learning using Pseudo Annotators

Standard myopic active learning assumes that human annotations are alway...

Active Learning for Deep Neural Networks on Edge Devices

When dealing with deep neural network (DNN) applications on edge devices...

VaB-AL: Incorporating Class Imbalance and Difficulty with Variational Bayes for Active Learning

Active Learning for discriminative models has largely been studied with ...

DIAGNOSE: Avoiding Out-of-distribution Data using Submodular Information Measures

Avoiding out-of-distribution (OOD) data is critical for training supervi...