SIMILAR: Submodular Information Measures Based Active Learning In Realistic Scenarios

07/01/2021 ∙ by Suraj Kothawade, et al.

Active learning has proven useful for minimizing labeling costs by selecting the most informative samples. However, existing active learning methods do not work well in realistic scenarios such as class imbalance or rare classes, out-of-distribution data in the unlabeled set, and redundancy. In this work, we propose SIMILAR (Submodular Information Measures based actIve LeARning), a unified active learning framework that uses the recently proposed submodular information measures (SIM) as acquisition functions. We argue that SIMILAR not only works in standard active learning but also extends easily to the realistic settings above, acting as a one-stop solution for active learning that scales to large real-world datasets. Empirically, we show that SIMILAR significantly outperforms existing active learning algorithms by as much as ~5%-18% in the case of rare classes and ~5%-10% in the case of out-of-distribution data on several image classification tasks such as CIFAR-10, MNIST, and ImageNet.
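To make the idea concrete, below is a minimal sketch (not the authors' implementation) of how a submodular mutual information measure can serve as an active-learning acquisition function: unlabeled points are chosen greedily so that the selected batch maximizes a facility-location-style mutual information with a small query set of exemplars (e.g., a few labeled rare-class images). The cosine similarity kernel, the eta trade-off, and every function name below are illustrative assumptions; SIMILAR itself instantiates several different SIM variants and operates on learned model embeddings.

```python
# Minimal sketch (assumptions noted above) of greedy batch selection with a
# facility-location-style submodular mutual information (SMI) objective.
import numpy as np


def cosine_similarity(X, Q):
    """Pairwise cosine similarity between unlabeled features X and query features Q."""
    Xn = X / np.linalg.norm(X, axis=1, keepdims=True)
    Qn = Q / np.linalg.norm(Q, axis=1, keepdims=True)
    return Xn @ Qn.T  # shape: (n_unlabeled, n_query)


def smi_score(selected, sims, eta=1.0):
    """Facility-location-style mutual information between a selected batch A and the query set Q.

    I(A; Q) ~= sum_q max_{a in A} s(a, q) + eta * sum_{a in A} max_q s(a, q)
    The first term rewards covering every query exemplar; the second rewards
    picking points that are individually relevant to the query set.
    """
    if not selected:
        return 0.0
    S = sims[selected]                      # (|A|, n_query)
    coverage = S.max(axis=0).sum()          # how well A covers Q
    relevance = eta * S.max(axis=1).sum()   # how query-relevant each chosen point is
    return coverage + relevance


def greedy_smi_selection(X_unlabeled, Q_features, budget, eta=1.0):
    """Naive greedy maximization of the SMI acquisition function, O(n * budget) evaluations."""
    sims = cosine_similarity(X_unlabeled, Q_features)
    selected, remaining = [], set(range(len(X_unlabeled)))
    current = 0.0
    for _ in range(budget):
        best_gain, best_idx = -np.inf, None
        for i in remaining:
            gain = smi_score(selected + [i], sims, eta) - current
            if gain > best_gain:
                best_gain, best_idx = gain, i
        selected.append(best_idx)
        remaining.remove(best_idx)
        current += best_gain
    return selected  # indices of unlabeled points to send for labeling


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X_unlabeled = rng.normal(size=(500, 64))     # stand-in for model embeddings
    Q_rare = rng.normal(loc=2.0, size=(10, 64))  # exemplars of a rare class
    batch = greedy_smi_selection(X_unlabeled, Q_rare, budget=20)
    print("Selected for labeling:", batch)
```

Because the objective above is monotone and submodular (a facility-location term plus a modular term), the naive greedy loop retains the standard (1 - 1/e) approximation guarantee; in practice a lazy-greedy or stochastic-greedy variant would typically replace it for scalability to large unlabeled pools.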
