Improving Robustness and Efficiency in Active Learning with Contrastive Loss

09/13/2021
by   Ranganath Krishnan, et al.

This paper introduces supervised contrastive active learning (SCAL), which leverages the contrastive loss for active learning in a supervised setting. We propose efficient query strategies for active learning that select unbiased and informative data samples with diverse feature representations. We demonstrate that our proposed method reduces sampling bias and achieves state-of-the-art accuracy and model calibration in an active learning setup, with query computation 11x faster than CoreSet and 26x faster than Bayesian Active Learning by Disagreement. Our method yields well-calibrated models even with imbalanced datasets. We also evaluate robustness to dataset shift and out-of-distribution data in the active learning setup, and demonstrate that our proposed SCAL method outperforms high-performing, compute-intensive methods by a large margin (an average of 8.9 higher AUROC for out-of-distribution detection and an average of 7.2 under dataset shift).
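For readers unfamiliar with the loss the abstract refers to, the supervised contrastive (SupCon) objective pulls embeddings of same-class samples together and pushes different classes apart. The sketch below is a minimal NumPy illustration of that standard loss, not the paper's implementation; the function name, temperature value, and array shapes are illustrative assumptions.

```python
import numpy as np

def supcon_loss(features, labels, temperature=0.1):
    """Illustrative supervised contrastive (SupCon) loss.

    features: (N, D) embeddings; labels: (N,) integer class labels.
    NOTE: a sketch of the standard SupCon formulation, not SCAL's exact code.
    """
    # L2-normalize embeddings so similarities are cosine similarities
    z = features / np.linalg.norm(features, axis=1, keepdims=True)
    logits = z @ z.T / temperature
    # exclude self-similarity from the softmax denominator
    logits -= np.eye(len(z)) * 1e9
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    # positives: same label, excluding the anchor itself
    pos = (labels[:, None] == labels[None, :]) & ~np.eye(len(z), dtype=bool)
    pos_counts = pos.sum(axis=1)
    # average log-probability over each anchor's positives
    per_anchor = -(log_prob * pos).sum(axis=1) / np.maximum(pos_counts, 1)
    return per_anchor[pos_counts > 0].mean()
```

Clustered same-class embeddings yield a lower loss than scattered ones, which is what makes the learned feature space useful for the diversity-aware query strategies the abstract describes.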


