Temporal Output Discrepancy for Loss Estimation-based Active Learning

12/20/2022
by Siyu Huang, et al.

While deep learning succeeds in a wide range of tasks, it depends heavily on massive collections of annotated data, which are expensive and time-consuming to obtain. To lower the cost of data annotation, active learning has been proposed to interactively query an oracle to annotate a small proportion of informative samples from an unlabeled dataset. Inspired by the observation that samples with higher loss are usually more informative to the model than samples with lower loss, in this paper we present a novel deep active learning approach that queries the oracle for annotation when an unlabeled sample is believed to incur a high loss. The core of our approach is a measurement, Temporal Output Discrepancy (TOD), which estimates the sample loss by evaluating the discrepancy between the outputs given by models at different optimization steps. Our theoretical investigation shows that TOD lower-bounds the accumulated sample loss, so it can be used to select informative unlabeled samples. On the basis of TOD, we further develop an effective unlabeled data sampling strategy as well as an unsupervised learning criterion for active learning. Owing to the simplicity of TOD, our methods are efficient, flexible, and task-agnostic. Extensive experimental results demonstrate that our approach outperforms state-of-the-art active learning methods on image classification and semantic segmentation tasks. In addition, we show that TOD can be used to select, from a pool of candidate models, the model likely to achieve the highest test accuracy.
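To make the sampling strategy concrete, the Python sketch below scores unlabeled samples by the output discrepancy between two snapshots of the same network taken at different optimization steps and queries the highest-scoring ones. It is a minimal illustration, not the paper's exact formulation: the snapshot handles (model_t, model_t_prev), the (index, input) loader format, and the use of an L2 distance over the outputs are assumptions made here for clarity.

    # Minimal sketch of TOD-style active sample selection (assumptions noted above).
    import torch

    @torch.no_grad()
    def temporal_output_discrepancy(model_t, model_t_prev, x):
        """L2 distance between outputs of two model snapshots on the same inputs."""
        out_now = model_t(x)
        out_prev = model_t_prev(x)
        return torch.norm(out_now - out_prev, p=2, dim=1)  # one score per sample

    def select_for_annotation(model_t, model_t_prev, unlabeled_loader, budget):
        """Rank unlabeled samples by TOD and return indices of the top-`budget` ones."""
        model_t.eval()
        model_t_prev.eval()
        scores, indices = [], []
        for idx, x in unlabeled_loader:  # assumed to yield (index, input) pairs
            scores.append(temporal_output_discrepancy(model_t, model_t_prev, x))
            indices.append(idx)
        scores = torch.cat(scores)
        indices = torch.cat(indices)
        top = torch.topk(scores, k=budget).indices
        return indices[top]  # query the oracle to label these samples

Samples whose outputs change most between optimization steps are treated as the ones the model has not yet fit well, which is the intuition behind using TOD as a proxy for sample loss.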


Related research

07/29/2021  Semi-Supervised Active Learning with Temporal Output Discrepancy
07/04/2020  Deep Active Learning via Open Set Recognition
11/26/2021  Active Learning for Event Extraction with Memory-based Loss Prediction Model
03/25/2023  Deep Active Learning with Contrastive Learning Under Realistic Data Pool Assumptions
07/23/2020  Deep Active Learning by Model Interpretability
06/23/2017  A Variance Maximization Criterion for Active Learning
10/14/2020  Identifying Wrongly Predicted Samples: A Method for Active Learning
