Influence Selection for Active Learning

08/20/2021
by Zhuoming Liu, et al.

Existing active learning methods select samples by evaluating a sample's uncertainty or its effect on the diversity of the labeled dataset, based on different task-specific or model-specific criteria. In this paper, we propose Influence Selection for Active Learning (ISAL), which selects the unlabeled samples that can provide the most positive Influence on model performance. To obtain the Influence of an unlabeled sample in the active learning scenario, we design Untrained Unlabeled sample Influence Calculation (UUIC), which estimates the unlabeled sample's expected gradient and uses it to calculate the sample's Influence. To demonstrate the effectiveness of UUIC, we provide both theoretical and experimental analyses. Since UUIC depends only on model gradients, which can be obtained easily from any neural network, our active learning algorithm is task-agnostic and model-agnostic. ISAL achieves state-of-the-art performance in different active learning settings for different tasks with different datasets. Compared with previous methods, our method decreases the annotation cost by at least 12%, 13% and 16% on CIFAR10, VOC2012 and COCO, respectively.
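To make the selection idea concrete, here is a minimal, hedged sketch of gradient-based influence selection on a toy logistic-regression model. It is not the paper's implementation: it uses a first-order influence approximation (alignment between a candidate's gradient and the mean validation gradient) rather than the full influence-function formulation, and it substitutes the model's pseudo-label for the missing ground-truth label when computing a candidate's gradient, in the spirit of UUIC's expected-gradient estimate. All function names and shapes are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def grad_logloss(w, x, y):
    # Gradient of the binary cross-entropy loss w.r.t. weights for one sample.
    return (sigmoid(w @ x) - y) * x

def pseudo_label_gradient(w, x):
    # No ground-truth label exists for an unlabeled sample, so (as an
    # illustrative stand-in for UUIC's expected gradient) use the model's
    # own prediction as a pseudo-label when forming the gradient.
    y_hat = 1.0 if sigmoid(w @ x) >= 0.5 else 0.0
    return grad_logloss(w, x, y_hat)

def influence_scores(w, X_unlabeled, X_val, y_val):
    # First-order influence approximation: a gradient step on a candidate
    # moves the weights by roughly -lr * g(x), so the validation loss changes
    # by roughly -lr * g_val @ g(x). A large positive dot product therefore
    # means the sample is expected to *reduce* validation loss the most.
    g_val = np.mean([grad_logloss(w, x, y) for x, y in zip(X_val, y_val)], axis=0)
    return np.array([g_val @ pseudo_label_gradient(w, x) for x in X_unlabeled])

# Toy data: random weights, 20 unlabeled candidates, 5 labeled validation points.
rng = np.random.default_rng(0)
w = rng.normal(size=3)
X_unlabeled = rng.normal(size=(20, 3))
X_val = rng.normal(size=(5, 3))
y_val = (rng.random(5) > 0.5).astype(float)

scores = influence_scores(w, X_unlabeled, X_val, y_val)
selected = np.argsort(scores)[-5:]  # query labels for the 5 highest-Influence samples
print(selected)
```

Because the score needs only gradients, the same selection loop applies unchanged to any differentiable model, which is the source of the method's task- and model-agnosticism; the paper's UUIC additionally handles the fact that the model has not yet been trained on the candidate.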

07/29/2021

Semi-Supervised Active Learning with Temporal Output Discrepancy

While deep learning succeeds in a wide range of tasks, it highly depends...
06/18/2020

On the Robustness of Active Learning

Active Learning is concerned with the question of how to identify the mo...
10/14/2020

Identifying Wrongly Predicted Samples: A Method for Active Learning

State-of-the-art machine learning models require access to significant a...
11/15/2017

Influential Sample Selection: A Graph Signal Processing Approach

With the growing complexity of machine learning techniques, understandin...
04/10/2020

State-Relabeling Adversarial Active Learning

Active learning is to design label-efficient algorithms by sampling the ...
10/29/2020

PAL : Pretext-based Active Learning

When obtaining labels is expensive, the requirement of a large labeled t...
12/05/2022

Dissimilar Nodes Improve Graph Active Learning

Training labels for graph embedding algorithms could be costly to obtain...
