Dual Active Sampling on Batch-Incremental Active Learning

05/22/2019
by Johan Phan, et al.

Recently, Convolutional Neural Networks (CNNs) have shown unprecedented success in the field of computer vision, especially on challenging image classification tasks, by relying on a universal approach: training a deep model on a massive dataset of labeled examples. While unlabeled data are often an abundant resource, collecting a large set of labeled data is very expensive and typically requires considerable human effort. One way to ease this burden is to effectively select and label highly informative instances from a pool of unlabeled data (i.e., active learning). This paper proposes a new batch-mode active learning method, Dual Active Sampling (DAS), which is based on a simple assumption: if two deep neural networks (DNNs) of the same structure, trained on the same dataset, give significantly different outputs for a given sample, then that sample should be selected for additional training. While other state-of-the-art methods in this field usually require intensive computational power or rely on complicated structures, DAS is simpler to implement and achieves improved results on CIFAR-10 with preferable computational time compared to the core-set method.
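The selection rule described above can be sketched roughly as follows, assuming PyTorch and a disagreement score based on the symmetric KL divergence between the two networks' softmax outputs; the abstract does not specify the exact disagreement measure, so the divergence choice, the das_select helper, its arguments, and the (inputs, pool-index) loader format are all illustrative assumptions rather than the paper's implementation.

    import torch
    import torch.nn.functional as F


    def das_select(model_a, model_b, unlabeled_loader, budget, device="cpu"):
        """Return the pool indices of the `budget` samples on which two
        identically structured models disagree the most.

        Disagreement here is the symmetric KL divergence between the two
        softmax outputs -- an illustrative choice, not necessarily the
        paper's exact measure.
        """
        model_a.eval()
        model_b.eval()
        all_idx, all_scores = [], []
        with torch.no_grad():
            for x, idx in unlabeled_loader:  # assumed to yield (inputs, pool indices)
                x = x.to(device)
                log_p = F.log_softmax(model_a(x), dim=1)
                log_q = F.log_softmax(model_b(x), dim=1)
                p, q = log_p.exp(), log_q.exp()
                # per-sample symmetric KL: KL(p||q) + KL(q||p)
                div = (F.kl_div(log_q, p, reduction="none").sum(dim=1)
                       + F.kl_div(log_p, q, reduction="none").sum(dim=1))
                all_idx.append(idx)
                all_scores.append(div.cpu())
        idx = torch.cat(all_idx)
        scores = torch.cat(all_scores)
        top = torch.argsort(scores, descending=True)[:budget]
        return idx[top].tolist()

In a batch-incremental loop, the returned indices would be sent for labeling, added to the labeled set, and both networks updated before the next acquisition round.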

research
07/15/2019

Discriminative Active Learning

We propose a new batch mode active learning algorithm designed for neura...
research
08/01/2017

Active Learning for Convolutional Neural Networks: A Core-Set Approach

Convolutional neural networks (CNNs) have been successfully applied to m...
research
07/22/2020

DEAL: Deep Evidential Active Learning for Image Classification

Convolutional Neural Networks (CNNs) have proven to be state-of-the-art ...
research
11/27/2020

Active Learning in CNNs via Expected Improvement Maximization

Deep learning models such as Convolutional Neural Networks (CNNs) have d...
research
11/09/2018

Deep Ensemble Bayesian Active Learning : Addressing the Mode Collapse issue in Monte Carlo dropout via Ensembles

In image classification tasks, the ability of deep CNNs to deal with com...
research
02/08/2022

A Lagrangian Duality Approach to Active Learning

We consider the batch active learning problem, where only a subset of th...
research
08/26/2021

Consistent Relative Confidence and Label-Free Model Selection for Convolutional Neural Networks

This paper is concerned with image classification based on deep convolut...
