Actively Learning Deep Neural Networks with Uncertainty Sampling Based on Sum-Product Networks

06/20/2022
by Mohamadsadegh Khosravani, et al.

Active learning is a popular approach for reducing the amount of labeled data needed to train a deep neural network model. Its success hinges on the choice of an effective acquisition function, which ranks not-yet-labeled data points according to their expected informativeness. In uncertainty sampling, the uncertainty that the current model has about a point's class label is the main criterion for this ranking. This paper proposes a new approach to uncertainty sampling for training a Convolutional Neural Network (CNN). The main idea is to use the feature representations extracted by the CNN as data for training a Sum-Product Network (SPN). Since SPNs are typically used for estimating the distribution of a dataset, they are well suited to estimating class probabilities that can be used directly by standard acquisition functions such as max entropy and variation ratios. Moreover, we enhance these acquisition functions with weights calculated with the help of the SPN model; these weights make the acquisition functions more sensitive to the diversity of conceivable class labels for each data point. The effectiveness of our method is demonstrated in an experimental study on the MNIST, Fashion-MNIST and CIFAR-10 datasets, where we compare it to the state-of-the-art methods MC Dropout and Bayesian Batch.
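To make the scoring step concrete, the following is a minimal sketch (in Python/NumPy, not the authors' code) of the two standard acquisition functions named in the abstract, max entropy and variation ratios, computed from predicted class probabilities such as those an SPN would provide. The weights argument is a hypothetical stand-in for the SPN-derived weighting described above; how those weights are computed is not shown here.

import numpy as np

def max_entropy(probs):
    # Entropy of the predicted class distribution for each unlabeled point.
    # probs: array of shape (n_points, n_classes), rows summing to 1.
    return -np.sum(probs * np.log(probs + 1e-12), axis=1)

def variation_ratio(probs):
    # 1 minus the probability of the most likely class for each point.
    return 1.0 - probs.max(axis=1)

def select_batch(probs, batch_size, weights=None):
    # Rank unlabeled points by (optionally weighted) max-entropy scores and
    # return the indices of the top `batch_size` points to query for labels.
    scores = max_entropy(probs)
    if weights is not None:
        # `weights` is an assumed placeholder for the SPN-based weights
        # mentioned in the abstract.
        scores = scores * weights
    return np.argsort(scores)[::-1][:batch_size]

# Example usage: in the described pipeline, `probs` would come from an SPN
# fitted on CNN feature representations; random placeholders are used here.
rng = np.random.default_rng(0)
probs = rng.dirichlet(np.ones(10), size=1000)
query_indices = select_batch(probs, batch_size=32)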

