Information Condensing Active Learning

02/18/2020
by Siddhartha Jain, et al.

We introduce Information Condensing Active Learning (ICAL), a batch-mode, model-agnostic Active Learning (AL) method targeted at deep Bayesian active learning that acquires labels for the points carrying as much information as possible about the still-unacquired points. ICAL uses the Hilbert-Schmidt Independence Criterion (HSIC) to measure the strength of the dependency between a candidate batch of points and the unlabeled set. We develop key optimizations that allow us to scale our method to large unlabeled sets, and we show significant improvements in model accuracy and negative log-likelihood (NLL) on several image datasets compared to state-of-the-art batch-mode AL methods for deep learning.
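Since the abstract hinges on using HSIC to score the dependency between a candidate batch and the unlabeled pool, a minimal sketch of the standard biased empirical HSIC estimator may help fix ideas. This is an illustrative implementation of the generic estimator, not ICAL's actual acquisition code; the RBF kernel, the bandwidth `sigma`, and the function names are assumptions.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian (RBF) kernel matrix from pairwise squared distances.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC estimator: tr(K H L H) / (n-1)^2,
    where H = I - (1/n) 11^T centers the kernel matrices."""
    n = X.shape[0]
    K = rbf_kernel(X, sigma)
    L = rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# Illustration: strongly dependent variables score higher than independent ones.
rng = np.random.default_rng(0)
X = rng.normal(size=(50, 2))
Y_dep = X + 0.1 * rng.normal(size=(50, 2))   # nearly a copy of X
Y_ind = rng.normal(size=(50, 2))             # independent of X
print(hsic(X, Y_dep), hsic(X, Y_ind))
```

In a batch-selection setting like the one the abstract describes, one would score each candidate batch by the HSIC between model predictions on the batch and predictions on the remaining unlabeled points, and greedily pick high-scoring batches.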

