Making Look-Ahead Active Learning Strategies Feasible with Neural Tangent Kernels

06/25/2022
by Mohamad Amin Mohamadi, et al.

We propose a new method for approximating active learning acquisition strategies that are based on retraining with hypothetically-labeled candidate data points. Although this is usually infeasible with deep networks, we use the neural tangent kernel to approximate the result of retraining, and prove that this approximation works asymptotically even in an active learning setup, so that "look-ahead" selection criteria can be approximated with far less computation. This also enables us to conduct sequential active learning, i.e., to update the model in a streaming regime without retraining it with SGD after adding each new data point. Moreover, our querying strategy, which better accounts for how the model's predictions will change when new data points are added than the standard ("myopic") criteria do, beats other look-ahead strategies by large margins, and achieves equal or better performance than state-of-the-art methods on several benchmark datasets for pool-based active learning.
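
The core computational idea, approximating the effect of retraining on a hypothetically-labeled candidate with NTK kernel regression instead of SGD, can be illustrated with a minimal sketch. The sketch below assumes a precomputed NTK Gram matrix K over the labeled and pool points and binary ±1 labels; the function name, the pseudo-labeling rule, and the change-in-predictions score are illustrative placeholders, not the paper's exact acquisition criterion.

```python
import numpy as np

def look_ahead_scores(K, y_labeled, labeled_idx, pool_idx, ridge=1e-3):
    """Score each unlabeled pool point by how much the NTK kernel-regression
    predictions over the pool would change if that point were added with a
    hypothetical (pseudo) label, standing in for actually retraining the network."""

    def predict(train_idx, y_train):
        # Kernel ridge regression with the precomputed NTK Gram matrix K.
        K_tt = K[np.ix_(train_idx, train_idx)]
        K_pt = K[np.ix_(pool_idx, train_idx)]
        alpha = np.linalg.solve(K_tt + ridge * np.eye(len(train_idx)), y_train)
        return K_pt @ alpha

    base_pred = predict(labeled_idx, y_labeled)  # current surrogate predictions on the pool
    scores = np.zeros(len(pool_idx))
    for i, cand in enumerate(pool_idx):
        # Pseudo-label the candidate with the current prediction (binary +/-1 labels assumed).
        pseudo = 1.0 if base_pred[i] >= 0 else -1.0
        new_idx = np.append(labeled_idx, cand)
        new_y = np.append(y_labeled, pseudo)
        # "Look-ahead": predictions after hypothetically retraining on the enlarged set.
        new_pred = predict(new_idx, new_y)
        scores[i] = np.abs(new_pred - base_pred).sum()
    return scores

# Usage: query = pool_idx[np.argmax(look_ahead_scores(K, y_labeled, labeled_idx, pool_idx))]
```

In this sketch the expensive step of retraining the network for every candidate is replaced by solving a small linear system per candidate, which is what makes look-ahead criteria tractable under the NTK approximation.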

Related research

10/22/2020 · Pool-based sequential active learning with multi kernels
We study a pool-based sequential active learning (AL), in which one samp...

09/08/2021 · Active Learning by Acquiring Contrastive Examples
Common acquisition functions for active learning use either uncertainty ...

03/18/2022 · Look-Ahead Acquisition Functions for Bernoulli Level Set Estimation
Level set estimation (LSE) is the problem of identifying regions where a...

11/09/2020 · LADA: Look-Ahead Data Acquisition via Augmentation for Active Learning
Active learning effectively collects data instances for training deep le...

06/08/2021 · A critical look at the current train/test split in machine learning
The randomized or cross-validated split of training and testing sets has...

03/17/2022 · A Framework and Benchmark for Deep Batch Active Learning for Regression
We study the performance of different pool-based Batch Mode Deep Active ...

05/23/2022 · PyRelationAL: A Library for Active Learning Research and Development
In constrained real-world scenarios where it is challenging or costly to...
