Active Learning amidst Logical Knowledge

Structured prediction is ubiquitous in machine learning applications such as knowledge extraction and natural language processing. Structure can often be formulated in terms of logical constraints. We consider the question of how to perform efficient active learning in the presence of logical constraints among variables inferred by different classifiers. We propose several methods and provide theoretical results demonstrating that uncertainty-guided sampling, a commonly used active learning method, is ill-suited to this setting. Furthermore, experiments on ten different datasets show that the proposed methods significantly outperform alternatives in practice. The results are of practical significance in situations where labeled data is scarce.
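
For context, below is a minimal sketch of the standard uncertainty-guided (entropy-based) sampling baseline referred to in the abstract. It is a generic illustration, not the constraint-aware methods proposed in the paper; the helper names `entropy` and `select_queries` are hypothetical.

```python
# Generic entropy-based uncertainty sampling (baseline only, assumed setup):
# score each unlabeled example by predictive entropy and query the most
# uncertain ones up to a labeling budget.
import numpy as np

def entropy(probs: np.ndarray) -> np.ndarray:
    """Predictive entropy per example; probs has shape (n_examples, n_classes)."""
    eps = 1e-12  # avoid log(0)
    return -(probs * np.log(probs + eps)).sum(axis=1)

def select_queries(probs: np.ndarray, budget: int) -> np.ndarray:
    """Indices of the `budget` most uncertain unlabeled examples."""
    scores = entropy(probs)
    return np.argsort(-scores)[:budget]

if __name__ == "__main__":
    # Example: pick the 2 most uncertain of 4 unlabeled examples.
    p = np.array([[0.90, 0.10],   # confident
                  [0.55, 0.45],   # uncertain
                  [0.50, 0.50],   # most uncertain
                  [0.80, 0.20]])
    print(select_queries(p, budget=2))  # -> [2 1]
```

The paper's point is that ranking examples by per-classifier uncertainty alone, as in this sketch, ignores the logical constraints coupling the classifiers' outputs, which is why the proposed constraint-aware strategies are needed.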
