Active Learning amidst Logical Knowledge

09/26/2017
by Emmanouil Antonios Platanios, et al.

Structured prediction is ubiquitous in machine learning applications such as knowledge extraction and natural language processing, and the structure can often be formulated in terms of logical constraints. We consider the question of how to perform efficient active learning in the presence of logical constraints among variables inferred by different classifiers. We propose several methods and provide theoretical results demonstrating the inappropriateness of uncertainty-guided sampling, a commonly used active learning strategy, in this setting. Furthermore, experiments on ten different datasets show that the proposed methods significantly outperform alternatives in practice. The results are of practical significance in situations where labeled data is scarce.
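
For context on the baseline the abstract argues against, the following is a minimal sketch of plain uncertainty-guided (entropy-based) sampling. It is an illustrative assumption, not the paper's proposed method: the scikit-learn model, function names, and synthetic data are placeholders.

import numpy as np
from sklearn.linear_model import LogisticRegression

def predictive_entropy(probs):
    # Entropy of each row of predicted class probabilities; higher means more uncertain.
    return -np.sum(probs * np.log(probs + 1e-12), axis=1)

def uncertainty_sampling_step(model, X_labeled, y_labeled, X_pool, batch_size=1):
    # One round of plain uncertainty-guided sampling: fit on the labeled set,
    # then query the pool points with the highest predictive entropy.
    model.fit(X_labeled, y_labeled)
    scores = predictive_entropy(model.predict_proba(X_pool))
    return np.argsort(scores)[-batch_size:]

# Illustrative usage on synthetic data (a hypothetical stand-in for a real labeled/unlabeled split).
rng = np.random.default_rng(0)
X_labeled = rng.normal(size=(20, 5))
y_labeled = rng.integers(0, 2, size=20)
X_pool = rng.normal(size=(200, 5))
query_indices = uncertainty_sampling_step(LogisticRegression(), X_labeled, y_labeled, X_pool, batch_size=5)
print(query_indices)

In the paper's setting, the variables being queried are linked by logical constraints (for example, one classifier's label may logically exclude another's), and the theoretical results indicate that ranking queries by raw uncertainty alone is a poor criterion there; the sketch above shows only the baseline being compared against.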

Related research:

Active Learning Under Malicious Mislabeling and Poisoning Attacks (01/01/2021)
Pre-trained Language Model Based Active Learning for Sentence Matching (10/12/2020)
Active learning algorithm through the lens of rejection arguments (08/31/2022)
Deep Bayesian Active Learning for Natural Language Processing: Results of a Large-Scale Empirical Study (08/16/2018)
On the Relationship between Data Efficiency and Error for Uncertainty Sampling (06/15/2018)
Overcoming Practical Issues of Deep Active Learning and its Applications on Named Entity Recognition (11/17/2019)
Towards Active Learning Based Smart Assistant for Manufacturing (03/30/2021)