Disambiguation of weak supervision with exponential convergence rates

02/04/2021
by Vivien Cabannes et al.

Supervised machine learning requires expensive data annotation, which motivates weakly supervised learning, where data are annotated with incomplete yet discriminative information. In this paper, we focus on partial labelling, an instance of weak supervision in which each input comes with a set of potential targets rather than a single one. We review a disambiguation principle that recovers full supervision from weak supervision, and we propose an empirical disambiguation algorithm. We prove exponential convergence rates for this algorithm under classical learnability assumptions, and we illustrate the usefulness of our method on practical examples.
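To make the partial-labelling setting concrete, below is a minimal sketch of a disambiguation loop in the spirit of the infimum-loss principle this line of work builds on: alternate between picking, inside each candidate set, the label the current model finds most plausible, and refitting the model on those picked labels. This is an illustrative reading, not the paper's exact algorithm; the function disambiguate, the scikit-learn base learner, and all parameter names are assumptions made for the example.

    # Illustrative partial-label disambiguation via alternating minimization.
    # Not the paper's algorithm; a sketch assuming numpy and scikit-learn.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    def disambiguate(X, candidate_sets, n_iters=10, seed=0):
        """X: (n, d) array of inputs; candidate_sets: length-n list of lists,
        each holding the admissible labels for one input (hypothetical API).
        Alternates re-labelling and re-fitting, in the spirit of minimizing
        an infimum loss over the candidate sets."""
        rng = np.random.default_rng(seed)
        # Start from a random admissible label per input (assumes the random
        # start hits at least two distinct classes so the model can fit).
        y = np.array([rng.choice(s) for s in candidate_sets])
        model = LogisticRegression(max_iter=1000)
        for _ in range(n_iters):
            model.fit(X, y)
            proba = model.predict_proba(X)   # shape (n, n_classes_seen)
            classes = list(model.classes_)
            # Disambiguation step: within each candidate set, keep the label
            # the current model scores highest (unseen labels score -inf).
            y = np.array([
                max(s, key=lambda c, i=i: proba[i, classes.index(c)]
                    if c in classes else -np.inf)
                for i, s in enumerate(candidate_sets)
            ])
        return model, y

On synthetic data where each candidate set contains the true label plus a few distractors, a handful of iterations is typically enough for the picked labels to stabilize; the alternating structure mirrors the minimization over admissible labels that the disambiguation principle formalizes.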

Related research

09/15/2020 · Constrained Labeling for Weakly Supervised Learning
Curation of large fully supervised datasets has become one of the major ...

03/02/2020 · Structured Prediction with Partial Labelling through the Infimum Loss
Annotating datasets is one of the main costs in nowadays supervised lear...

02/08/2022 · Data Consistency for Weakly Supervised Learning
In many applications, training machine learning models involves using la...

06/23/2023 · On Learning Latent Models with Multi-Instance Weak Supervision
We consider a weakly supervised learning scenario where the supervision ...

09/23/2022 · From Weakly Supervised Learning to Active Learning
Applied mathematics and machine computations have raised a lot of hope s...

01/03/2017 · Constrained Deep Weak Supervision for Histopathology Image Segmentation
In this paper, we develop a new weakly-supervised learning algorithm to ...

05/26/2020 · Learning with Weak Supervision for Email Intent Detection
Email remains one of the most frequently used means of online communicat...
