Adversarial Labeling for Learning without Labels

05/22/2018
by Chidubem Arachie, et al.

We consider the task of training classifiers without labels. We propose a weakly supervised method---adversarial label learning---that trains classifiers to perform well against an adversary that chooses labels for the training data. The weak supervision constrains which labels the adversary can choose. Because the adversary selects the worst-case labels consistent with the weak supervision, training against it minimizes an upper bound on the classifier's error rate, which we optimize using projected primal-dual subgradient descent. Minimizing this bound protects against bias and dependencies in the weak supervision. Experiments on three real datasets show that our method can train without labels and outperforms other approaches to weakly supervised learning.
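The abstract's training scheme (a classifier playing against a label-choosing adversary, constrained by weak supervision and solved with projected primal-dual subgradient steps) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the toy data, the two weak signals, their assumed error bounds, and all step sizes are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linearly separable data; true labels are used only for evaluation.
n = 200
X = rng.normal(size=(n, 2))
true_y = (X[:, 0] + 0.5 * X[:, 1] > 0).astype(float)

# Two hypothetical weak signals: noisy soft-label estimates, each with an
# assumed bound on its expected error rate (the weak-supervision constraint).
weak = np.clip(np.stack([true_y, true_y]) + rng.normal(0, 0.3, size=(2, n)), 0, 1)
bounds = np.array([0.35, 0.35])

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

Xb = np.hstack([X, np.ones((n, 1))])  # add a bias feature
w = np.zeros(3)            # primal: logistic-classifier weights
y_adv = np.full(n, 0.5)    # adversarial soft labels, kept in [0, 1]
gam = np.zeros(2)          # dual: multipliers for the weak-signal constraints

for _ in range(2000):
    p = sigmoid(Xb @ w)
    # Classifier descends on its expected error under the adversarial labels:
    # err = mean(y*(1-p) + (1-y)*p), so d err / dw = mean((1-2y) p(1-p) x).
    w -= 0.1 * Xb.T @ ((1 - 2 * y_adv) * p * (1 - p)) / n
    # Adversary ascends on the error, penalized by the constraint multipliers,
    # then projects back onto [0, 1]^n.
    grad_y = (1 - 2 * p) / n - gam @ (1 - 2 * weak) / n
    y_adv = np.clip(y_adv + 0.1 * grad_y, 0, 1)
    # Dual update: grow a multiplier when its weak signal's expected
    # disagreement with y_adv exceeds the assumed bound; project onto gam >= 0.
    viol = ((1 - y_adv) * weak + y_adv * (1 - weak)).mean(axis=1) - bounds
    gam = np.maximum(gam + 0.1 * viol, 0)

acc = ((sigmoid(Xb @ w) > 0.5) == true_y).mean()
```

The constraints keep the adversary's labels roughly consistent with the weak signals, so the classifier's worst-case error under those labels upper-bounds its true error, which is the quantity the primal step drives down.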


Related research

- 09/15/2020: Constrained Labeling for Weakly Supervised Learning. Curation of large fully supervised datasets has become one of the major ...
- 02/19/2023: Weakly Supervised Label Learning Flows. Supervised learning usually requires a large amount of labelled data. Ho...
- 05/30/2023: Understanding temporally weakly supervised training: A case study for keyword spotting. The currently most prominent algorithm to train keyword spotting (KWS) m...
- 09/05/2018: Learning Concept Abstractness Using Weak Supervision. We introduce a weakly supervised approach for inferring the property of ...
- 07/14/2019: More Supervision, Less Computation: Statistical-Computational Tradeoffs in Weakly Supervised Learning. We consider the weakly supervised binary classification problem where th...
- 09/16/2021: KnowMAN: Weakly Supervised Multinomial Adversarial Networks. The absence of labeled data for training neural models is often addresse...
- 08/30/2021: Noisy Labels for Weakly Supervised Gamma Hadron Classification. Gamma hadron classification, a central machine learning task in gamma ra...
