Fidelity-Weighted Learning

11/08/2017
by Mostafa Dehghani et al.

Training deep neural networks requires many training samples, but in practice training labels are expensive to obtain and may be of varying quality, as some may be from trusted expert labelers while others might be from heuristics or other sources of weak supervision such as crowd-sourcing. This creates a fundamental quality-versus-quantity trade-off in the learning process. Do we learn from the small amount of high-quality data or the potentially large amount of weakly-labeled data? We argue that if the learner could somehow know and take the label-quality into account when learning the data representation, we could get the best of both worlds. To this end, we propose "fidelity-weighted learning" (FWL), a semi-supervised student-teacher approach for training deep neural networks using weakly-labeled data. FWL modulates the parameter updates to a student network (trained on the task we care about) on a per-sample basis according to the posterior confidence of its label-quality as estimated by a teacher (who has access to the high-quality labels). Both student and teacher are learned from the data. We evaluate FWL on two tasks in information retrieval and natural language processing, where we outperform state-of-the-art alternative semi-supervised methods, indicating that our approach makes better use of strong and weak labels and leads to better task-dependent data representations.
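As a concrete illustration of the update rule described above, the sketch below scales each sample's loss, and hence its gradient contribution, by a confidence weight exp(-beta * variance) computed from the teacher's posterior variance. The abstract states that the teacher estimates label quality from the high-quality labels; here scikit-learn's GaussianProcessRegressor stands in for it. This is a minimal sketch, not the paper's implementation: the network shape, the toy data, beta, and the names fwl_step and x_weak are illustrative assumptions, and the paper's full training recipe (for example, any pretraining of the student on weakly labeled data before this step) is omitted.

```python
import numpy as np
import torch
import torch.nn as nn
from sklearn.gaussian_process import GaussianProcessRegressor

# --- Teacher: fit on the small, strongly labeled set (toy data, illustrative). ---
# The teacher's posterior variance at a point serves as a per-sample
# estimate of label quality.
rng = np.random.default_rng(0)
X_strong = rng.normal(size=(50, 16))
y_strong = X_strong.sum(axis=1) + 0.1 * rng.normal(size=50)
teacher = GaussianProcessRegressor().fit(X_strong, y_strong)

# --- Student: the task network, trained on the large weakly labeled pool. ---
student = nn.Sequential(nn.Linear(16, 64), nn.ReLU(), nn.Linear(64, 1))
opt = torch.optim.SGD(student.parameters(), lr=0.1)
mse = nn.MSELoss(reduction="none")  # keep per-sample losses

def fwl_step(x_batch: torch.Tensor, beta: float = 1.0) -> float:
    """One fidelity-weighted update: the teacher's soft label is the target,
    and each sample's gradient is scaled by a confidence weight derived from
    the teacher's posterior variance (low variance -> weight near 1,
    high variance -> weight near 0)."""
    y_soft, std = teacher.predict(x_batch.numpy(), return_std=True)
    weights = torch.exp(-beta * torch.from_numpy(std**2).float())
    targets = torch.from_numpy(y_soft).float()
    per_sample = mse(student(x_batch).squeeze(-1), targets)
    loss = (weights * per_sample).mean()  # fidelity-weighted loss
    opt.zero_grad()
    loss.backward()
    opt.step()
    return float(loss)

# A batch drawn from the weak/unlabeled pool (illustrative):
x_weak = torch.randn(32, 16)
print(fwl_step(x_weak))
```

Because scaling a sample's loss by a constant scales that sample's gradient by the same constant, weighting the loss this way is one way to realize the per-sample modulation of parameter updates that the abstract describes.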



research · 06/21/2018

Learning to Rank from Samples of Variable Quality

Training deep neural networks requires many training samples, but in pra...
research · 11/01/2017

Avoiding Your Teacher's Mistakes: Training Neural Networks with Controlled Weak Supervision

Training deep neural networks requires massive amounts of training data,...
research · 02/06/2021

Jointly Improving Language Understanding and Generation with Quality-Weighted Weak Supervision of Automatic Labeling

Neural natural language generation (NLG) and understanding (NLU) models ...
research · 11/30/2017

Learning to Learn from Weak Supervision by Full Supervision

In this paper, we propose a method for training neural networks when we ...
research · 10/13/2022

Weighted Distillation with Unlabeled Examples

Distillation with unlabeled examples is a popular and powerful method fo...
research · 10/23/2020

A Teacher-Student Framework for Semi-supervised Medical Image Segmentation From Mixed Supervision

Standard segmentation of medical images based on full-supervised convolu...
research · 01/15/2018

Student Beats the Teacher: Deep Neural Networks for Lateral Ventricles Segmentation in Brain MR

Ventricular volume and its progression are known to be linked to several...
