On Robust Learning from Noisy Labels: A Permutation Layer Approach

11/29/2022
by   Salman Alsubaihi, et al.

The existence of label noise imposes significant challenges (e.g., poor generalization) on the training of deep neural networks (DNNs). As a remedy, this paper introduces a permutation layer learning approach, termed PermLL, that dynamically calibrates the training of a DNN subject to instance-dependent and instance-independent label noise. The proposed method augments the architecture of a conventional DNN with an instance-dependent permutation layer. This layer is essentially a convex combination of permutation matrices that is dynamically calibrated for each sample. The primary objective of the permutation layer is to correct the loss of noisy samples, thereby mitigating the effect of label noise. We provide two variants of PermLL in this paper: one applies the permutation layer to the model's prediction, while the other applies it directly to the given noisy label. In addition, we provide a theoretical comparison between the two variants and show that previous methods can be viewed as instances of one of them. Finally, we validate PermLL experimentally and show that it achieves state-of-the-art performance on both real and synthetic datasets.
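To make the core idea concrete, the sketch below shows what a convex combination of permutation matrices applied to a model's prediction could look like. It is a minimal illustration, not the paper's implementation: the per-sample weighting (`logits` passed through a softmax) and the enumeration of all permutations are assumptions for small class counts, since the abstract does not specify how the combination is parameterized or calibrated.

```python
import itertools
import numpy as np

def softmax(z):
    """Numerically stable softmax."""
    e = np.exp(z - np.max(z))
    return e / e.sum()

def permutation_layer(probs, logits):
    """Apply a convex combination of permutation matrices to a
    prediction vector.

    probs  : model's class-probability vector of length C.
    logits : hypothetical per-sample scores over the C! candidate
             permutations; softmax(logits) gives the convex weights.
    """
    C = len(probs)
    perms = list(itertools.permutations(range(C)))  # all C! permutations
    weights = softmax(np.asarray(logits, dtype=float))
    # P = sum_k w_k * Pi_k is doubly stochastic (convex combination
    # of permutation matrices, by Birkhoff's theorem).
    P = np.zeros((C, C))
    for w, perm in zip(weights, perms):
        P += w * np.eye(C)[list(perm)]
    return P @ probs  # corrected prediction
```

With uniform weights over all permutations the output collapses to the uniform distribution, while weights concentrated on the identity permutation leave the prediction unchanged; calibrating these weights per sample is what would let the layer absorb label noise instead of the backbone network.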


Related research:

10/05/2020 - Learning with Instance-Dependent Label Noise: A Sample Sieve Approach
Human-annotated labels are often prone to noise, and the presence of suc...

05/28/2021 - Rethinking Noisy Label Models: Labeler-Dependent Noise with Adversarial Awareness
Most studies on learning from noisy labels rely on unrealistic models of...

12/06/2021 - Two Wrongs Don't Make a Right: Combating Confirmation Bias in Learning with Label Noise
Noisy labels damage the performance of deep networks. For robust learnin...

09/02/2022 - Instance-Dependent Noisy Label Learning via Graphical Modelling
Noisy labels are unavoidable yet troublesome in the ecosystem of deep le...

07/10/2023 - Leveraging an Alignment Set in Tackling Instance-Dependent Label Noise
Noisy training labels can hurt model performance. Most approaches that a...

05/28/2022 - Deep Learning with Label Noise: A Hierarchical Approach
Deep neural networks are susceptible to label noise. Existing methods to...

10/28/2019 - Interrupted and cascaded permutation invariant training for speech separation
Permutation Invariant Training (PIT) has long been a stepping stone meth...
