One-Step Abductive Multi-Target Learning with Diverse Noisy Samples

10/20/2021
by   Yongquan Yang, et al.

One-step abductive multi-target learning (OSAMTL) was proposed to handle complex noisy labels. In this paper, by giving a definition of diverse noisy samples (DNS), we propose one-step abductive multi-target learning with DNS (OSAMTL-DNS), which extends the original OSAMTL to a wider range of tasks involving complex noisy labels.

