Scalable Penalized Regression for Noise Detection in Learning with Noisy Labels

03/15/2022
by   Yikai Wang, et al.

A noisy training set usually degrades the generalization and robustness of neural networks. In this paper, we propose a theoretically guaranteed noisy-label detection framework to detect and remove noisy data for Learning with Noisy Labels (LNL). Specifically, we design a penalized regression to model the linear relation between network features and one-hot labels, where the noisy data are identified by the non-zero mean-shift parameters solved in the regression model. To make the framework scalable to datasets with a large number of categories and training samples, we propose a split algorithm that divides the whole training set into small pieces, each solved by the penalized regression in parallel, leading to the Scalable Penalized Regression (SPR) framework. We provide a non-asymptotic probabilistic condition under which SPR correctly identifies the noisy data. While SPR can be regarded as a sample-selection module for a standard supervised training pipeline, we further combine it with a semi-supervised algorithm to exploit the support of the noisy data as unlabeled data. Experimental results on several benchmark datasets and real-world noisy datasets show the effectiveness of our framework. Our code and pretrained models are released at https://github.com/Yikai-Wang/SPR-LNL.
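The mean-shift idea behind the detection step can be illustrated with a small sketch: model each label as a linear function of the features plus a per-sample shift parameter, and apply an L1 penalty so that only samples whose labels disagree with the linear fit receive a non-zero shift. The sketch below is our own minimal illustration, not the paper's exact formulation: it uses a scalar label instead of one-hot vectors, synthetic Gaussian features, and scikit-learn's Lasso (which also penalizes the regression weights, unlike SPR, which penalizes only the shifts).

```python
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
n, d = 200, 20
X = rng.normal(size=(n, d))          # network features (synthetic stand-in)
beta = rng.normal(size=d)
y = X @ beta + 0.1 * rng.normal(size=n)

# Corrupt 20 labels with a large additive shift (the "noisy" samples).
noisy_idx = rng.choice(n, size=20, replace=False)
y[noisy_idx] += rng.choice([-3.0, 3.0], size=20)

# Augment the design matrix with an identity block so every sample gets its
# own mean-shift parameter gamma_i; the L1 penalty drives gamma_i to zero
# for clean samples and leaves it non-zero where the label is inconsistent.
A = np.hstack([X, np.eye(n)])
model = Lasso(alpha=0.005, fit_intercept=False, max_iter=100000)
model.fit(A, y)

gamma = model.coef_[d:]              # per-sample mean-shift estimates
flagged = np.flatnonzero(np.abs(gamma) > 1e-6)
```

With sklearn's objective scaling, an identity column is soft-thresholded at roughly `alpha * n` on the residual, so `alpha = 0.005` (threshold about 1.0 here) separates the ±3 corruptions from the 0.1-scale noise. SPR's split algorithm amounts to running this solve on disjoint subsets of the training set in parallel.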


Related research

- Knockoffs-SPR: Clean Sample Selection in Learning with Noisy Labels (01/02/2023)
- C-Mixup: Improving Generalization in Regression (10/11/2022)
- Leveraging Unlabeled Data to Track Memorization (12/08/2022)
- A Semi-Supervised Two-Stage Approach to Learning from Noisy Labels (02/08/2018)
- Contrast to Divide: Self-Supervised Pre-Training for Learning with Noisy Labels (03/25/2021)
- Interactive Learning from Multiple Noisy Labels (07/24/2016)
- Active Regression with Adaptive Huber Loss (06/05/2016)
