Robustness to Label Noise Depends on the Shape of the Noise Distribution in Feature Space

06/02/2022
by   Diane Oyen, et al.

Machine learning classifiers have been demonstrated, both empirically and theoretically, to be robust to label noise under certain conditions; the typical assumption is that label noise is independent of the features given the class label. We provide a theoretical framework that generalizes beyond this typical assumption by modeling label noise as a distribution over feature space. We show that both the scale and the shape of the noise distribution influence the posterior likelihood, and that the shape of the noise distribution has a stronger impact on classification performance when the noise is concentrated in regions of feature space where the decision boundary can be moved. For the special case of uniform label noise (independent of the features and the class label), we show that the Bayes optimal classifier for c classes is robust to label noise until the ratio of noisy samples exceeds (c-1)/c (e.g., 90% for c = 10), which we call the tipping point. However, for the special case of class-dependent label noise (independent of features given the class label), the tipping point can be as low as 50%. When the noise distribution targets decision boundaries (label noise is directly dependent on feature space), classification robustness can drop off even at a small scale of noise. Even when evaluating recent label-noise mitigation methods, we see reduced accuracy when label noise is dependent on features. These findings explain why machine learning often handles label noise well when the noise distribution is uniform in feature space; yet they also point to the difficulty of overcoming label noise when it is concentrated in a region of feature space where a decision boundary can move.
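The tipping points described above can be checked with a small sketch (not the paper's code; function names and the toy noise models are illustrative). Under uniform label noise, a true label is flipped to one of the other c-1 classes uniformly at random with probability r, so the true class remains the most likely observed label exactly while 1 - r > r/(c-1), i.e., r < (c-1)/c. Under a simple class-dependent model where one class is flipped specifically to another, the majority label flips already at r > 0.5:

```python
import numpy as np

def uniform_noise_dist(r, c):
    """Observed-label distribution for a sample with true class 0, under
    uniform label noise: with probability r the label is flipped to one of
    the other c-1 classes uniformly at random."""
    p = np.full(c, r / (c - 1))
    p[0] = 1.0 - r
    return p

def class_dependent_dist(r, c):
    """Observed-label distribution when class 0 is flipped specifically to
    class 1 with probability r (a simple class-dependent noise model)."""
    p = np.zeros(c)
    p[0] = 1.0 - r
    p[1] = r
    return p

c = 10
tipping_point = (c - 1) / c  # 0.9 for 10 classes

# Below the tipping point the true class is still the modal label,
# so the Bayes optimal classifier is unaffected by the noise.
assert np.argmax(uniform_noise_dist(0.85, c)) == 0
# Above it, every wrong class is more likely than the true one.
assert np.argmax(uniform_noise_dist(0.95, c)) != 0

# Class-dependent noise: the modal label flips already past 50% noise.
assert np.argmax(class_dependent_dist(0.4, c)) == 0
assert np.argmax(class_dependent_dist(0.6, c)) == 1
```

This only illustrates the two special cases; the feature-dependent case studied in the paper cannot be reduced to a single per-class label distribution, which is why it can break classifiers at much smaller noise scales.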
