On the Robustness of Monte Carlo Dropout Trained with Noisy Labels

03/22/2021
by Purvi Goel, et al.

The memorization effect of deep learning hinders its ability to generalize to the test set when learning with noisy labels. Prior work has found that epistemic uncertainty techniques are more robust to noisy labels than neural networks without uncertainty estimation: they exhibit a prolonged memorization effect and better generalization performance under the adversarial setting of noisy labels. Because of its superior performance among the epistemic uncertainty methods evaluated under noisy labels, we focus on Monte Carlo Dropout (MCDropout) and investigate why it is robust when trained with noisy labels. Through empirical studies on the MNIST, CIFAR-10, and Animal-10n datasets, we examine three aspects of MCDropout under the noisy-label setting: (1) efficacy: understanding the learning behavior and test accuracy of MCDropout when the training set contains artificially generated or naturally embedded label noise; (2) representation volatility: studying the responsiveness of neurons by examining the mean and standard deviation of each neuron's activation; (3) network sparsity: investigating the network support of MCDropout in comparison with deterministic neural networks. Our findings suggest that MCDropout further sparsifies and regularizes the deterministic neural network and thus provides higher robustness against noisy labels.
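To make the setup concrete, the following is a minimal sketch (not the authors' code) of the two ingredients the abstract refers to: injecting artificial symmetric label noise into a training set, and MCDropout inference via multiple stochastic forward passes with dropout left active at test time. The network shape, noise rate, dropout probability, and number of MC samples, as well as the helper names inject_symmetric_noise, MCDropoutMLP, and mc_predict, are illustrative assumptions rather than details taken from the paper.

```python
# Sketch of symmetric label-noise injection and Monte Carlo Dropout inference.
# All hyperparameters below are illustrative, not the paper's settings.
import torch
import torch.nn as nn
import torch.nn.functional as F


def inject_symmetric_noise(labels: torch.Tensor, noise_rate: float, num_classes: int) -> torch.Tensor:
    """Flip each label to a different, uniformly chosen class with probability noise_rate."""
    noisy = labels.clone()
    flip_mask = torch.rand(labels.shape) < noise_rate
    random_labels = torch.randint(0, num_classes, labels.shape)
    # Re-draw (shift) any random label that coincides with the original label.
    same = random_labels == labels
    random_labels[same] = (random_labels[same] + 1) % num_classes
    noisy[flip_mask] = random_labels[flip_mask]
    return noisy


class MCDropoutMLP(nn.Module):
    """Small MLP whose dropout layer is kept stochastic at inference time."""

    def __init__(self, in_dim: int = 784, hidden: int = 256, num_classes: int = 10, p: float = 0.5):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden)
        self.drop = nn.Dropout(p)
        self.fc2 = nn.Linear(hidden, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.fc2(self.drop(F.relu(self.fc1(x))))


@torch.no_grad()
def mc_predict(model: nn.Module, x: torch.Tensor, n_samples: int = 20) -> torch.Tensor:
    """Average softmax outputs over n_samples stochastic forward passes (MCDropout)."""
    model.train()  # keep dropout masks sampled at test time
    probs = torch.stack([F.softmax(model(x), dim=-1) for _ in range(n_samples)])
    return probs.mean(dim=0)
```

Averaging the per-sample softmax outputs gives the MCDropout predictive distribution, while the spread across samples can be read as an epistemic uncertainty estimate; the same per-neuron activations collected over these passes are what a mean/standard-deviation analysis of representation volatility would be computed from.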
