Fighting over-fitting with quantization for learning deep neural networks on noisy labels

03/21/2023
by Gauthier Tallec et al.

The rising performance of deep neural networks is often empirically attributed to an increase in the available computational power, which allows complex models to be trained on large amounts of annotated data. However, increased model complexity makes deploying modern neural networks costly, while gathering such amounts of data with low label noise incurs substantial annotation costs. In this work, we study the ability of compression methods to tackle both of these problems at once. We hypothesize that quantization-aware training, by restricting the expressivity of neural networks, behaves as a regularizer: it may help fight overfitting on noisy data while also allowing the model to be compressed at inference. We first validate this claim in a controlled experiment with manually introduced label noise. We then test the proposed method on Facial Action Unit detection, where labels are typically noisy due to the subtlety of the task. In all cases, our results suggest that quantization significantly improves performance over existing baselines, including both regularization and other compression methods.
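To make the hypothesized mechanism concrete, below is a minimal sketch of quantization-aware training via fake quantization with a straight-through estimator, written in PyTorch. The class names, the per-tensor symmetric quantization scheme, and the 4-bit width are illustrative assumptions, not the authors' exact setup.

import torch
import torch.nn as nn
import torch.nn.functional as F

class FakeQuantize(torch.autograd.Function):
    # Uniform fake quantization with a straight-through estimator (STE):
    # the forward pass snaps weights to a k-bit grid, while the backward
    # pass lets gradients flow through unchanged.
    @staticmethod
    def forward(ctx, w, num_bits):
        qmax = 2 ** (num_bits - 1) - 1
        scale = w.abs().max().clamp(min=1e-8) / qmax
        return torch.round(w / scale).clamp(-qmax - 1, qmax) * scale

    @staticmethod
    def backward(ctx, grad_output):
        # Identity gradient w.r.t. w; no gradient for num_bits.
        return grad_output, None

class QuantLinear(nn.Linear):
    # Linear layer whose weights are fake-quantized at every forward pass,
    # so training only ever explores the restricted low-bit weight grid.
    def __init__(self, in_features, out_features, num_bits=4):
        super().__init__(in_features, out_features)
        self.num_bits = num_bits

    def forward(self, x):
        w_q = FakeQuantize.apply(self.weight, self.num_bits)
        return F.linear(x, w_q, self.bias)

# Usage (hypothetical): swap nn.Linear for QuantLinear and train with the
# usual loss; the bit width sets the strength of the implicit regularization.
model = nn.Sequential(
    QuantLinear(784, 256, num_bits=4), nn.ReLU(),
    QuantLinear(256, 10, num_bits=4),
)

Restricting weights to a coarse grid limits how precisely the network can fit individual (possibly mislabeled) examples, which is the sense in which quantization-aware training acts as a regularizer in the paper's hypothesis.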
