
Improved Mix-up with KL-Entropy for Learning From Noisy Labels

08/15/2019
by Qian Zhang, et al.

Although deep neural networks (DNNs) have achieved excellent performance in image classification research, training DNNs requires a large amount of clean data with accurate annotations. Collecting a dataset is easy, but annotating the collected data is difficult. The web contains a great deal of image data with inaccurate annotations, and training on such data makes networks prone to over-fitting the noisy labels, which degrades performance. In this work, we propose an improved joint optimization framework that combines a mix-up entropy term and a Kullback-Leibler (KL) entropy term in the loss function. The new loss function enables better fine-tuning after the framework updates both the network parameters and the label annotations. We conduct experiments on the CIFAR-10 and Clothing1M datasets. The results show that our approach performs favorably compared with other state-of-the-art methods.
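The abstract does not spell out the exact form of the combined objective, so the following is only a minimal sketch, assuming a PyTorch-style loss in which a mix-up cross-entropy term is added to a KL-divergence term between the network's predictions and the updated soft labels. The function name, argument names, and the kl_weight trade-off are illustrative assumptions, not the authors' released code.

import torch
import torch.nn.functional as F

def mixup_kl_loss(logits_mixed, y_a, y_b, lam, logits_clean, soft_labels, kl_weight=1.0):
    # Mix-up cross-entropy: interpolate the losses for the two mixed label
    # targets, where lam is the Beta-distributed mixing coefficient.
    ce = lam * F.cross_entropy(logits_mixed, y_a) \
         + (1.0 - lam) * F.cross_entropy(logits_mixed, y_b)
    # KL term: pull the predicted distribution on the un-mixed inputs toward
    # the soft labels produced by the framework's label-update step.
    log_probs = F.log_softmax(logits_clean, dim=1)
    kl = F.kl_div(log_probs, soft_labels, reduction="batchmean")
    return ce + kl_weight * kl

In a training loop, a loss of this shape would be computed per batch after the label-update step, with kl_weight tuned on a validation split.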

03/30/2018

Joint Optimization Framework for Learning with Noisy Labels

Deep neural networks (DNNs) trained on large-scale datasets have exhibit...
12/13/2018

Learning to Learn from Noisy Labeled Data

Despite the success of deep neural networks (DNNs) in image classificati...
02/23/2021

Winning Ticket in Noisy Image Classification

Modern deep neural networks (DNNs) become frail when the datasets contai...
03/31/2021

Collaborative Label Correction via Entropy Thresholding

Deep neural networks (DNNs) have the capacity to fit extremely noisy lab...
08/16/2019

Symmetric Cross Entropy for Robust Learning with Noisy Labels

Training accurate deep neural networks (DNNs) in the presence of noisy l...
04/14/2021

Joint Negative and Positive Learning for Noisy Labels

Training of Convolutional Neural Networks (CNNs) with data with noisy la...
12/21/2020

LQF: Linear Quadratic Fine-Tuning

Classifiers that are linear in their parameters, and trained by optimizi...