MetaInfoNet: Learning Task-Guided Information for Sample Reweighting

12/09/2020
by   Hongxin Wei, et al.

Deep neural networks have been shown to easily overfit to biased training data, such as data with label noise or class imbalance. Meta-learning algorithms commonly alleviate this issue through sample reweighting: they learn a meta weighting network that takes training losses as inputs and generates sample weights. In this paper, we advocate that choosing proper inputs for the meta weighting network is crucial for obtaining the desired sample weights in a specific task, and that the training loss is not always the best choice. In view of this, we propose a novel meta-learning algorithm, MetaInfoNet, which automatically learns effective representations as inputs for the meta weighting network by emphasizing task-related information with an information bottleneck strategy. Extensive experimental results on benchmark datasets with label noise or class imbalance validate that MetaInfoNet is superior to many state-of-the-art methods.
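To make the reweighting mechanism described above concrete, the sketch below shows a minimal loss-to-weight meta network of the kind the abstract refers to. It is an illustrative PyTorch assumption, not the authors' released implementation: the class name MetaWeightNet, the single hidden layer, and the hidden size are placeholder choices, and MetaInfoNet would replace the raw loss input with a learned, task-guided representation.

```python
# Minimal sketch (assumed, not the paper's code) of loss-based sample reweighting:
# a small meta network maps each per-sample training loss to a weight in [0, 1].
import torch
import torch.nn as nn

class MetaWeightNet(nn.Module):
    """Maps a per-sample scalar loss to a sample weight in [0, 1]."""
    def __init__(self, hidden: int = 100):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(1, hidden),
            nn.ReLU(),
            nn.Linear(hidden, 1),
            nn.Sigmoid(),
        )

    def forward(self, losses: torch.Tensor) -> torch.Tensor:
        # losses: shape (batch,) -> weights: shape (batch,)
        return self.net(losses.unsqueeze(1)).squeeze(1)

# Usage inside one training step (stand-in data for illustration).
logits = torch.randn(8, 10, requires_grad=True)      # classifier outputs
targets = torch.randint(0, 10, (8,))                 # labels
per_sample_loss = nn.functional.cross_entropy(logits, targets, reduction="none")

meta_net = MetaWeightNet()
weights = meta_net(per_sample_loss.detach())          # weight each sample by its loss
weighted_loss = (weights * per_sample_loss).mean()    # reweighted training objective
weighted_loss.backward()
```

In methods of this kind, the weighting network itself is typically updated with a bi-level (meta) optimization loop; that outer loop is omitted from the sketch for brevity.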

Related research

Delving into Sample Loss Curve to Embrace Noisy and Imbalanced Data (12/30/2021)
CMW-Net: Learning a Class-Aware Sample Weighting Mapping for Robust Deep Learning (02/11/2022)
Dynamic Task Weighting Methods for Multi-task Networks in Autonomous Driving Systems (01/07/2020)
Generalized Data Weighting via Class-level Gradient Manipulation (10/29/2021)
Calibrating for Class Weights by Modeling Machine Learning (05/10/2022)
Do We Really Need Gold Samples for Sample Weighting Under Label Noise? (04/19/2021)
Learning to Rectify for Robust Learning with Noisy Labels (11/08/2021)
