Class-Imbalanced Complementary-Label Learning via Weighted Loss

09/28/2022
by   Meng Wei, et al.

Complementary-label learning (CLL) is a common setting of weakly supervised learning, in which each training sample is annotated only with a class it does not belong to. However, real-world datasets are often class-imbalanced: the number of samples in some classes is significantly lower than in others. Existing CLL approaches have yet to address class imbalance, which degrades prediction accuracy, particularly on the minority classes. In this paper, we formulate a novel problem setting that allows learning from class-imbalanced complementarily labeled samples for multi-class classification. To deal with this problem, we propose a new CLL approach called Weighted Complementary-Label Learning (WCLL). The proposed method minimizes a weighted empirical risk constructed from the class-imbalanced complementarily labeled data and is applicable to multi-class imbalanced training samples. Furthermore, we derive an estimation error bound for the proposed method to provide a theoretical guarantee. Finally, we conduct extensive experiments on widely used benchmark datasets, and the results validate the superiority of our method over existing state-of-the-art methods.
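The abstract does not give the exact form of the WCLL objective, so the snippet below is only a minimal PyTorch sketch of the general idea: an empirical complementary-label risk in which each sample's loss term is reweighted by a per-class weight. The function name weighted_complementary_loss, the particular complementary loss -log(1 - p_ȳ), and the inverse-frequency weighting are illustrative assumptions, not the authors' method.

```python
# Illustrative sketch of a class-weighted complementary-label loss (assumption,
# not the exact WCLL objective from the paper).
import torch
import torch.nn.functional as F

def weighted_complementary_loss(logits, comp_labels, class_weights):
    """logits: (N, K) model outputs; comp_labels: (N,) complementary labels
    (a class each sample does NOT belong to); class_weights: (K,) per-class
    weights, assumed here to be inversely proportional to class frequency."""
    probs = F.softmax(logits, dim=1)                           # p(y | x)
    p_comp = probs.gather(1, comp_labels.unsqueeze(1)).squeeze(1)
    # Penalize probability mass on the complementary class; clamp for stability.
    loss = -torch.log((1.0 - p_comp).clamp_min(1e-12))
    return (class_weights[comp_labels] * loss).mean()

# Example: inverse-frequency weights from complementary-label counts (toy data).
K = 10
comp = torch.randint(0, K, (1000,))
counts = torch.bincount(comp, minlength=K).float()
class_weights = counts.sum() / (K * counts.clamp_min(1.0))
```

Up-weighting the loss terms associated with rare classes is one standard way to keep minority classes from being dominated during empirical risk minimization; the paper's weighting is derived from the class-imbalanced complementary-label information rather than this simple frequency heuristic.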


