
Asymmetric Loss For Multi-Label Classification

09/29/2020
by Emanuel Ben-Baruch, et al.

Pictures of everyday life are inherently multi-label in nature. Hence, multi-label classification is commonly used to analyze their content. In typical multi-label datasets, each picture contains only a few positive labels and many negative ones. This positive-negative imbalance can result in under-emphasizing gradients from positive labels during training, leading to poor accuracy. In this paper, we introduce a novel asymmetric loss ("ASL") that operates differently on positive and negative samples. The loss dynamically down-weights the importance of easy negative samples, causing the optimization process to focus more on the positive samples, and also enables discarding mislabeled negative samples. We demonstrate how ASL leads to a more "balanced" network, with increased average probabilities for positive samples, and show how this balanced network translates to better mAP scores compared to commonly used losses. Furthermore, we offer a method that can dynamically adjust the level of asymmetry throughout the training. With ASL, we reach new state-of-the-art results on three common multi-label datasets, including achieving 86.6% mAP on MS-COCO. We also demonstrate ASL's applicability to other tasks, such as fine-grain single-label classification and object detection. ASL is effective, easy to implement, and does not increase the training time or complexity. Implementation is available at: https://github.com/Alibaba-MIIL/ASL.
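For concreteness, the sketch below illustrates in PyTorch the two mechanisms the abstract describes: asymmetric focusing (a stronger focusing exponent on negatives than on positives) and probability shifting that discards very easy, possibly mislabeled negatives. It is a minimal illustration, not the official implementation from the linked repository, and the hyperparameter values (gamma_pos, gamma_neg, clip) are assumptions chosen for the example.

```python
import torch
import torch.nn as nn


class AsymmetricLossSketch(nn.Module):
    """Illustrative sketch of an asymmetric loss for multi-label classification.

    Negatives get a stronger focusing exponent (gamma_neg > gamma_pos), and the
    probability margin `clip` shifts negative probabilities down so that very
    easy (and possibly mislabeled) negatives contribute no gradient.
    Hyperparameter values are assumptions for illustration only.
    """

    def __init__(self, gamma_pos=0.0, gamma_neg=4.0, clip=0.05, eps=1e-8):
        super().__init__()
        self.gamma_pos = gamma_pos
        self.gamma_neg = gamma_neg
        self.clip = clip
        self.eps = eps

    def forward(self, logits, targets):
        # logits, targets: (batch, num_classes); targets are 0/1 multi-hot labels
        p = torch.sigmoid(logits)

        # Probability shifting: negatives with p <= clip are fully discarded.
        p_neg = (p - self.clip).clamp(min=0)

        # Asymmetric focusing: weak (or no) focusing on positives,
        # strong down-weighting of easy negatives.
        loss_pos = targets * (1 - p).pow(self.gamma_pos) * torch.log(p.clamp(min=self.eps))
        loss_neg = (1 - targets) * p_neg.pow(self.gamma_neg) * torch.log((1 - p_neg).clamp(min=self.eps))

        return -(loss_pos + loss_neg).sum(dim=-1).mean()


# Example usage with random data:
# criterion = AsymmetricLossSketch()
# logits = torch.randn(8, 80)                     # 8 images, 80 classes
# targets = torch.randint(0, 2, (8, 80)).float()  # multi-hot ground truth
# loss = criterion(logits, targets)
```

Setting gamma_pos near zero keeps the full gradient contribution of the rare positive labels, while a large gamma_neg plus the clip margin suppresses the many easy negatives, which is the "balancing" effect the abstract refers to.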



Code Repositories

ASL

Official PyTorch implementation of the "Asymmetric Loss For Multi-Label Classification" (2020) paper: https://github.com/Alibaba-MIIL/ASL