Label-similarity Curriculum Learning

11/15/2019
by Urun Dogan, et al.

Curriculum learning can improve neural network training by guiding the optimization towards desirable optima. We propose a novel curriculum learning approach for image classification that adapts the loss function by changing the label representation. The idea is to use a probability distribution over classes as the target label, where the class probabilities reflect the similarity to the true class. Gradually, this label representation is shifted towards the standard one-hot encoding. That is, in the beginning minor mistakes are penalized less than large mistakes, resembling a teaching process in which broad concepts are explained before subtle differences are taught. The class similarity can be based on prior knowledge. For the special case of the labels being natural words, we propose a generic way to compute the similarities automatically: the words are embedded into Euclidean space using a standard word embedding, and the probability of each class is then a function of the cosine similarity between the vector representations of the class and the true label. The proposed label-similarity curriculum learning (LCL) approach was empirically evaluated with several popular deep learning architectures on the image classification task for three datasets: ImageNet, CIFAR100, and AWA2. In all scenarios, LCL improved the classification accuracy on the test data compared to standard training.
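To make the idea concrete, below is a minimal sketch of how such similarity-based targets could be built and annealed towards one-hot labels. It assumes pre-computed word-embedding vectors for the class names; the softmax temperature and the linear interpolation schedule are illustrative assumptions, not necessarily the paper's exact formulation.

import numpy as np

def soft_targets(class_vectors, true_idx, temperature=0.1):
    # Target distribution from cosine similarity between class-name embeddings.
    # class_vectors: (C, d) array of word-embedding vectors, one per class.
    # The softmax temperature is an illustrative choice.
    v = class_vectors / np.linalg.norm(class_vectors, axis=1, keepdims=True)
    sims = v @ v[true_idx]                      # cosine similarity to the true class
    logits = np.exp(sims / temperature)
    return logits / logits.sum()

def curriculum_target(class_vectors, true_idx, epoch, total_epochs):
    # Interpolate the similarity-based target towards one-hot as training proceeds.
    # The linear schedule is an assumption; the paper only states that the label
    # representation is gradually shifted to the one-hot encoding.
    num_classes = class_vectors.shape[0]
    one_hot = np.eye(num_classes)[true_idx]
    soft = soft_targets(class_vectors, true_idx)
    alpha = min(1.0, epoch / total_epochs)      # 0 -> fully soft, 1 -> one-hot
    return (1 - alpha) * soft + alpha * one_hot

# Usage with random stand-in embeddings (in practice: word vectors of the class names).
rng = np.random.default_rng(0)
targets = curriculum_target(rng.normal(size=(100, 300)), true_idx=3,
                            epoch=10, total_epochs=90)
# `targets` can replace the one-hot label in a standard cross-entropy loss.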

Related research

02/27/2021  Statistical Measures For Defining Curriculum Scoring Function
07/06/2022  A Deep Model for Partial Multi-Label Image Classification with Curriculum Based Disambiguation
06/05/2020  Hierarchical Class-Based Curriculum Loss
12/22/2022  Confidence-Aware Paced-Curriculum Learning by Label Smoothing for Surgical Scene Understanding
11/17/2019  Learning with Hierarchical Complement Objective
09/09/2022  Improving Model Training via Self-learned Label Representations
08/01/2023  Beyond One-Hot-Encoding: Injecting Semantics to Drive Image Classifiers
