Unimodal-Concentrated Loss: Fully Adaptive Label Distribution Learning for Ordinal Regression

04/01/2022
by Qiang Li, et al.

Learning from a label distribution has achieved promising results on ordinal regression tasks such as facial age and head pose estimation, and the concept of adaptive label distribution learning (ALDL) has recently drawn much attention for its theoretical superiority. However, ALDL methods have not outperformed methods that assume a fixed-form label distribution. We argue that existing ALDL algorithms do not fully exploit the intrinsic properties of ordinal regression. In this paper, we summarize three principles that learning an adaptive label distribution for ordinal regression should follow. First, the probability corresponding to the ground truth should be the highest in the label distribution. Second, the probabilities of neighboring labels should decrease as their distance from the ground truth increases, i.e., the distribution should be unimodal. Third, the label distribution should vary from sample to sample, and may even differ between instances sharing the same label, owing to their different levels of difficulty and ambiguity. Following these principles, we propose a novel loss function for fully adaptive label distribution learning, namely the unimodal-concentrated loss. Specifically, the unimodal loss, derived from a learning-to-rank strategy, constrains the distribution to be unimodal. Furthermore, the estimation error and the variance of the predicted distribution for a specific sample are integrated into the proposed concentrated loss, so that the predicted distribution peaks at the ground truth and varies according to the prediction uncertainty. Extensive experiments on typical ordinal regression tasks, including age and head pose estimation, show the superiority of the proposed unimodal-concentrated loss over existing loss functions.
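The abstract does not give the exact formulas, but the two components can be sketched from its description. Below is a minimal, hypothetical NumPy illustration: the unimodal term uses pairwise hinge penalties so probabilities must rise up to the ground-truth label and fall after it (a learning-to-rank-style constraint), and the concentrated term combines the squared estimation error with the predicted distribution's variance in a heteroscedastic-regression-style form. The function names, the hinge formulation, and the error/variance combination are assumptions for illustration, not the paper's actual definitions.

```python
import numpy as np

def unimodal_loss(p, y):
    """Pairwise ranking penalty enforcing a unimodal distribution.

    p : 1-D array of predicted label probabilities (length K).
    y : index of the ground-truth label.
    For i < y we want p[i] <= p[i+1]; for i >= y we want p[i] >= p[i+1].
    Each violated ordering contributes a hinge penalty.
    """
    rising = np.maximum(0.0, p[:y] - p[1:y + 1])      # violations left of the peak
    falling = np.maximum(0.0, p[y + 1:] - p[y:-1])    # violations right of the peak
    return rising.sum() + falling.sum()

def concentrated_loss(p, y):
    """Hypothetical concentrated term: estimation error scaled by the
    distribution's variance, plus a penalty that discourages an overly
    spread-out (high-variance) prediction."""
    labels = np.arange(len(p))
    mean = np.sum(labels * p)                 # expected label under p
    var = np.sum((labels - mean) ** 2 * p)    # variance of the predicted distribution
    return (mean - y) ** 2 / (2.0 * var) + 0.5 * np.log(var)

# A distribution peaked at the ground truth incurs no unimodal penalty:
p_good = np.array([0.1, 0.2, 0.4, 0.2, 0.1])
print(unimodal_loss(p_good, 2))   # 0.0
# A bimodal distribution is penalized:
p_bad = np.array([0.4, 0.1, 0.4, 0.05, 0.05])
print(unimodal_loss(p_bad, 2) > 0)
```

Under this sketch, dividing the squared error by the variance lets ambiguous samples keep a flatter distribution at a lower cost, while the log-variance term keeps confident predictions concentrated, which matches the abstract's stated goal of sample-adaptive distributions.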


