Analysis and Optimization of Loss Functions for Multiclass, Top-k, and Multilabel Classification

12/12/2016
by Maksim Lapin, et al.

Top-k error is currently a popular performance measure on large-scale image classification benchmarks such as ImageNet and Places. Despite its wide acceptance, our understanding of this metric is limited, as most previous research has focused on its special case, the top-1 error. In this work, we explore two directions that shed more light on the top-k error. First, we provide an in-depth analysis of established and recently proposed single-label multiclass methods, along with a detailed account of efficient optimization algorithms for them. Our results indicate that the softmax loss and the smooth multiclass SVM are surprisingly competitive in top-k error uniformly across all k, which can be explained by our analysis of multiclass top-k calibration. Further improvements for a specific k are possible with a number of proposed top-k loss functions. Second, we use the top-k methods to explore the transition from multiclass to multilabel learning. In particular, we find that it is possible to obtain effective multilabel classifiers on Pascal VOC using a single label per image for training, while the gap between multiclass and multilabel methods on MS COCO is more significant. Finally, our contribution of efficient algorithms for training with the considered top-k and multilabel loss functions is of independent interest.
