Training Over-parameterized Models with Non-decomposable Objectives

07/09/2021
by Harikrishna Narasimhan et al.

Many modern machine learning applications come with complex and nuanced design goals such as minimizing the worst-case error, satisfying a given precision or recall target, or enforcing group-fairness constraints. Popular techniques for optimizing such non-decomposable objectives reduce the problem into a sequence of cost-sensitive learning tasks, each of which is then solved by re-weighting the training loss with example-specific costs. We point out that the standard approach of re-weighting the loss to incorporate label costs can produce unsatisfactory results when used to train over-parameterized models. As a remedy, we propose new cost-sensitive losses that extend the classical idea of logit adjustment to handle more general cost matrices. Our losses are calibrated, and can be further improved with distilled labels from a teacher model. Through experiments on benchmark image datasets, we showcase the effectiveness of our approach in training ResNet models with common robust and constrained optimization objectives.
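The core idea mentioned in the abstract, logit adjustment extended to cost-sensitive learning, can be illustrated with a small sketch. This is not the paper's exact loss: the function names, the uniform-prior baseline, and the restriction to a diagonal gain matrix (one gain per class) are simplifying assumptions for illustration. Classical logit adjustment shifts each logit by the log class prior before the softmax cross-entropy; the cost-sensitive variant sketched here instead shifts logits by the negative log gain, so classes with higher gains must be predicted with larger margins.

```python
import math

def logit_adjusted_loss(logits, label, priors, tau=1.0):
    # Classical logit adjustment: shift logit j by tau * log(prior_j)
    # before applying softmax cross-entropy.
    adjusted = [z + tau * math.log(p) for z, p in zip(logits, priors)]
    m = max(adjusted)  # stabilize the log-sum-exp
    log_norm = m + math.log(sum(math.exp(a - m) for a in adjusted))
    return log_norm - adjusted[label]

def cost_sensitive_logit_loss(logits, label, gains):
    # Hypothetical diagonal-cost generalization: shift logit j by
    # -log(gain_j). With gains = 1/priors this recovers the classical
    # logit-adjusted loss; the paper handles general cost matrices.
    adjusted = [z - math.log(g) for z, g in zip(logits, gains)]
    m = max(adjusted)
    log_norm = m + math.log(sum(math.exp(a - m) for a in adjusted))
    return log_norm - adjusted[label]
```

With all gains equal to 1 this reduces to plain softmax cross-entropy, and raising the gain of the true class increases the loss at fixed logits, pushing the model toward a larger margin on that class.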



Related research

- 04/11/2018, "When optimizing nonlinear objectives is no harder than linear objectives": Most systems and learning algorithms optimize average performance or ave...
- 10/12/2021, "Balancing Average and Worst-case Accuracy in Multitask Learning": When training and evaluating machine learning models on a large number o...
- 10/05/2020, "Learning by Minimizing the Sum of Ranked Range": In forming learning objectives, one oftentimes needs to aggregate a set ...
- 07/04/2017, "Robust Optimization for Non-Convex Objectives": We consider robust optimization problems, where the goal is to optimize ...
- 07/28/2020, "Distributionally Robust Losses for Latent Covariate Mixtures": While modern large-scale datasets often consist of heterogeneous subpopu...
- 05/27/2018, "Metric-Optimized Example Weights": Real-world machine learning applications often have complex test metrics...
- 04/28/2023, "Cost-Sensitive Self-Training for Optimizing Non-Decomposable Metrics": Self-training based semi-supervised learning algorithms have enabled the...
