Implicit Rate-Constrained Optimization of Non-decomposable Objectives

by Abhishek Kumar et al.

We consider a popular family of constrained optimization problems arising in machine learning that involve optimizing a non-decomposable evaluation metric with a certain thresholded form, while constraining another metric of interest. Examples of such problems include optimizing the false negative rate at a fixed false positive rate, optimizing precision at a fixed recall, and optimizing the area under the precision-recall or ROC curves. Our key idea is to formulate a rate-constrained optimization that expresses the threshold parameter as a function of the model parameters via the Implicit Function Theorem. We show how the resulting optimization problem can be solved using standard gradient-based methods. Experiments on benchmark datasets demonstrate the effectiveness of our proposed method over existing state-of-the-art approaches for these problems. The code for the proposed method is available at .
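To make the key idea concrete, here is a minimal sketch of the implicit-function-theorem trick for a rate-constrained threshold. It is illustrative only and not the authors' exact formulation: the sigmoid surrogate for the false-positive rate, the temperature `tau`, and the helper names `solve_threshold` and `implicit_grad_theta` are all assumptions. With a smooth constraint g(w, θ) = FPR(w, θ) − target = 0 defining θ(w), the Implicit Function Theorem gives dθ/dw = −(∂g/∂w) / (∂g/∂θ), which can then be chained into any gradient-based training loop.

```python
# Illustrative sketch (assumed surrogate, not the paper's exact method):
# a linear scorer with a threshold theta chosen so that a smoothed
# false-positive rate hits a target, differentiated implicitly.
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def smooth_fpr(w, X_neg, theta, tau=0.1):
    # Smoothed false-positive rate: soft fraction of negative examples
    # whose score x @ w exceeds the threshold theta.
    return sigmoid((X_neg @ w - theta) / tau).mean()

def solve_threshold(w, X_neg, target_fpr, lo=-10.0, hi=10.0):
    # Bisection for theta with smooth_fpr(w, theta) == target_fpr;
    # smooth_fpr is strictly decreasing in theta, so the root is unique.
    for _ in range(60):
        mid = 0.5 * (lo + hi)
        if smooth_fpr(w, X_neg, mid) > target_fpr:
            lo = mid  # FPR still too high -> raise the threshold
        else:
            hi = mid
    return 0.5 * (lo + hi)

def implicit_grad_theta(w, X_neg, theta, tau=0.1):
    # Implicit Function Theorem on g(w, theta) = smooth_fpr - target = 0:
    #   d theta / d w = -(dg/dw) / (dg/dtheta)
    z = (X_neg @ w - theta) / tau
    s = sigmoid(z) * (1.0 - sigmoid(z)) / tau  # sigmoid' of each term
    dg_dw = (s[:, None] * X_neg).mean(axis=0)  # dg/dw (vector)
    dg_dtheta = -s.mean()                      # dg/dtheta (scalar, < 0)
    return -dg_dw / dg_dtheta
```

With dθ/dw in hand, the gradient of any objective L(w, θ(w)) (e.g. a smoothed false negative rate on the positives) is ∂L/∂w + (∂L/∂θ) · dθ/dw, so the constrained problem reduces to unconstrained gradient descent on w alone.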

