Consistent Estimators for Learning to Defer to an Expert

06/02/2020
by Hussein Mozannar et al.

Learning algorithms are often used in conjunction with expert decision makers in practical scenarios; however, this fact is largely ignored when designing these algorithms. In this paper we explore how to learn predictors that can either predict or choose to defer the decision to a downstream expert. Given only samples of the expert's decisions, we give a procedure that jointly learns a classifier and a rejector, and we analyze it theoretically. Our approach is based on a novel reduction to cost-sensitive learning, for which we give a consistent surrogate loss that generalizes the cross-entropy loss. We show the effectiveness of our approach on a variety of experimental tasks.
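The abstract does not spell out the surrogate loss itself; as a rough, hypothetical illustration of the classifier-plus-rejector idea, the sketch below implements one cross-entropy-style loss over K class scores plus an extra "defer" score. The function names and the exact form of the loss are assumptions for illustration, not taken verbatim from the paper.

```python
import numpy as np

def defer_surrogate_loss(scores, y, expert_pred):
    """Cross-entropy-style surrogate for learning to defer (illustrative sketch).

    scores      : array of shape (K + 1,), K class scores plus one extra
                  score for the 'defer' option at index K.
    y           : true label in {0, ..., K-1}.
    expert_pred : the expert's prediction for this example.
    """
    # Softmax over the K class scores and the defer score together.
    z = scores - scores.max()
    p = np.exp(z) / np.exp(z).sum()

    # Always pay cross entropy on the true label; additionally reward the
    # defer output whenever the expert happens to be correct on this example.
    loss = -np.log(p[y])
    if expert_pred == y:
        loss -= np.log(p[-1])
    return loss

def predict_or_defer(scores):
    """Defer when the defer score wins the argmax; otherwise output the class."""
    k = int(np.argmax(scores))
    return "defer" if k == len(scores) - 1 else k
```

At prediction time the model defers exactly when the extra output wins the argmax, which mirrors the classifier/rejector split described above; whether this matches the paper's exact construction should be checked against the full text.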


Related research

Sample Efficient Learning of Predictors that Complement Humans (07/19/2022)
Consistent Algorithms for Multiclass Classification with a Reject Option (05/15/2015)
Efficient Learning of Pinball TWSVM using Privileged Information and its applications (07/14/2021)
Noise-Sampling Cross Entropy Loss: Improving Disparity Regression Via Cost Volume Aware Regularizer (05/18/2020)
Learning to Defer to Multiple Experts: Consistent Surrogate Losses, Confidence Calibration, and Conformal Ensembles (10/30/2022)
Categorical Foundations of Gradient-Based Learning (03/02/2021)
