Omnipredictors for Constrained Optimization

09/15/2022
by Lunjia Hu, et al.

The notion of omnipredictors (Gopalan, Kalai, Reingold, Sharan and Wieder, ITCS 2021) suggested a new paradigm for loss minimization. Rather than learning a predictor tailored to a known loss function, an omnipredictor can easily be post-processed to minimize any one of a rich family of loss functions, with loss comparable to that of a class C. It has been shown that such omnipredictors exist and are implied (for all convex and Lipschitz loss functions) by the notion of multicalibration from the algorithmic fairness literature. Nevertheless, it is often the case that the action selected must obey additional constraints (such as capacity or parity constraints). By itself, the original notion of omnipredictors does not apply in this well-motivated and heavily studied context of constrained loss minimization. In this paper, we introduce omnipredictors for constrained optimization and study their complexity and implications. The notion we introduce allows the learner to be unaware of the loss function that will later be assigned, as well as of the constraints that will later be imposed, as long as the subpopulations used to define these constraints are known. The paper shows how to obtain omnipredictors for constrained optimization problems, relying on appropriate variants of multicalibration. For some interesting constraints and general loss functions, and for general constraints and some interesting loss functions, we show how omnipredictors are implied by a variant of multicalibration that is similar in complexity to standard multicalibration. We demonstrate that in the general case standard multicalibration is insufficient, and show that omnipredictors are implied by multicalibration with respect to a class containing all the level sets of hypotheses in C. We also investigate the implications when the constraints are group fairness notions.
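The post-processing idea behind omnipredictors can be illustrated with a toy sketch (this is not the paper's construction; the `post_process` helper, the two losses, and the capacity rule below are all hypothetical choices for illustration): given calibrated predictions p ≈ P[y = 1 | x], the optimal action for any loss ℓ is the one minimizing the expected loss p·ℓ(a, 1) + (1 − p)·ℓ(a, 0), and a capacity constraint can be imposed after the fact.

```python
import numpy as np

# Hypothetical calibrated predictions p_i ≈ P[y_i = 1 | x_i]
preds = np.array([0.1, 0.4, 0.6, 0.9])

def post_process(p, loss, actions):
    """For each prediction p_i, pick the action minimizing the
    expected loss p_i*loss(a, 1) + (1 - p_i)*loss(a, 0)."""
    return np.array([
        min(actions, key=lambda a: p_i * loss(a, 1) + (1 - p_i) * loss(a, 0))
        for p_i in p
    ])

# Squared loss: the optimal action is the prediction itself
sq = lambda a, y: (a - y) ** 2
grid = np.linspace(0, 1, 101)
print(post_process(preds, sq, grid))      # ≈ preds

# Binary decision with asymmetric costs (missing a positive costs 3x):
# post-processing reduces to thresholding at p = 1/4
asym = lambda a, y: (3 if y == 1 else 1) * (a != y)
print(post_process(preds, asym, [0, 1]))

# A capacity constraint (at most k individuals receive action 1) can
# also be imposed after the fact: select the k highest predictions.
k = 2
constrained = np.zeros_like(preds, dtype=int)
constrained[np.argsort(-preds)[:k]] = 1
print(constrained)
```

The point of the sketch is that the same predictions `preds` serve every loss and every capacity level: only the cheap post-processing step changes, which is the behavior the omnipredictor definition asks for.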


