Heteroscedasticity-aware residuals-based contextual stochastic optimization

01/08/2021
by Rohit Kannan, et al.

We explore generalizations of some integrated learning and optimization frameworks for data-driven contextual stochastic optimization that can adapt to heteroscedasticity. We identify conditions on the stochastic program, the data generation process, and the prediction setup under which these generalizations possess asymptotic and finite-sample guarantees for a class of stochastic programs, including two-stage stochastic mixed-integer programs with continuous recourse. We verify that our assumptions hold for popular parametric and nonparametric regression methods.
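The residuals-based framework the abstract describes can be illustrated with a small sketch: fit a regression of the outcome on the covariates, estimate a covariate-dependent noise scale so the residuals become approximately homoscedastic, then rescale those residuals at a new covariate value to form scenarios for a sample average approximation (SAA). All model choices below (linear mean, log-linear scale model, the toy newsvendor objective) are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical heteroscedastic data: Y = f(X) + sigma(X) * eps,
# where the noise scale sigma grows with the covariate.
n = 200
X = rng.uniform(0.5, 2.0, size=(n, 1))
Y = (2.0 + 3.0 * X[:, 0]) + (0.5 * X[:, 0]) * rng.standard_normal(n)

# Step 1: fit a linear regression for the conditional mean.
A = np.hstack([np.ones((n, 1)), X])
beta, *_ = np.linalg.lstsq(A, Y, rcond=None)
raw_resid = Y - A @ beta

# Step 2: estimate the conditional scale by regressing log|residual| on X
# (a simple heteroscedasticity model chosen for illustration).
gamma, *_ = np.linalg.lstsq(A, np.log(np.abs(raw_resid) + 1e-8), rcond=None)
std_resid = raw_resid / np.exp(A @ gamma)   # roughly homoscedastic residuals

# Step 3: at a new covariate x0, build scenarios mean(x0) + scale(x0) * resid_i.
a0 = np.array([[1.0, 1.5]])
scenarios = (a0 @ beta) + np.exp(a0 @ gamma) * std_resid

# Step 4: solve a toy newsvendor problem by SAA: minimize E[c*q - p*min(q, Y)]
# over the order quantity q; the SAA optimum is the (1 - c/p) scenario quantile.
c, p = 1.0, 3.0
q_star = np.quantile(scenarios, 1.0 - c / p)
```

The key heteroscedasticity-aware step is the rescaling in Steps 2 and 3: plain residuals-based SAA would reuse the raw residuals directly, which mixes noise levels from different covariate regions.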


Related research

07/27/2022  Data-Driven Sample Average Approximation with Covariate Information
We study optimization for data-driven decision-making when we have obser...

06/12/2020  Stochastic Optimization for Performative Prediction
In performative prediction, the choice of a model influences the distrib...

07/15/2021  Learning Mixed-Integer Linear Programs from Contextual Examples
Mixed-integer linear programs (MILPs) are widely used in artificial inte...

04/13/2023  Estimate-Then-Optimize Versus Integrated-Estimation-Optimization: A Stochastic Dominance Perspective
In data-driven stochastic optimization, model parameters of the underlyi...

10/08/2020  Emergent Jaw Predominance in Vocal Development through Stochastic Optimization
Infant vocal babbling strongly relies on jaw oscillations, especially at...

12/07/2019  Parameterized Algorithms for MILPs with Small Treedepth
Solving (mixed) integer linear programs, (M)ILPs for short, is a fundame...

02/13/2018  Superposition-Assisted Stochastic Optimization for Hawkes Processes
We consider the learning of multi-agent Hawkes processes, a model contai...
