Higher-Order Expansion and Bartlett Correctability of Distributionally Robust Optimization

08/11/2021
by Shengyi He, et al.

Distributionally robust optimization (DRO) is a worst-case framework for stochastic optimization under uncertainty that has attracted fast-growing attention in recent years. When the underlying probability distribution is unknown and only observed through data, DRO suggests computing the worst-case distribution within a so-called uncertainty set that captures the involved statistical uncertainty. In particular, DRO with an uncertainty set constructed as a statistical-divergence neighborhood ball has been shown to provide a tool for constructing valid confidence intervals for nonparametric functionals, and bears a duality with the empirical likelihood (EL). In this paper, we show how adjusting the ball size of this type of DRO can reduce higher-order coverage errors, in a fashion similar to the Bartlett correction. Our correction, which applies to general von Mises differentiable functionals, goes beyond the existing EL literature, which focuses only on smooth function models or M-estimation. Moreover, we demonstrate a higher-order "self-normalizing" property of DRO regardless of the choice of divergence. Our approach builds on the development of a higher-order expansion of DRO, which is obtained through an asymptotic analysis of a fixed-point equation arising from the Karush-Kuhn-Tucker conditions.
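To make the divergence-ball worst case concrete (this is a generic illustration, not the paper's correction procedure), the sketch below computes the worst-case mean over a KL-divergence ball of radius rho around the empirical distribution, using the well-known convex dual sup_{KL(Q||P_n) <= rho} E_Q[X] = inf_{lam > 0} { lam*rho + lam*log E_{P_n}[exp(X/lam)] }. The function name and the ternary-search tolerance are our own choices.

```python
import math


def kl_dro_worst_case_mean(samples, rho, lam_lo=1e-6, lam_hi=1e4, iters=200):
    """Worst-case mean over a KL-divergence ball of radius rho around
    the empirical distribution of `samples`, via the convex dual

        inf_{lam > 0}  lam*rho + lam * log( (1/n) * sum_i exp(x_i / lam) ).

    The dual objective is convex in lam, so a ternary search suffices.
    This is a minimal sketch, not the paper's Bartlett-type correction.
    """
    n = len(samples)
    xmax = max(samples)

    def dual(lam):
        # Shift by xmax/lam for numerical stability (log-sum-exp trick);
        # exponents are then <= 0, so exp() underflows harmlessly to 0.
        s = sum(math.exp((x - xmax) / lam) for x in samples)
        return lam * rho + xmax + lam * math.log(s / n)

    lo, hi = lam_lo, lam_hi
    for _ in range(iters):
        m1 = lo + (hi - lo) / 3.0
        m2 = hi - (hi - lo) / 3.0
        if dual(m1) < dual(m2):
            hi = m2
        else:
            lo = m1
    return dual((lo + hi) / 2.0)


# Example: the worst case exceeds the empirical mean and grows with rho,
# but stays below the sample maximum for a finite ball radius.
data = [1.0, 2.0, 3.0, 4.0]
print(kl_dro_worst_case_mean(data, 0.1))
```

As rho shrinks to zero the ball collapses onto the empirical distribution and the worst-case value converges to the plain empirical mean; choosing rho to calibrate the resulting interval is exactly the ball-size adjustment the abstract refers to.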


Related research

- 10/11/2016: Statistics of Robust Optimization: A Generalized Empirical Likelihood Approach
- 11/02/2022: Higher order approximation for constructing confidence intervals in time series
- 05/31/2021: Conformal Uncertainty Sets for Robust Optimization
- 06/21/2019: Guaranteed Validity for Empirical Approaches to Adaptive Data Analysis
- 01/14/2020: A Higher-Order Correct Fast Moving-Average Bootstrap for Dependent Data
- 04/16/2018: On the Equivalence of f-Divergence Balls and Density Bands in Robust Detection
- 07/22/2020: Robust Machine Learning via Privacy/Rate-Distortion Theory
