
Distributionally robust tail bounds based on Wasserstein distance and f-divergence

by Corina Birghila, et al.

In this work, we provide robust bounds on the tail probabilities and the tail index of heavy-tailed distributions in the context of model misspecification. These bounds are defined as the optimal value of the worst-case tail behavior, computed over all models within some neighborhood of the reference model. The choice of discrepancy measure used to build this neighborhood plays a crucial role in determining the size of the asymptotic bounds. We evaluate the robust tail behavior in ambiguity sets based on the Wasserstein distance and the Csiszár f-divergence, and we obtain explicit expressions for the corresponding asymptotic bounds. In an application to Danish fire insurance claims, we compare these bounds and demonstrate the importance of the choice of discrepancy measure.
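The worst-case tail probability described in the abstract can be sketched schematically as follows; the symbols $P$, $d$, and $\delta$ are generic placeholders for the reference model, the discrepancy measure, and the ambiguity radius, and need not match the paper's exact notation:

```latex
\[
  \overline{F}_{\delta}(x) \;=\; \sup_{Q \,:\, d(Q,\,P) \,\le\, \delta} Q\bigl( X > x \bigr),
\]
```

Here $d$ is either the Wasserstein distance or a Csiszár $f$-divergence, and $\delta > 0$ is the radius of the ambiguity set around the reference model $P$. The robust asymptotic bounds then follow from the behavior of $\overline{F}_{\delta}(x)$ as $x \to \infty$.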
