Distributionally Robust Optimization with Correlated Data from Vector Autoregressive Processes

09/08/2019
by Xialiang Dou, et al.

We present a distributionally robust formulation of a stochastic optimization problem for non-i.i.d. vector autoregressive data. We use the Wasserstein distance to define robustness in the space of distributions, and we show, using duality theory, that the problem is equivalent to a finite convex-concave saddle-point problem. The performance of the method is demonstrated on both synthetic and real data.
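To make the ingredients of the abstract concrete, the following is a minimal sketch, not the paper's actual dual reformulation: it generates correlated VAR(1) data and approximates robustness over a Wasserstein-type ball with a penalized data perturbation, giving a convex-concave saddle point solved by simultaneous gradient descent-ascent. The penalty weight `lam`, step size `eta`, and all variable names are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic VAR(1) process: x_{t+1} = A_true @ x_t + noise (correlated, non-i.i.d. data)
d, T = 3, 200
A_true = 0.5 * np.eye(d)
X = np.zeros((T, d))
for t in range(T - 1):
    X[t + 1] = A_true @ X[t] + 0.1 * rng.standard_normal(d)
inputs, targets = X[:-1], X[1:]
n = len(inputs)

# Penalized adversarial least squares, a crude surrogate for a Wasserstein ball:
#   min_A  max_D  (1/n) * ( ||(inputs + D) @ A.T - targets||_F^2 - lam * ||D||_F^2 )
# The inner max perturbs the observed data; lam plays the role of the dual
# multiplier on the transport budget.
lam = 5.0   # perturbation penalty (illustrative choice)
eta = 0.5   # step size for both players
A = np.zeros((d, d))        # model parameter (minimizing player)
D = np.zeros_like(inputs)   # data perturbation (maximizing player)
for _ in range(2000):
    Z = inputs + D
    R = Z @ A.T - targets                        # residuals
    grad_A = 2.0 * R.T @ Z / n                   # d(objective)/dA
    grad_D = (2.0 * R @ A - 2.0 * lam * D) / n   # d(objective)/dD
    A -= eta * grad_A   # descent on the model
    D += eta * grad_D   # ascent on the perturbation

print(np.round(A, 2))  # robust estimate of the VAR transition matrix
```

Because the inner objective is strongly concave in `D` for this `lam`, simple simultaneous gradient steps converge; the recovered transition matrix should sit near `A_true`, mildly shrunk by the adversary. The paper instead derives an exact finite convex-concave reformulation via duality rather than this penalty heuristic.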


