First-Order Methods for Wasserstein Distributionally Robust MDP

09/14/2020
by   Julien Grand-Clément, et al.

Markov Decision Processes (MDPs) are known to be sensitive to parameter specification. Distributionally robust MDPs alleviate this issue through ambiguity sets, which specify a set of possible distributions over the model parameters. The goal is to find an optimal policy with respect to the worst-case parameter distribution. We propose a first-order methods framework for solving distributionally robust MDPs and instantiate it for several types of Wasserstein ambiguity sets. By developing efficient proximal updates, our algorithms achieve a convergence rate of O(N A^2.5 S^3.5 log(S) log(ε^-1) ε^-1.5), where N is the number of kernels in the support of the nominal distribution, S is the number of states, and A is the number of actions (the rate varies slightly with the Wasserstein setup). Our dependence on N, A, and S is significantly better than that of existing methods; compared to Value Iteration, it is better by a factor of O(N^2.5 A S). Numerical experiments on random instances and on instances inspired by a machine replacement example show that our algorithm is significantly more scalable than state-of-the-art approaches.
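For intuition only, the sketch below implements a plain robust value iteration over the N kernels in the support of the nominal distribution. Taking the minimum over that finite support is what the worst case reduces to when the ambiguity set is unconstrained; the sketch omits the Wasserstein radius and the proximal first-order updates developed in the paper, and the names (robust_value_iteration, P_list, R) are illustrative, not taken from the paper's code.

```python
import numpy as np

def robust_value_iteration(P_list, R, gamma=0.95, tol=1e-6, max_iter=1000):
    """Baseline robust value iteration over a finite set of candidate kernels.

    P_list : list of N transition kernels, each of shape (S, A, S)
    R      : reward array of shape (S, A)
    Returns the robust value function and a greedy policy.
    """
    S, A = R.shape
    V = np.zeros(S)
    for _ in range(max_iter):
        # Q-values under each candidate kernel: shape (N, S, A)
        Q_all = np.stack([R + gamma * P.dot(V) for P in P_list])
        # Adversarial nature picks the worst kernel, the agent the best action
        Q_robust = Q_all.min(axis=0)   # (S, A)
        V_new = Q_robust.max(axis=1)   # (S,)
        if np.max(np.abs(V_new - V)) < tol:
            V = V_new
            break
        V = V_new
    policy = Q_robust.argmax(axis=1)
    return V, policy
```

This recursion costs O(N S^2 A) per iteration; the paper's contribution is a first-order scheme whose per-iteration work and overall dependence on N, S, and A are much smaller once the Wasserstein constraint is handled through efficient proximal updates.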
