
Risk-Averse Stochastic Shortest Path Planning

by Mohamadreza Ahmadi et al.

We consider the stochastic shortest path planning problem in MDPs, i.e., the problem of designing policies that ensure reaching a goal state from a given initial state with minimum accrued cost. To account for rare but important realizations of the system, we consider a nested dynamic coherent risk total cost functional rather than the conventional risk-neutral total expected cost. Under some assumptions, we show that optimal, stationary, Markovian policies exist and can be found via a special Bellman's equation. We propose a computational technique based on difference convex programs (DCPs) to find the associated value functions and therefore the risk-averse policies. A rover navigation MDP is used to illustrate the proposed methodology with the conditional value-at-risk (CVaR) and entropic value-at-risk (EVaR) coherent risk measures.
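The abstract's two coherent risk measures can be illustrated concretely. Below is a minimal sketch (not from the paper) of the standard empirical estimators: CVaR via the Rockafellar–Uryasev representation CVaR_α(X) = min_z { z + E[(X − z)⁺]/(1 − α) }, and EVaR via its dual form EVaR_α(X) = inf_{t>0} (1/t) log(E[e^{tX}]/(1 − α)), applied to a sample of accrued costs (higher cost = worse). The function names and the choice to optimise over log t are illustrative assumptions, not the paper's DCP-based method.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def cvar(costs, alpha):
    """Empirical CVaR_alpha of a cost sample.

    Uses the Rockafellar-Uryasev form, whose minimizer z* is the
    value-at-risk (the alpha-quantile of the costs)."""
    var = np.quantile(costs, alpha)
    return var + np.mean(np.maximum(costs - var, 0.0)) / (1.0 - alpha)

def evar(costs, alpha):
    """Empirical EVaR_alpha via its dual representation.

    Minimises (log E[exp(t X)] - log(1 - alpha)) / t over t > 0;
    optimising over log t keeps t strictly positive."""
    c_max = costs.max()

    def objective(log_t):
        t = np.exp(log_t)
        # log-sum-exp shift for numerical stability at large t
        lse = np.log(np.mean(np.exp(t * (costs - c_max)))) + t * c_max
        return (lse - np.log(1.0 - alpha)) / t

    res = minimize_scalar(objective, bounds=(-10.0, 5.0), method="bounded")
    return res.fun
```

For any cost distribution, E[X] ≤ CVaR_α(X) ≤ EVaR_α(X) ≤ ess sup X, which makes EVaR the more conservative (more risk-averse) of the two measures used in the paper's rover example.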




Lexicographic Optimisation of Conditional Value at Risk and Expected Value for Risk-Averse Planning in MDPs

Planning in Markov decision processes (MDPs) typically optimises the exp...

Risk-aware Stochastic Shortest Path

We treat the problem of risk-aware control for stochastic shortest path ...

Hierarchical Constrained Stochastic Shortest Path Planning via Cost Budget Allocation

Stochastic sequential decision making often requires hierarchical struct...

A Theory of Goal-Oriented MDPs with Dead Ends

Stochastic Shortest Path (SSP) MDPs is a problem class widely studied in...

Risk-Averse Decision Making Under Uncertainty

A large class of decision making under uncertainty problems can be descr...

Finding Risk-Averse Shortest Path with Time-dependent Stochastic Costs

In this paper, we tackle the problem of risk-averse route planning in a ...

Stochastic Shortest Path with Energy Constraints in POMDPs

We consider partially observable Markov decision processes (POMDPs) with...