Verification of Markov Decision Processes with Risk-Sensitive Measures

02/28/2018
by Murat Cubuktepe et al.

We develop a method for computing policies in Markov decision processes with risk-sensitive measures subject to temporal logic constraints. Specifically, we use a particular risk-sensitive measure from cumulative prospect theory, which has previously been adopted in psychology and economics. The nonlinear transformation of the probabilities and utility functions yields a nonlinear programming problem, which typically makes computing optimal policies challenging. We show that this nonlinear weighting function can be accurately approximated by the difference of two convex functions. This observation enables efficient policy computation using convex-concave programming. We demonstrate the effectiveness of the approach on several scenarios.
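To make the key step concrete, the following is a minimal numerical sketch (not the paper's implementation) of how a cumulative-prospect-theory probability-weighting function can be written as a difference of two convex functions, which is what the convex-concave programming approach exploits. The Tversky-Kahneman weighting form with gamma = 0.61, the quadratic-shift constant lam, and all names below are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def cpt_weight(p, gamma=0.61):
    """Tversky-Kahneman probability weighting: w(p) = p^g / (p^g + (1-p)^g)^(1/g)."""
    return p**gamma / (p**gamma + (1.0 - p)**gamma) ** (1.0 / gamma)

# Work on a truncated interval, since the curvature of w blows up at p = 0 and p = 1.
p = np.linspace(1e-3, 1.0 - 1e-3, 1000)
w = cpt_weight(p)

# Numerically estimate the second derivative of w on the grid.
w_second = np.gradient(np.gradient(w, p), p)

# One standard DC construction: for any smooth f with bounded curvature,
#   f = g - h,   g(p) = f(p) + (lam/2) * p^2,   h(p) = (lam/2) * p^2,
# and both g and h are convex once lam >= max(-f'').
lam = max(0.0, -w_second.min()) * 1.1   # 10% safety margin (illustrative choice)
g = w + 0.5 * lam * p**2                # convex part
h = 0.5 * lam * p**2                    # convex part that gets subtracted

# Sanity checks: g should have nonnegative curvature, and g - h should recover w.
g_second = np.gradient(np.gradient(g, p), p)
print("min curvature of g:", g_second.min())               # >= 0 up to numerical noise
print("max |(g - h) - w| :", np.abs((g - h) - w).max())    # ~ 0
```

A convex-concave program can then treat the convex part g directly and linearize the subtracted convex part h at the current iterate, which is how convex-concave programming typically proceeds.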

Related research

12/04/2020  Constrained Risk-Averse Markov Decision Processes
We consider the problem of designing policies for Markov decision proces...

07/04/2019  Markov Decision Processes under Ambiguity
We consider statistical Markov Decision Processes where the decision mak...

07/09/2019  A Scheme for Dynamic Risk-Sensitive Sequential Decision Making
We present a scheme for sequential decision making with a risk-sensitive...

02/14/2012  Iterated risk measures for risk-sensitive Markov decision processes with discounted cost
We demonstrate a limitation of discounted expected utility, a standard a...

02/26/2018  Optimizing over a Restricted Policy Class in Markov Decision Processes
We address the problem of finding an optimal policy in a Markov decision...

03/29/2017  Optimal Policies for Observing Time Series and Related Restless Bandit Problems
The trade-off between the cost of acquiring and processing data, and unc...

01/21/2022  On probability-raising causality in Markov decision processes
The purpose of this paper is to introduce a notion of causality in Marko...
