Regret Bounds for Markov Decision Processes with Recursive Optimized Certainty Equivalents

01/30/2023
by Wenhao Xu, et al.

The optimized certainty equivalent (OCE) is a family of risk measures that covers important examples such as entropic risk, conditional value-at-risk (CVaR), and the mean-variance model. In this paper, we propose a new episodic risk-sensitive reinforcement learning formulation based on tabular Markov decision processes with recursive OCEs. We design an efficient learning algorithm for this problem based on value iteration and upper confidence bounds. We derive an upper bound on the regret of the proposed algorithm and also establish a minimax lower bound. Our bounds show that the regret rate achieved by the proposed algorithm has optimal dependence on the number of episodes and the number of actions.
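The abstract leaves the OCE undefined, so as background: for a concave utility u with u(0) = 0 and 1 in the subdifferential of u at 0, the static OCE of a reward X is OCE_u(X) = sup_{λ} { λ + E[u(X − λ)] }, and the entropic, CVaR, and mean-variance examples correspond to particular choices of u. The sketch below is a minimal numerical check of this static definition only; it is not the paper's recursive OCE formulation or its value-iteration/UCB algorithm, and the function names and test distribution are assumptions made purely for illustration.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def oce(samples, u):
    """Empirical optimized certainty equivalent:
    OCE_u(X) = sup_lambda { lambda + E[u(X - lambda)] },
    evaluated on a sample of X for a concave utility u."""
    samples = np.asarray(samples, dtype=float)

    def neg_objective(lam):
        return -(lam + np.mean(u(samples - lam)))

    # The maximizing lambda lies within the range of X, so a bounded search suffices.
    lo, hi = samples.min() - 1.0, samples.max() + 1.0
    res = minimize_scalar(neg_objective, bounds=(lo, hi), method="bounded")
    return -res.fun

# Utilities recovering the three examples named in the abstract (illustrative).
def entropic_utility(gamma):
    return lambda t: (1.0 - np.exp(-gamma * t)) / gamma   # entropic risk

def cvar_utility(alpha):
    return lambda t: -np.maximum(-t, 0.0) / alpha         # conditional value-at-risk

def mean_variance_utility(c):
    return lambda t: t - c * t**2                         # mean-variance

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    x = rng.normal(loc=1.0, scale=0.5, size=200_000)      # toy reward sample

    gamma = 2.0
    print("numerical OCE:", oce(x, entropic_utility(gamma)))
    # Closed-form entropic risk -(1/gamma) * log E[exp(-gamma X)] for comparison.
    print("closed form  :", -np.log(np.mean(np.exp(-gamma * x))) / gamma)
```

For the entropic utility u(t) = (1 − e^{−γt})/γ, the numerical value should agree with the closed form −(1/γ) log E[e^{−γX}]; for the mean-variance utility u(t) = t − c t², the OCE equals E[X] − c Var(X).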


Related research

- Square-root regret bounds for continuous-time episodic Markov decision processes (10/03/2022): We study reinforcement learning for continuous-time Markov decision proc...
- Near-Minimax-Optimal Risk-Sensitive Reinforcement Learning with CVaR (02/07/2023): In this paper, we study risk-sensitive Reinforcement Learning (RL), focu...
- Regret Bounds for Risk-sensitive Reinforcement Learning with Lipschitz Dynamic Risk Measures (06/04/2023): We study finite episodic Markov decision processes incorporating dynamic...
- Concentration bounds for SSP Q-learning for average cost MDPs (06/07/2022): We derive a concentration bound for a Q-learning algorithm for average c...
- Contextual Markov Decision Processes using Generalized Linear Models (03/14/2019): We consider the recently proposed reinforcement learning (RL) framework ...
- Stochastic Approximation for Risk-aware Markov Decision Processes (05/11/2018): In this paper, we develop a stochastic approximation type algorithm to s...
- Optimistic Value Iteration (10/02/2019): Markov decision processes are widely used for planning and verification ...
