A Method for Speeding Up Value Iteration in Partially Observable Markov Decision Processes

01/23/2013
by Nevin Lianwen Zhang, et al.

We present a technique for speeding up the convergence of value iteration for partially observable Markov decision processes (POMDPs). The underlying idea is similar to that behind modified policy iteration for fully observable Markov decision processes (MDPs). The technique can be easily incorporated into any existing POMDP value iteration algorithm. Experiments were conducted on several test problems with one POMDP value iteration algorithm, incremental pruning. We find that the technique can make incremental pruning run several orders of magnitude faster.
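For intuition, the sketch below illustrates the idea the abstract refers to, modified policy iteration for a fully observable MDP, where a few cheap fixed-policy backups are interleaved between full Bellman optimality backups. This is only a minimal illustration of the analogous MDP technique, not the paper's POMDP algorithm; the function name, the tensor layout of P and R, and the parameter k are assumptions made for this example.

```python
import numpy as np

def modified_policy_iteration(P, R, gamma=0.95, k=20, tol=1e-6, max_iter=1000):
    """Illustrative modified policy iteration for a fully observable MDP.

    Assumed inputs (not from the paper):
      P: transition tensor of shape (A, S, S), P[a, s, s'] = Pr(s' | s, a)
      R: reward matrix of shape (A, S), R[a, s] = expected reward of action a in state s
      k: number of cheap fixed-policy backups between greedy improvements
    """
    A, S, _ = P.shape
    V = np.zeros(S)
    for _ in range(max_iter):
        # Full Bellman optimality backup: Q[a, s] = R[a, s] + gamma * sum_t P[a, s, t] V[t]
        Q = R + gamma * np.einsum("ast,t->as", P, V)
        V_new = Q.max(axis=0)
        if np.max(np.abs(V_new - V)) < tol:
            return V_new
        # Partial policy evaluation: k cheaper backups under the fixed greedy policy
        pi = Q.argmax(axis=0)
        R_pi = R[pi, np.arange(S)]          # reward under the greedy policy
        P_pi = P[pi, np.arange(S), :]       # transition matrix under the greedy policy
        V = V_new
        for _ in range(k):
            V = R_pi + gamma * P_pi @ V
    return V
```

With k = 0 this reduces to standard value iteration; larger k spends more effort on the cheap fixed-policy backups and typically reduces the number of expensive optimality backups needed, which is the trade-off the paper exploits in the POMDP setting.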


Related research

06/01/2011 · Speeding Up the Convergence of Value Iteration in Partially Observable Markov Decision Processes
Partially observable Markov decision processes (POMDPs) have recently be...

07/06/2017 · Efficient Strategy Iteration for Mean Payoff in Markov Decision Processes
Markov decision processes (MDPs) are standard models for probabilistic s...

07/16/2022 · ChronosPerseus: Randomized Point-based Value Iteration with Importance Sampling for POSMDPs
In reinforcement learning, agents have successfully used environments mo...

11/30/2015 · Scaling POMDPs For Selecting Sellers in E-markets-Extended Version
In multiagent e-marketplaces, buying agents need to select good sellers ...

07/11/2012 · Region-Based Incremental Pruning for POMDPs
We present a major improvement to the incremental pruning algorithm for ...

12/12/2012 · Polynomial Value Iteration Algorithms for Deterministic MDPs
Value iteration is a commonly used and empirically competitive method in...

01/29/2021 · Optimistic Policy Iteration for MDPs with Acyclic Transient State Structure
We consider Markov Decision Processes (MDPs) in which every stationary p...
