On Solving a Stochastic Shortest-Path Markov Decision Process as Probabilistic Inference

09/13/2021
by Mohamed Baioumy, et al.

Previous work on planning as active inference addresses finite-horizon problems and solutions valid only for online planning. We propose solving the general Stochastic Shortest-Path Markov Decision Process (SSP MDP), in which the horizon is indefinite and unknown a priori, as probabilistic inference. SSP MDPs generalize finite- and infinite-horizon MDPs and are widely used in the artificial intelligence community. Furthermore, we discuss online and offline methods for planning under uncertainty, and highlight differences between solving an MDP with the dynamic programming approaches common in the artificial intelligence community and with the approaches used in the active inference community.
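To make the problem class concrete, the following is a minimal sketch of the standard dynamic programming solution (value iteration) for a toy SSP MDP. The states, actions, costs, and transition probabilities are invented for illustration; in an SSP MDP the objective is to minimize expected cumulative cost until an absorbing goal state is reached, with no fixed horizon.

```python
# Toy SSP MDP: states 0 and 1 are non-goal, state 2 is the absorbing goal.
# transitions[s][a] = list of (next_state, probability, cost).
# All quantities here are invented for illustration.
transitions = {
    0: {"safe":  [(1, 1.0, 2.0)],                 # deterministic, cost 2
        "risky": [(2, 0.5, 1.0), (0, 0.5, 1.0)]}, # may jump to goal or stay
    1: {"go":    [(2, 1.0, 1.0)]},
}
GOAL = 2

def value_iteration(transitions, goal, eps=1e-8):
    """Gauss-Seidel value iteration minimizing expected cost-to-goal."""
    V = {s: 0.0 for s in transitions}
    V[goal] = 0.0  # cost-to-go at the goal is zero by definition
    while True:
        delta = 0.0
        for s, actions in transitions.items():
            # Bellman backup: V(s) = min_a sum_{s'} P(s'|s,a) [c(s,a,s') + V(s')]
            q_values = [sum(p * (c + V.get(s2, 0.0)) for s2, p, c in outs)
                        for outs in actions.values()]
            new_v = min(q_values)
            delta = max(delta, abs(new_v - V[s]))
            V[s] = new_v
        if delta < eps:
            return V

V = value_iteration(transitions, GOAL)
```

Here the optimal cost from state 0 satisfies the fixed point V(0) = 0.5·1 + 0.5·(1 + V(0)), giving V(0) = 2; the "risky" action dominates the "safe" route of cost 3. The indefinite horizon shows up as the data-dependent number of backups until convergence, in contrast to a finite-horizon MDP where the number of backups is fixed in advance.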

