Learning is planning: near Bayes-optimal reinforcement learning via Monte-Carlo tree search

02/14/2012
by John Asmuth et al.

Bayes-optimal behavior, while well-defined, is often difficult to achieve. Recent advances in the use of Monte-Carlo tree search (MCTS) have shown that it is possible to act near-optimally in Markov Decision Processes (MDPs) with very large or infinite state spaces. Bayes-optimal behavior in an unknown MDP is equivalent to optimal behavior in the known belief-space MDP, although the size of this belief-space MDP grows exponentially with the amount of history retained, and is potentially infinite. We show how an agent can use one particular MCTS algorithm, Forward Search Sparse Sampling (FSSS), in an efficient way to act nearly Bayes-optimally for all but a polynomial number of steps, assuming that FSSS can be used to act efficiently in any possible underlying MDP.
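To make the planning idea concrete, below is a minimal sketch (not the authors' implementation) of FSSS run directly on a belief-space MDP: each node is a posterior over the unknown dynamics, and taking an action both earns reward and updates that posterior. The toy two-armed Bernoulli-bandit belief, the Beta-count bookkeeping, and all parameter values (GAMMA, DEPTH, WIDTH, TRIALS) are illustrative assumptions, not details taken from the paper.

import random
from collections import defaultdict

GAMMA, DEPTH, WIDTH, TRIALS = 0.95, 3, 5, 200   # assumed planning parameters

class BeliefState:
    """Belief over an unknown 2-armed Bernoulli bandit: a Beta(a, b) per arm."""
    def __init__(self, counts=((1.0, 1.0), (1.0, 1.0))):
        self.counts = counts                      # (pseudo-successes, pseudo-failures) per arm
    def sample_step(self, action):
        a, b = self.counts[action]
        p = random.betavariate(a, b)              # sample the arm's parameter from the posterior
        reward = 1.0 if random.random() < p else 0.0
        new = list(self.counts)
        new[action] = (a + reward, b + 1 - reward)  # posterior update acts as the belief transition
        return BeliefState(tuple(new)), reward
    def key(self):
        return self.counts

ACTIONS = (0, 1)
V_MIN, V_MAX = 0.0, 1.0 / (1.0 - GAMMA)           # loose bounds on discounted return

def fsss(root, depth):
    """Run FSSS trials from `root`; return the root action with the best lower bound."""
    L = defaultdict(lambda: V_MIN)                # lower bounds on node values
    U = defaultdict(lambda: V_MAX)                # upper bounds on node values
    children = {}                                 # WIDTH sampled (next_belief, reward) per (node, action)

    def visit(s, d):
        node = (s.key(), d)
        if d == 0:
            L[node] = U[node] = 0.0
            return
        if node not in children:                  # lazily sample successors on first visit
            children[node] = {a: [s.sample_step(a) for _ in range(WIDTH)]
                              for a in ACTIONS}
        # Descend along the action with the highest upper-bound value ...
        qU = {a: sum(r + GAMMA * U[(ns.key(), d - 1)]
                     for ns, r in children[node][a]) / WIDTH
              for a in ACTIONS}
        a_star = max(ACTIONS, key=lambda a: qU[a])
        # ... and the sampled successor with the widest bound gap (most uncertainty).
        ns, _ = max(children[node][a_star],
                    key=lambda nr: U[(nr[0].key(), d - 1)] - L[(nr[0].key(), d - 1)])
        visit(ns, d - 1)
        # Back up both bounds from the sampled successors.
        for bound in (L, U):
            q = {a: sum(r + GAMMA * bound[(nxt.key(), d - 1)]
                        for nxt, r in children[node][a]) / WIDTH
                 for a in ACTIONS}
            bound[node] = max(q.values())

    for _ in range(TRIALS):
        visit(root, depth)
    root_kids = children[(root.key(), depth)]
    def q_lower(a):
        return sum(r + GAMMA * L[(nxt.key(), depth - 1)] for nxt, r in root_kids[a]) / WIDTH
    return max(ACTIONS, key=q_lower)

if __name__ == "__main__":
    random.seed(0)
    print("FSSS picks arm:", fsss(BeliefState(((5.0, 2.0), (1.0, 1.0))), DEPTH))

Because the "state" here is the posterior itself, the same sparse-sampling machinery that plans in an ordinary MDP also trades off exploration and exploitation: visiting an uncertain arm is valuable in the tree precisely because the resulting belief update can raise future value estimates.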

Related research

03/15/2014 - Near-optimal Reinforcement Learning in Factored MDPs
Any reinforcement learning algorithm that applies to all Markov decision...

03/13/2018 - Active Reinforcement Learning with Monte-Carlo Tree Search
Active Reinforcement Learning (ARL) is a twist on RL where the agent obs...

02/10/2021 - Risk-Averse Bayes-Adaptive Reinforcement Learning
In this work, we address risk-averse Bayes-adaptive reinforcement learnin...

05/14/2012 - Efficient Bayes-Adaptive Reinforcement Learning using Sample-Based Search
Bayesian model-based reinforcement learning is a formally elegant approa...

07/30/2021 - An Extensible and Modular Design and Implementation of Monte Carlo Tree Search for the JVM
Flexible implementations of Monte Carlo Tree Search (MCTS), combined wit...

11/01/2019 - Generalized Mean Estimation in Monte-Carlo Tree Search
We consider Monte-Carlo Tree Search (MCTS) applied to Markov Decision Pr...

01/09/2019 - Robust and Adaptive Planning under Model Uncertainty
Planning under model uncertainty is a fundamental problem across many ap...
