Random Actions vs Random Policies: Bootstrapping Model-Based Direct Policy Search

10/21/2022
by Elias Hanna, et al.

This paper studies the impact of the initial data-gathering method on the subsequent learning of a dynamics model. Dynamics models approximate the true transition function of a given task so that policy search can be performed directly on the model rather than on the costly real system. This study aims to determine how to bootstrap a model as efficiently as possible by comparing the initialization methods employed in two different policy search frameworks from the literature. It focuses on model performance under the episode-based framework of evolutionary methods, using probabilistic ensembles. Experimental results show that various task-dependent factors can be detrimental to each method, suggesting that hybrid approaches should be explored.
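To make the two bootstrapping strategies concrete, the sketch below contrasts them on a toy one-dimensional system: "random actions" samples an independent action at every timestep, while "random policies" draws one fixed, randomly parameterized policy per episode and follows it. The environment, the linear policy form, and all parameter names are illustrative assumptions, not the paper's actual benchmarks or method.

```python
import numpy as np

class ToyEnv:
    """Toy 1-D stand-in for the costly real system (illustrative only)."""
    def __init__(self):
        self.state = 0.0

    def reset(self):
        self.state = 0.0
        return self.state

    def step(self, action):
        # Simple damped linear dynamics: the "true transition function".
        self.state = 0.9 * self.state + 0.1 * float(action)
        return self.state

def rollout_random_actions(env, horizon, rng):
    """Bootstrap data by sampling an i.i.d. action at every step."""
    s = env.reset()
    data = []
    for _ in range(horizon):
        a = rng.uniform(-1.0, 1.0)          # fresh random action each step
        s_next = env.step(a)
        data.append((s, a, s_next))
        s = s_next
    return data

def rollout_random_policy(env, horizon, rng):
    """Bootstrap data with one randomly drawn, fixed policy per episode."""
    w, b = rng.normal(), rng.normal()       # policy parameters drawn once
    s = env.reset()
    data = []
    for _ in range(horizon):
        a = np.tanh(w * s + b)              # deterministic policy, random weights
        s_next = env.step(a)
        data.append((s, a, s_next))
        s = s_next
    return data
```

Either dataset of `(state, action, next_state)` transitions would then be used to fit the dynamics model (e.g. a probabilistic ensemble) before policy search begins; the paper's comparison concerns which dataset yields a more useful model.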

Related research:

- 09/09/2019, Gradient-Aware Model-based Policy Search: "Traditional model-based reinforcement learning approaches learn a model ..."
- 10/26/2021, Learning Robust Controllers Via Probabilistic Model-Based Policy Search: "Model-based Reinforcement Learning estimates the true environment throug..."
- 08/29/2019, A Queuing Approach to Parking: Modeling, Verification, and Prediction: "We present a queuing model of parking dynamics and a model-based predict..."
- 05/23/2016, Learning and Policy Search in Stochastic Dynamical Systems with Bayesian Neural Networks: "We present an algorithm for model-based reinforcement learning that comb..."
- 11/15/2018, Woulda, Coulda, Shoulda: Counterfactually-Guided Policy Search: "Learning policies on data synthesized by models can in principle quench ..."
- 03/09/2021, A model-based framework for learning transparent swarm behaviors: "This paper proposes a model-based framework to automatically and efficie..."
- 07/05/2018, A Boo(n) for Evaluating Architecture Performance: "We point out important problems with the common practice of using the be..."
