Playing Minecraft with Behavioural Cloning

05/07/2020
by Anssi Kanervisto, et al.

The MineRL 2019 competition challenged participants to train sample-efficient agents to play Minecraft, using a dataset of human gameplay and a limited number of environment interactions. We approached this task with behavioural cloning, training a model to predict the actions human players would take, and reached fifth place in the final ranking. Despite being a simple algorithm, we observed that the performance of such an approach can vary significantly depending on when training is stopped. In this paper, we detail our submission to the competition, run further experiments to study how performance varied over the course of training, and examine how different engineering decisions affected these results.
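The sketch below illustrates the general idea of behavioural cloning as a supervised-learning problem, not the authors' actual submission: a small CNN policy is trained with cross-entropy loss to imitate the discrete actions recorded in human demonstrations. The network shapes, action count, and dummy data batch are assumptions for illustration; the real MineRL observation and action spaces are richer.

```python
# Minimal behavioural-cloning sketch (illustrative, not the competition entry).
# Assumes 64x64 RGB observations and a small discrete action set.
import torch
import torch.nn as nn
import torch.nn.functional as F


class BCPolicy(nn.Module):
    """Small CNN mapping an image observation to action logits."""

    def __init__(self, num_actions: int):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=8, stride=4), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=4, stride=2), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3, stride=1), nn.ReLU(),
            nn.Flatten(),
        )
        self.head = nn.Sequential(
            nn.Linear(64 * 4 * 4, 256), nn.ReLU(),
            nn.Linear(256, num_actions),
        )

    def forward(self, obs: torch.Tensor) -> torch.Tensor:
        return self.head(self.conv(obs))


def train_step(policy, optimizer, obs_batch, action_batch):
    """One supervised update: predict the human action, minimise cross-entropy."""
    logits = policy(obs_batch)
    loss = F.cross_entropy(logits, action_batch)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()


if __name__ == "__main__":
    # Dummy stand-in for a batch of human demonstrations:
    # 32 frames of shape (3, 64, 64) and the discrete actions the humans took.
    policy = BCPolicy(num_actions=10)
    optimizer = torch.optim.Adam(policy.parameters(), lr=1e-4)
    obs = torch.rand(32, 3, 64, 64)
    actions = torch.randint(0, 10, (32,))
    print("loss:", train_step(policy, optimizer, obs, actions))
```

In this framing, the only signal comes from the demonstration dataset; no environment reward is used during training, which is what makes stopping time such an influential choice.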

