Facing Off World Model Backbones: RNNs, Transformers, and S4

07/05/2023
by Fei Deng, et al.

World models are a fundamental component in model-based reinforcement learning (MBRL) agents. To perform temporally extended and consistent simulations of the future in partially observable environments, world models need to possess long-term memory. However, state-of-the-art MBRL agents, such as Dreamer, predominantly employ recurrent neural networks (RNNs) as their world model backbone, which have limited memory capacity. In this paper, we seek to explore alternative world model backbones for improving long-term memory. In particular, we investigate the effectiveness of Transformers and Structured State Space Sequence (S4) models, motivated by their remarkable ability to capture long-range dependencies in low-dimensional sequences and their complementary strengths. We propose S4WM, the first S4-based world model that can generate high-dimensional image sequences through latent imagination. Furthermore, we extensively compare RNN-, Transformer-, and S4-based world models across four sets of environments, which we have specifically tailored to assess crucial memory capabilities of world models, including long-term imagination, context-dependent recall, reward prediction, and memory-based reasoning. Our findings demonstrate that S4WM outperforms Transformer-based world models in terms of long-term memory, while exhibiting greater efficiency during training and imagination. These results pave the way for the development of stronger MBRL agents.
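
The abstract itself contains no code, so the following is only a rough, hypothetical sketch of what "latent imagination" with a swappable sequence backbone involves; it is not the authors' S4WM implementation, and every name in it (`ToyS4Layer`, `ToyRNNLayer`, `imagine`, the dimensions) is an illustrative assumption. It contrasts a toy diagonal state-space recurrence (standing in for S4) with a toy RNN cell (standing in for the GRU-style backbone used by Dreamer-like agents), both rolled forward purely in latent space.

```python
# Hypothetical sketch, not the paper's code: a minimal "latent imagination" loop
# shared by recurrent-style world-model backbones. A real world model would also
# condition on actions, sample stochastic latents, and decode each latent to an image.
import numpy as np

rng = np.random.default_rng(0)
LATENT, HIDDEN = 16, 32  # illustrative sizes, not taken from the paper


class ToyS4Layer:
    """Diagonal linear state space: x_t = A * x_{t-1} + B u_t, y_t = C x_t.
    Real S4 parameterizes A carefully (HiPPO-based initialization) and uses an
    equivalent convolutional view for parallel training; this toy keeps only
    the recurrent view used at imagination time."""

    def __init__(self):
        self.A = np.exp(-rng.uniform(0.01, 0.5, HIDDEN))      # stable per-dimension decay
        self.B = rng.normal(0.0, 0.1, (HIDDEN, LATENT))
        self.C = rng.normal(0.0, 0.1, (LATENT, HIDDEN))
        self.x = np.zeros(HIDDEN)

    def step(self, u):
        self.x = self.A * self.x + self.B @ u
        return self.C @ self.x


class ToyRNNLayer:
    """Single tanh recurrence as a stand-in for a GRU-style RNN backbone."""

    def __init__(self):
        self.Wh = rng.normal(0.0, 0.1, (HIDDEN, HIDDEN))
        self.Wu = rng.normal(0.0, 0.1, (HIDDEN, LATENT))
        self.Wo = rng.normal(0.0, 0.1, (LATENT, HIDDEN))
        self.h = np.zeros(HIDDEN)

    def step(self, u):
        self.h = np.tanh(self.Wh @ self.h + self.Wu @ u)
        return self.Wo @ self.h


def imagine(backbone, z0, horizon):
    """Roll the backbone forward in latent space, feeding each predicted
    latent back in as the next input (no environment observations)."""
    z, trajectory = z0, []
    for _ in range(horizon):
        z = backbone.step(z)       # predict the next latent from the compact state
        trajectory.append(z)       # a decoder would map each latent to an image
    return np.stack(trajectory)


if __name__ == "__main__":
    z0 = rng.normal(size=LATENT)   # latent encoding of the last real observation
    for backbone in (ToyS4Layer(), ToyRNNLayer()):
        rollout = imagine(backbone, z0, horizon=50)
        print(type(backbone).__name__, rollout.shape)
```

One reason this framing matters for the efficiency claim in the abstract: both the state-space and RNN backbones carry a fixed-size state and take O(1) work per imagined step, whereas a Transformer backbone must attend over the entire imagined history at every step.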


