Working Memory Graphs

11/17/2019
by Ricky Loynd, et al.

Transformers have increasingly outperformed gated RNNs in obtaining new state-of-the-art results on supervised tasks involving text sequences. Inspired by this trend, we study the question of how Transformer-based models can improve the performance of sequential decision-making agents. We present the Working Memory Graph (WMG), an agent that employs multi-head self-attention to reason over a dynamic set of vectors representing observed and recurrent state. We evaluate WMG in two partially observable environments, one that requires complex reasoning over past observations, and another that features factored observations. We find that WMG significantly outperforms gated RNNs on these tasks, supporting the hypothesis that WMG's inductive bias in favor of learning and leveraging factored representations can dramatically boost sample efficiency in environments featuring such structure.
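
The abstract describes multi-head self-attention applied to a mixed set of recurrent ("core") and observation-derived ("factor") vectors. The following is a minimal PyTorch sketch of what one such step might look like; it is not the authors' implementation, and the class name `WMGSketch`, the single encoder layer, and all hyperparameters (`d_model`, `num_core`, etc.) are illustrative assumptions.

```python
import torch
import torch.nn as nn

class WMGSketch(nn.Module):
    """Illustrative sketch: self-attention over recurrent + factored-observation vectors."""

    def __init__(self, factor_dim, num_actions, d_model=64, num_heads=4, num_core=8):
        super().__init__()
        self.num_core = num_core
        # Learned initial "core" vectors: the recurrent state carried across steps.
        self.core_init = nn.Parameter(0.02 * torch.randn(num_core, d_model))
        # Project each factored-observation vector into the shared model dimension.
        self.factor_proj = nn.Linear(factor_dim, d_model)
        # One multi-head self-attention block (attention + feed-forward).
        self.encoder = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=num_heads,
            dim_feedforward=4 * d_model, batch_first=True)
        # Actor-critic heads read the first updated core vector.
        self.policy = nn.Linear(d_model, num_actions)
        self.value = nn.Linear(d_model, 1)

    def forward(self, factors, core=None):
        # factors: (batch, num_factors, factor_dim), one vector per observed entity
        # core:    (batch, num_core, d_model), or None at the start of an episode
        if core is None:
            core = self.core_init.unsqueeze(0).expand(factors.size(0), -1, -1)
        # Self-attention reasons jointly over the recurrent and observed vectors.
        tokens = torch.cat([core, self.factor_proj(factors)], dim=1)
        out = self.encoder(tokens)
        new_core = out[:, :self.num_core]   # updated recurrent state for the next step
        logits = self.policy(new_core[:, 0])
        value = self.value(new_core[:, 0]).squeeze(-1)
        return logits, value, new_core

# Example: two parallel environments, five observed factors of dimension 16 each.
model = WMGSketch(factor_dim=16, num_actions=4)
obs = torch.randn(2, 5, 16)
logits, value, core = model(obs)          # episode start: fresh core state
logits, value, core = model(obs, core)    # later steps reuse the recurrent state
```

The paper's architecture maintains its recurrent memory more elaborately (e.g., storing past core states rather than a single fixed-size set); this sketch collapses that into one recurrent set of vectors for brevity.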

Related research:

10/13/2019 · Stabilizing Transformers for Reinforcement Learning
Owing to their ability to both effectively integrate information over lo...

07/05/2023 · Facing off World Model Backbones: RNNs, Transformers, and S4
World models are a fundamental component in model-based reinforcement le...

03/20/2018 · GaAN: Gated Attention Networks for Learning on Large and Spatiotemporal Graphs
We propose a new network architecture, Gated Attention Networks (GaAN), ...

07/10/2018 · Universal Transformers
Self-attentive feed-forward sequence models have been shown to achieve i...

07/09/2020 · Attention or memory? Neurointerpretable agents in space and time
In neuroscience, attention has been shown to bidirectionally interact wi...

10/29/2021 · Sparsely Changing Latent States for Prediction and Planning in Partially Observable Domains
A common approach to prediction and planning in partially observable dom...

01/17/2020 · A Robust Model of Gated Working Memory
Gated working memory is defined as the capacity of holding arbitrary inf...
