Sparsely Changing Latent States for Prediction and Planning in Partially Observable Domains

10/29/2021
by Christian Gumbsch, et al.

A common approach to prediction and planning in partially observable domains is to use recurrent neural networks (RNNs), which ideally develop and maintain a latent memory about hidden, task-relevant factors. We hypothesize that many of these hidden factors in the physical world are constant over time, changing only sparsely. Accordingly, we propose Gated L_0 Regularized Dynamics (GateL0RD), a novel recurrent architecture that incorporates the inductive bias to maintain stable, sparsely changing latent states. The bias is implemented by means of a novel internal gating function and a penalty on the L_0 norm of latent state changes. We demonstrate that GateL0RD can compete with or outperform state-of-the-art RNNs in a variety of partially observable prediction and control tasks. GateL0RD tends to encode the underlying generative factors of the environment, ignores spurious temporal dependencies, and generalizes better, improving sampling efficiency and prediction accuracy as well as behavior in model-based planning and reinforcement learning tasks. Moreover, we show that the developing latent states can be easily interpreted, which is a step towards better explainability in RNNs.
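To make the core mechanism concrete, the following is a minimal numpy sketch of a GateL0RD-like update step, not the paper's implementation: a binary gate decides per dimension whether a candidate latent state overwrites the previous one, and the number of open gate dimensions approximates the L0 norm of the latent-state change that the training loss penalizes. All names (`gatel0rd_like_step`, `Wg`, `bg`, `Wc`) and the hard Heaviside gate are illustrative assumptions; the actual model uses a stochastic, differentiable gating function so the gate can be trained with gradients.

```python
import numpy as np

rng = np.random.default_rng(0)

def gatel0rd_like_step(h_prev, x, Wg, bg, Wc):
    """One step of a GateL0RD-like cell (simplified sketch).

    A candidate latent state is proposed, but it only overwrites the
    previous latent state where the binary gate opens. The count of
    open gate dimensions approximates ||h_new - h_prev||_0.
    """
    z = np.concatenate([h_prev, x])
    # Hard 0/1 gate via a Heaviside step on the pre-activation.
    # (The paper's model uses a stochastic, differentiable relaxation.)
    g = (Wg @ z + bg > 0.0).astype(float)
    h_cand = np.tanh(Wc @ z)                 # proposed new latent state
    h_new = g * h_cand + (1.0 - g) * h_prev  # sparse, gated update
    return h_new, g.sum()                    # ~ L0 norm of the change

# Toy rollout: with the gate biased toward "closed", most latent
# dimensions stay constant from step to step.
dim_h, dim_x = 4, 3
Wg = rng.normal(0.0, 0.5, (dim_h, dim_h + dim_x))
bg = -1.0 * np.ones(dim_h)                   # bias gates toward closed
Wc = rng.normal(0.0, 0.5, (dim_h, dim_h + dim_x))

h = np.zeros(dim_h)
total_l0 = 0.0
for _ in range(10):
    x = rng.normal(size=dim_x)
    h, l0 = gatel0rd_like_step(h, x, Wg, bg, Wc)
    total_l0 += l0
```

With the negative gate bias, `total_l0` stays well below the maximum of 40 possible updates (4 dimensions over 10 steps), which is the sparsity the L_0 penalty encourages during training.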


Related research:

- Comparison between DeepESNs and gated RNNs on multivariate time-series prediction (12/30/2018)
- Particle Filter Recurrent Neural Networks (05/30/2019)
- Recurrent networks, hidden states and beliefs in partially observable environments (08/06/2022)
- Deep Variational Reinforcement Learning for POMDPs (06/06/2018)
- Predictive-State Decoders: Encoding the Future into Recurrent Networks (09/25/2017)
- Unsupervised Representation Learning in Partially Observable Atari Games (03/13/2023)
- Working Memory Graphs (11/17/2019)
