How do Mixture Density RNNs Predict the Future?

01/23/2019
by Kai Olav Ellefsen, et al.

Gaining a better understanding of how and what machine learning systems learn is important to increase confidence in their decisions and catalyze further research. In this paper, we analyze the predictions made by a specific type of recurrent neural network, mixture density RNNs (MD-RNNs). These networks learn to model predictions as a combination of multiple Gaussian distributions, making them particularly interesting for problems where a sequence of inputs may lead to several distinct future possibilities. An example is learning internal models of an environment, where different events may or may not occur, but where the average over different events is not meaningful. By analyzing the predictions made by trained MD-RNNs, we find that their different Gaussian components have two complementary roles: 1) separately modeling different stochastic events, and 2) separately modeling scenarios governed by different rules. These findings increase our understanding of what predictive MD-RNNs learn, and open up new research directions for understanding how to benefit from their self-organizing model decomposition.
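To make the architecture discussed here concrete, the following is a minimal sketch of a mixture density RNN, assuming PyTorch; the class and function names (MDRNN, mdn_loss) and all layer sizes are illustrative, not the authors' implementation. An RNN encodes the input sequence, and a mixture density head maps each hidden state to the parameters of K Gaussian components (mixture weights, means, standard deviations) over the next observation; training minimizes the negative log-likelihood of the target under the mixture.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class MDRNN(nn.Module):
    """LSTM with a mixture density output head (illustrative sketch)."""
    def __init__(self, input_dim, hidden_dim, output_dim, num_mixtures):
        super().__init__()
        self.rnn = nn.LSTM(input_dim, hidden_dim, batch_first=True)
        self.num_mixtures = num_mixtures
        self.output_dim = output_dim
        # One linear head producing, for each of the K components:
        # a mixture weight logit, a mean vector, and a log-std vector.
        self.head = nn.Linear(hidden_dim, num_mixtures * (1 + 2 * output_dim))

    def forward(self, x, hidden=None):
        h, hidden = self.rnn(x, hidden)                 # (B, T, hidden_dim)
        params = self.head(h)
        K, D = self.num_mixtures, self.output_dim
        pi_logits, mu, log_sigma = params.split([K, K * D, K * D], dim=-1)
        pi = F.softmax(pi_logits, dim=-1)               # mixture weights (B, T, K)
        mu = mu.view(*mu.shape[:-1], K, D)              # component means (B, T, K, D)
        sigma = torch.exp(log_sigma).view(*mu.shape[:-2], K, D)  # component std devs
        return pi, mu, sigma, hidden

def mdn_loss(pi, mu, sigma, target):
    """Negative log-likelihood of target under the Gaussian mixture."""
    target = target.unsqueeze(-2)                       # (B, T, 1, D)
    log_prob = torch.distributions.Normal(mu, sigma).log_prob(target).sum(-1)
    log_mix = torch.logsumexp(torch.log(pi + 1e-8) + log_prob, dim=-1)
    return -log_mix.mean()

# Usage: next-step prediction for a 4-dimensional observation sequence
# with 5 mixture components (all sizes here are placeholders).
model = MDRNN(input_dim=4, hidden_dim=64, output_dim=4, num_mixtures=5)
x = torch.randn(8, 20, 4)   # batch of 8 sequences, 20 time steps
y = torch.randn(8, 20, 4)   # next-step targets
pi, mu, sigma, _ = model(x)
loss = mdn_loss(pi, mu, sigma, y)
loss.backward()
```

Because each time step yields a full mixture rather than a single mean, distinct components can absorb distinct future outcomes, which is the behavior the paper's analysis examines.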
