Learning to Embed Probabilistic Structures Between Deterministic Chaos and Random Process in a Variational Bayes Predictive-Coding RNN

11/04/2018
by   Ahmadreza Ahmadi, et al.

This study introduces a stochastic predictive-coding RNN model that can learn to extract probabilistic structures hidden in fluctuating temporal patterns by dynamically changing the uncertainty of its latent variables. The learning process maximizes a lower bound on the marginal likelihood of the sequential data, which consists of two terms: the first is the expected prediction error, and the second is the divergence between the prior and the approximate posterior. The main focus of the current study is to examine how weighting the second term during learning affects how the network internally represents the uncertainty hidden in the sequence data. A simulation experiment on learning a simple probabilistic finite state machine demonstrates that, with high weighting, the estimated uncertainty of the latent variable approaches zero at each time step and the network imitates the probabilistic structure of the target sequences by developing deterministic chaos. By contrast, with low weighting, the uncertainty estimate increases significantly because the network develops a random process. The analysis shows that generalization in learning is most successful between these two extremes. Qualitatively the same property is observed in a trial of learning more complex sequence data, consisting of probabilistic transitions between a set of hand-drawn primitive patterns, using a hierarchical extension of the model.
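The two-term objective described above can be sketched in code. The following is a minimal illustration, not the paper's implementation: it assumes scalar Gaussian latents, and the function names (`gaussian_kl`, `weighted_elbo_loss`) and the name `w` for the weighting factor are illustrative choices.

```python
import math

def gaussian_kl(mu_q, var_q, mu_p, var_p):
    """Closed-form KL( N(mu_q, var_q) || N(mu_p, var_p) ) for scalar Gaussians.

    This is the per-time-step divergence between the approximate
    posterior q and the prior p over a latent variable.
    """
    return 0.5 * (math.log(var_p / var_q)
                  + (var_q + (mu_q - mu_p) ** 2) / var_p
                  - 1.0)

def weighted_elbo_loss(pred_errors, kl_terms, w):
    """Negative lower bound summed over time steps.

    pred_errors: expected prediction error at each time step.
    kl_terms:    KL divergence between prior and approximate
                 posterior at each time step.
    w:           weighting of the KL term; the quantity whose
                 effect the study examines (high w -> uncertainty
                 driven toward zero, low w -> a random process).
    """
    return sum(pred_errors) + w * sum(kl_terms)

# Toy usage: the same prediction errors and KL terms produce a
# larger penalty on the divergence as w grows.
errors = [1.0, 2.0]
kls = [gaussian_kl(1.0, 1.0, 0.0, 1.0), gaussian_kl(0.5, 1.0, 0.0, 1.0)]
loss_low_w = weighted_elbo_loss(errors, kls, 0.1)
loss_high_w = weighted_elbo_loss(errors, kls, 10.0)
```

With a high `w`, minimizing this loss pressures the posterior toward the prior (shrinking the KL terms), which matches the paper's observation that estimated uncertainty approaches zero in that regime.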


