Variational Temporal Abstraction

by Taesup Kim et al.

We introduce a variational approach to learning and inference of temporally hierarchical structure and representation for sequential data. We propose Variational Temporal Abstraction (VTA), a hierarchical recurrent state-space model that can infer the latent temporal structure and thus perform stochastic state transitions hierarchically. We also propose applying this model to implement the jumpy-imagination ability in imagination-augmented agent learning, improving the efficiency of imagination. In experiments, we demonstrate that the proposed method can model 2D and 3D visual sequence datasets with interpretable temporal structure discovery, and that its application to jumpy imagination enables more efficient agent learning in a 3D navigation task.
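The core mechanism behind the hierarchical transition is a binary boundary indicator per time step: when it fires, the abstract (high-level) state is updated; otherwise it is copied verbatim, while the observation-level state is updated at every step conditioned on the abstract state. The sketch below is a minimal illustration of that copy-vs-update dynamic, not the paper's actual networks: the `tanh` maps, weight matrices, and Gaussian noise stand in for the model's learned stochastic transitions, and the boundary sequence is fixed by hand rather than inferred.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 4  # illustrative state dimensionality

# Stand-in transition weights (the real model learns these).
W_s = rng.normal(size=(d, d))       # abstract-state transition
W_z = rng.normal(size=(d, 2 * d))   # low-level transition, conditioned on s

def transition(s, z, m):
    """One step of a two-level recurrent state-space transition.

    m == 1: a subsequence boundary -> sample a new abstract state s.
    m == 0: copy s unchanged (this is the temporal abstraction).
    The observation-level state z is updated at every step, given s.
    """
    if m == 1:
        s = np.tanh(W_s @ s + rng.normal(size=d))
    z = np.tanh(W_z @ np.concatenate([z, s]) + rng.normal(size=d))
    return s, z

s, z = np.zeros(d), np.zeros(d)
boundaries = [1, 0, 0, 1, 0]  # hand-picked boundary indicators m_t
abstract_states = []
for m in boundaries:
    s, z = transition(s, z, m)
    abstract_states.append(s.copy())

# Between boundaries the abstract state is copied exactly,
# so a "jumpy" rollout only needs one step per segment.
assert np.allclose(abstract_states[0], abstract_states[2])
```

Because the abstract state stays constant within a segment, imagination can jump directly from boundary to boundary instead of rolling forward step by step, which is what makes the jumpy imagination in the abstract cheaper than dense rollouts.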




Related papers

- Context-Specific Representation Abstraction for Deep Option Learning — "Hierarchical reinforcement learning has focused on discovering temporall..."
- Hierarchical Recurrent Filtering for Fully Convolutional DenseNets — "Generating a robust representation of the environment is a crucial abili..."
- A temporally abstracted Viterbi algorithm — "Hierarchical problem abstraction, when applicable, may offer exponential..."
- A Group-Theoretic Approach to Abstraction: Hierarchical, Interpretable, and Task-Free Clustering — "Abstraction plays a key role in concept learning and knowledge discovery..."
- Variational Predictive Routing with Nested Subjective Timescales — "Discovery and learning of an underlying spatiotemporal hierarchy in sequ..."
- Deep Variational Bayes Filters: Unsupervised Learning of State Space Models from Raw Data — "We introduce Deep Variational Bayes Filters (DVBF), a new method for uns..."
- Learning deep autoregressive models for hierarchical data — "We propose a model for hierarchical structured data as an extension to t..."
