LSTMVis: A Tool for Visual Analysis of Hidden State Dynamics in Recurrent Neural Networks

06/23/2016
by Hendrik Strobelt, et al.

Recurrent neural networks, and in particular long short-term memory (LSTM) networks, are a remarkably effective tool for sequence modeling that learn a dense black-box hidden representation of their sequential input. Researchers interested in better understanding these models have studied the changes in hidden state representations over time and noticed some interpretable patterns but also significant noise. In this work, we present LSTMVis, a visual analysis tool for recurrent neural networks with a focus on understanding these hidden state dynamics. The tool allows users to select a hypothesis input range to focus on local state changes, to match these state changes to similar patterns in a large data set, and to align these results with structural annotations from their domain. We show several use cases of the tool for analyzing specific hidden state properties on data sets containing nesting, phrase structure, and chord progressions, and demonstrate how the tool can be used to isolate patterns for further statistical analysis. We characterize the domain, the different stakeholders, and their goals and tasks.
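The select-and-match workflow described in the abstract can be sketched in a few lines of code. The snippet below is a minimal, illustrative example only; it uses PyTorch on a toy untrained LSTM and does not reflect LSTMVis's actual implementation or API. It treats hidden units that stay above a chosen threshold over a user-selected range as the hypothesis, then scans the rest of the sequence for time steps where the same units are active. The model, thresholds, and variable names are all hypothetical.

```python
# Illustrative sketch (not the LSTMVis API): extract LSTM hidden states and
# find other positions in a sequence whose "active" hidden units match a
# user-selected range, mirroring the select-and-match idea described above.
import torch
import torch.nn as nn

torch.manual_seed(0)

# Toy setup: random inputs and an untrained LSTM stand in for a real model.
seq_len, input_size, hidden_size = 200, 16, 32
lstm = nn.LSTM(input_size, hidden_size, batch_first=True)
inputs = torch.randn(1, seq_len, input_size)

with torch.no_grad():
    states, _ = lstm(inputs)          # shape: (1, seq_len, hidden_size)
states = states.squeeze(0)            # shape: (seq_len, hidden_size)

# 1) Hypothesis selection: hidden units that stay above a threshold
#    throughout a user-chosen input range (values here are arbitrary).
start, end, threshold = 40, 43, 0.1
selected_units = (states[start:end] > threshold).all(dim=0).nonzero().flatten()

# 2) Matching: time steps where the same units are simultaneously active.
#    With an untrained model the selection may be empty, in which case every
#    step matches trivially; in practice one works with trained states.
active = states[:, selected_units] > threshold        # (seq_len, n_selected)
matches = active.all(dim=1).nonzero().flatten().tolist()

print("selected hidden units:", selected_units.tolist())
print("matching time steps:", matches)
```

In the tool itself this kind of matching is done interactively over precomputed hidden state trajectories; the sketch is only meant to show the shape of the computation.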

Related research

01/12/2017  Simplified Gating in Long Short-term Memory (LSTM) Recurrent Neural Networks
The standard LSTM recurrent neural networks while very powerful in long-...

05/22/2018  State-Denoised Recurrent Neural Networks
Recurrent neural networks (RNNs) are difficult to train on sequence proc...

11/18/2016  Increasing the Interpretability of Recurrent Neural Networks Using Hidden Markov Models
As deep neural networks continue to revolutionize various application do...

04/25/2015  Differential Recurrent Neural Networks for Action Recognition
The long short-term memory (LSTM) neural network is capable of processin...

11/28/2017  Visualisation and 'diagnostic classifiers' reveal how recurrent and recursive neural networks process hierarchical structure
We investigate how neural networks can learn and process languages with ...

03/08/2021  PyRCN: Exploration and Application of ESNs
As a family member of Recurrent Neural Networks and similar to Long-Shor...

02/19/2019  Understanding and Controlling Memory in Recurrent Neural Networks
To be effective in sequential data processing, Recurrent Neural Networks...
