tensorflow-rnn-shakespeare
Code from the "Tensorflow and deep learning - without a PhD, Part 2" session on Recurrent Neural Networks.
We present a simple regularization technique for Recurrent Neural Networks (RNNs) with Long Short-Term Memory (LSTM) units. Dropout, the most successful technique for regularizing neural networks, does not work well when naively applied to the recurrent connections of RNNs and LSTMs. In this paper, we show how to correctly apply dropout to LSTMs and demonstrate that it substantially reduces overfitting on a variety of tasks, including language modeling, speech recognition, image caption generation, and machine translation.
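A minimal sketch of the paper's prescription in TensorFlow (Keras): dropout is applied only to the non-recurrent, layer-to-layer connections of a stacked LSTM, while the recurrent timestep-to-timestep path is left intact. The layer widths and dropout rate below are illustrative assumptions, not the paper's exact settings.

import tensorflow as tf

VOCAB_SIZE = 10000   # hypothetical vocabulary size
EMBED_DIM = 128      # hypothetical embedding width
HIDDEN = 256         # hypothetical LSTM width
DROP_RATE = 0.5      # illustrative dropout rate

model = tf.keras.Sequential([
    tf.keras.layers.Embedding(VOCAB_SIZE, EMBED_DIM),
    tf.keras.layers.Dropout(DROP_RATE),            # dropout on the LSTM inputs
    tf.keras.layers.LSTM(HIDDEN, return_sequences=True,
                         recurrent_dropout=0.0),   # recurrent connections untouched
    tf.keras.layers.Dropout(DROP_RATE),            # dropout between stacked layers
    tf.keras.layers.LSTM(HIDDEN, return_sequences=True,
                         recurrent_dropout=0.0),
    tf.keras.layers.Dropout(DROP_RATE),            # dropout before the output layer
    tf.keras.layers.Dense(VOCAB_SIZE),             # logits over the vocabulary
])

Keeping recurrent_dropout at 0.0 is the key point: dropping recurrent connections would perturb the memory the LSTM carries across timesteps, which is what the paper identifies as the reason naive dropout fails.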
A couple of scripts illustrating how to implement CNNs and RNNs in PyTorch.
This repository provides scripts to train an LSTM and then extract states from it in TensorFlow.
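A hedged sketch of what extracting LSTM states can look like in TensorFlow's Keras API; the repository's actual scripts may differ. The input width, hidden size, and dummy batch are assumptions for illustration.

import tensorflow as tf

inputs = tf.keras.Input(shape=(None, 32))           # (timesteps, features); widths are illustrative
seq, h, c = tf.keras.layers.LSTM(
    64, return_sequences=True, return_state=True)(inputs)
model = tf.keras.Model(inputs, [seq, h, c])

x = tf.random.normal((4, 10, 32))                   # dummy batch of 4 sequences, 10 steps each
per_step, final_h, final_c = model(x)
print(per_step.shape, final_h.shape, final_c.shape) # (4, 10, 64) (4, 64) (4, 64)

return_sequences=True exposes the hidden state at every timestep, while return_state=True additionally returns the final hidden and cell states, which is the usual route for inspecting or reusing LSTM states.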
Solving the Question-Answering Problem Using Deep Learning