LSTM vs. GRU vs. Bidirectional RNN for script generation

08/12/2019
by   Sanidhya Mangal, et al.

Scripts are an important part of any TV series: they narrate the movements, actions, and expressions of the characters. In this paper, a case study is presented on how different sequence-to-sequence deep learning models perform at generating new conversations between characters, as well as new scenarios, on the basis of a script (previous conversations). A comprehensive comparison of these models, namely LSTM, GRU, and Bidirectional RNN, is presented. All the models are designed to learn the sequence of recurring characters from the input sequence. Each input sequence contains, say, "n" characters, and the corresponding target contains the same number of characters, shifted one character to the right. Input and target sequences generated in this manner are used to train the models. A closer analysis of the explored models' performance and efficiency is presented with the help of graph plots and of texts generated from sample input strings. These graphs describe both the internal performance of each model and the performance across models.
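The input/target construction described above can be sketched in a few lines of plain Python. This is a minimal illustration, not the authors' code: the function name `make_pairs` and the sample string are my own, and a real pipeline would additionally map characters to integer indices before feeding them to an LSTM, GRU, or Bidirectional RNN.

```python
def make_pairs(text, n):
    """Slide a window of n characters over the text, pairing each
    input window with the same window shifted one character right."""
    pairs = []
    for i in range(len(text) - n):
        inp = text[i:i + n]          # n input characters
        tgt = text[i + 1:i + n + 1]  # same length, shifted right by one
        pairs.append((inp, tgt))
    return pairs

# Hypothetical snippet of a script used as training text.
script = "PICARD: Make it so."
for inp, tgt in make_pairs(script, 8)[:2]:
    print(repr(inp), "->", repr(tgt))
```

At each position the model sees n characters and is trained to predict the next character at every step, which is what "shifted one character to the right" amounts to in practice.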

