Polyphonic Music Generation by Modeling Temporal Dependencies Using a RNN-DBN

12/26/2014
by Kratarth Goel, et al.

In this paper, we propose a generic technique for modeling temporal dependencies and sequences using a combination of a recurrent neural network (RNN) and a Deep Belief Network (DBN). Our technique, the RNN-DBN, combines the memory state of the RNN, which supplies temporal information, with a multi-layer DBN, which provides a high-level representation of the data. This makes RNN-DBNs well suited to sequence generation. Further, using a DBN in conjunction with the RNN allows the model to represent significantly more complex data than a restricted Boltzmann machine (RBM) can. We apply this technique to the task of polyphonic music generation.

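To make the architecture concrete, below is a minimal NumPy sketch of the generative pass described above, in the spirit of the RNN-RBM family of models: the RNN's memory state supplies a time-dependent bias to a two-layer DBN, which then samples the notes for each time step via a short Gibbs chain. All names, layer sizes, and the sampling scheme are illustrative assumptions rather than the authors' exact implementation, and the weights are untrained, so the output is noise rather than music.

```python
# Illustrative RNN-DBN generative pass (assumed structure, untrained weights).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample(p):
    return (rng.random(p.shape) < p).astype(float)

n_visible, n_hid1, n_hid2, n_rnn = 88, 150, 100, 100  # 88-key piano roll

# DBN: two stacked RBM layers over the visible note vector at one time step.
W1 = 0.01 * rng.standard_normal((n_visible, n_hid1))
W2 = 0.01 * rng.standard_normal((n_hid1, n_hid2))
c1 = np.zeros(n_hid1)
c2 = np.zeros(n_hid2)

# RNN that carries the temporal context and emits the visible bias b_v(t).
Wuu = 0.01 * rng.standard_normal((n_rnn, n_rnn))
Wvu = 0.01 * rng.standard_normal((n_visible, n_rnn))
Wub = 0.01 * rng.standard_normal((n_rnn, n_visible))
b_u = np.zeros(n_rnn)

def dbn_gibbs_step(v, b_v):
    """One up-down pass through the two-layer DBN, conditioned on bias b_v."""
    h1 = sample(sigmoid(v @ W1 + c1))
    h2 = sample(sigmoid(h1 @ W2 + c2))
    h1_down = sample(sigmoid(h2 @ W2.T + c1))
    return sample(sigmoid(h1_down @ W1.T + b_v))

def generate(n_steps=32, n_gibbs=25):
    """Sample an (n_steps, 88) piano roll from the untrained model."""
    u = np.zeros(n_rnn)          # RNN memory state
    v = np.zeros(n_visible)      # previously generated time step
    roll = []
    for _ in range(n_steps):
        b_v = u @ Wub            # temporal information -> DBN visible bias
        for _ in range(n_gibbs): # Gibbs chain yields the notes for step t
            v = dbn_gibbs_step(v, b_v)
        u = np.tanh(b_u + v @ Wvu + u @ Wuu)  # update memory with v_t
        roll.append(v)
    return np.stack(roll)

if __name__ == "__main__":
    print(generate().shape)  # (32, 88)
```

In a trained model the RBM layers would be pre-trained greedily and the RNN parameters fit to the sequence likelihood; the sketch only shows how the memory state and the DBN interact at generation time.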
Related research

09/19/2017 · Interactive Music Generation with Positional Constraints using Anticipation-RNNs
Recurrent Neural Networks (RNNs) are now widely used on sequence generat...

06/27/2012 · Modeling Temporal Dependencies in High-Dimensional Sequences: Application to Polyphonic Music Generation and Transcription
We investigate the problem of modeling symbolic sequences of polyphonic ...

12/18/2014 · Learning Temporal Dependencies in Data Using a DBN-BLSTM
Since the advent of deep learning, it has been used to solve various pro...

05/23/2016 · Generative Choreography using Deep Learning
Recent advances in deep learning have enabled the extraction of high-lev...

11/25/2018 · Deep RNN Framework for Visual Sequential Applications
Extracting temporal and representation features efficiently plays a pivo...

11/18/2019 · Action Anticipation with RBF Kernelized Feature Mapping RNN
We introduce a novel Recurrent Neural Network-based algorithm for future...
