
A-Muze-Net: Music Generation by Composing the Harmony based on the Generated Melody

11/25/2021
by Or Goren, et al.

We present a method for generating MIDI files of piano music. The method models the right and left hands with two networks, where the left hand is conditioned on the right hand, so the melody is generated before the harmony. The MIDI data is represented in a way that is invariant to the musical scale, and the melody is represented, for the purpose of conditioning the harmony, by the content of each bar, viewed as a chord. Finally, notes are added randomly, based on this chord representation, to enrich the generated audio. Our experiments show a significant improvement over the state of the art for training on such datasets and demonstrate the contribution of each novel component.
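The bar-as-chord idea above can be illustrated with a minimal sketch: collect a bar's notes into a set of pitch classes expressed relative to the key's tonic, which makes the representation invariant to transposition. The function name and the tonic-relative encoding here are illustrative assumptions, not the authors' actual implementation.

```python
def bar_to_chord(midi_pitches, tonic):
    """Map a bar's MIDI pitches to a sorted set of pitch classes
    relative to the tonic, so the same chord shape is produced
    regardless of the key it is played in."""
    return sorted({(p - tonic) % 12 for p in midi_pitches})

# A C-major triad (C4, E4, G4) with tonic C (MIDI 60)...
print(bar_to_chord([60, 64, 67], tonic=60))  # [0, 4, 7]
# ...matches a D-major triad (D4, F#4, A4) with tonic D (MIDI 62):
print(bar_to_chord([62, 66, 69], tonic=62))  # [0, 4, 7]
```

A compact chord summary like this could then serve as the conditioning signal for the harmony (left-hand) network, as the abstract describes.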

05/09/2020

Dual-track Music Generation using Deep Learning

Music generation is always interesting in a sense that there is no forma...
12/07/2020

Multi-Instrumentalist Net: Unsupervised Generation of Music from Body Movements

We propose a novel system that takes as an input body movements of a mus...
10/29/2018

Enabling Factorized Piano Music Modeling and Generation with the MAESTRO Dataset

Generating musical audio directly with neural networks is notoriously di...
01/30/2023

SingSong: Generating musical accompaniments from singing

We present SingSong, a system that generates instrumental music to accom...
01/11/2023

WuYun: Exploring hierarchical skeleton-guided melody generation using knowledge-enhanced deep learning

Although deep learning has revolutionized music generation, existing met...
05/21/2021

LoopNet: Musical Loop Synthesis Conditioned On Intuitive Musical Parameters

Loops, seamlessly repeatable musical segments, are a cornerstone of mode...
11/19/2022

EDGE: Editable Dance Generation From Music

Dance is an important human art form, but creating new dances can be dif...