MIDI-Draw: Sketching to Control Melody Generation

05/19/2023
by Tashi Namgyal, et al.

We describe a proof-of-principle implementation of a system for drawing melodies that abstracts away from a note-level input representation via melodic contours. The aim is to allow users to express their musical intentions without requiring prior knowledge of how notes fit together melodiously. Current approaches to controllable melody generation often require users to choose parameters that are static across a whole sequence, via buttons or sliders. In contrast, our method allows users to quickly specify how parameters should change over time by drawing a contour.
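To make the idea concrete, here is a minimal, hypothetical sketch (not the authors' implementation) of how a drawn contour might drive note selection: the contour arrives as normalized height samples in [0, 1], and each sample is quantized to the nearest pitch of a C-major scale over a fixed MIDI range. The scale choice, range, and sampling are illustrative assumptions only.

```python
# Hypothetical sketch: map a hand-drawn contour, sampled as normalized
# heights in [0, 1], to a melody by snapping each sample onto a C-major
# scale within a fixed MIDI pitch range. This illustrates the general
# contour-to-notes idea, not the paper's actual generative model.

C_MAJOR = [0, 2, 4, 5, 7, 9, 11]  # semitone offsets within one octave

def contour_to_midi(contour, low=60, high=84):
    """Map contour samples in [0, 1] to MIDI note numbers on a C-major scale."""
    # Enumerate all in-scale MIDI pitches between low and high (inclusive).
    scale = [p for p in range(low, high + 1) if p % 12 in C_MAJOR]
    notes = []
    for y in contour:
        # Scale the contour height to an index into the pitch set.
        idx = round(y * (len(scale) - 1))
        notes.append(scale[idx])
    return notes

# A rising contour yields a rising, in-scale melody.
melody = contour_to_midi([0.0, 0.25, 0.5, 0.75, 1.0])
```

Because each contour sample maps independently to a pitch, the same mechanism extends to any time-varying parameter (e.g. dynamics or note density), which is the contrast the paper draws with static buttons and sliders.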

Related research

- 08/16/2021: Autocomplete Repetitive Stroking with Image Guidance
  Image-guided drawing can compensate for the lack of skills but often req...

- 08/04/2020: Music SketchNet: Controllable Music Generation via Factorized Representations of Pitch and Rhythm
  Drawing an analogy with automatic image completion systems, we propose M...

- 04/26/2021: dualFace: Two-Stage Drawing Guidance for Freehand Portrait Sketching
  In this paper, we propose dualFace, a portrait drawing interface to assi...

- 04/07/2022: Expressive Singing Synthesis Using Local Style Token and Dual-path Pitch Encoder
  This paper proposes a controllable singing voice synthesis system capabl...

- 01/02/2018: Deep Learning for Identifying Potential Conceptual Shifts for Co-creative Drawing
  We present a system for identifying conceptual shifts between visual cat...

- 07/14/2023: Flow-Guided Controllable Line Drawing Generation
  In this paper, we investigate the problem of automatically controllable ...

- 10/10/2020: Drawing with AI – Exploring Collaborative Inking Experiences Based on Mid-air Pointing and Reinforcement Learning
  Digitalization is changing the nature of tools and materials, which are ...
