NONOTO: A Model-agnostic Web Interface for Interactive Music Composition by Inpainting

07/23/2019
by Théis Bazin et al.

Inpainting-based generative modeling enables stimulating human-machine interaction by letting users make stylistically coherent local edits to an object with the help of a statistical model. We present NONOTO, a new interface for interactive music generation based on inpainting models. It is aimed both at researchers, by offering a simple and flexible API that lets them connect their own models to the interface, and at musicians, by providing industry-standard features such as audio playback, real-time MIDI output, and straightforward synchronization with DAWs via Ableton Link.
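To make the inpainting workflow concrete, here is a minimal sketch of the kind of model-side function a researcher might expose to such an interface: notes inside a user-selected time region are removed and the model regenerates them from the surrounding context. The names (`Note`, `DummyModel`, `inpaint_region`) are hypothetical illustrations, not NONOTO's actual API, and the model is a trivial placeholder.

```python
# Illustrative sketch only; all names here are hypothetical, not NONOTO's API.
from dataclasses import dataclass
from typing import List

@dataclass
class Note:
    pitch: int       # MIDI pitch, 0-127
    start: float     # onset time in beats
    duration: float  # length in beats

class DummyModel:
    """Stand-in for a trained inpainting model: fills the masked time
    region with a repeated placeholder pitch (middle C)."""
    def generate(self, context: List[Note], start: float, end: float) -> List[Note]:
        t, notes = start, []
        while t < end:
            notes.append(Note(pitch=60, start=t, duration=1.0))
            t += 1.0
        return notes

def inpaint_region(sequence: List[Note], start: float, end: float,
                   model: DummyModel) -> List[Note]:
    """Drop notes whose onset falls in [start, end) and ask the model to
    regenerate that region, keeping the surrounding context intact."""
    context = [n for n in sequence if not (start <= n.start < end)]
    filled = model.generate(context, start, end)
    return sorted(context + filled, key=lambda n: n.start)

# A four-note sequence; the middle two notes get inpainted.
seq = [Note(64, 0.0, 1.0), Note(65, 1.0, 1.0),
       Note(67, 2.0, 1.0), Note(69, 3.0, 1.0)]
result = inpaint_region(seq, 1.0, 3.0, DummyModel())
```

In a real deployment the interface would send the selected region and context to the model server (e.g. over HTTP) and render the returned notes, while playback stays synchronized with the DAW.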


Related research

- The Piano Inpainting Application (07/13/2021): Autoregressive models are now capable of generating high-quality minute-...
- Spectrogram Inpainting for Interactive Generation of Instrument Sounds (04/15/2021): Modern approaches to sound synthesis using deep neural networks are hard...
- Guidefill: GPU Accelerated, Artist Guided Geometric Inpainting for 3D Conversion (11/16/2016): The conversion of traditional film into stereo 3D has become an importan...
- VampNet: Music Generation via Masked Acoustic Token Modeling (07/10/2023): We introduce VampNet, a masked acoustic token modeling approach to music...
- Neural Drum Machine: An Interactive System for Real-time Synthesis of Drum Sounds (07/04/2019): In this work, we introduce a system for real-time generation of drum sou...
- Vision-Infused Deep Audio Inpainting (10/24/2019): Multi-modality perception is essential to develop interactive intelligen...
- The Accordiatron: A MIDI Controller For Interactive Music (10/04/2020): The Accordiatron is a new MIDI controller for real-time performance base...
