Learning Musical Relations using Gated Autoencoders

08/17/2017
by Stefan Lattner, et al.

Music is usually highly structured, and it is still an open question how to design models that can successfully learn to recognize and represent musical structure. A fundamental problem is that structurally related patterns can have very distinct appearances, because structural relationships are often based on transformations of musical material, such as chromatic or diatonic transposition, inversion, retrograde, or rhythm change. In this preliminary work, we study the potential of two unsupervised learning techniques, Restricted Boltzmann Machines (RBMs) and Gated Autoencoders (GAEs), to capture pre-defined transformations from constructed data pairs. We evaluate the models by using the learned representations as inputs in a discriminative task where, for a given type of transformation (e.g. diatonic transposition), the specific relation between two musical patterns must be recognized (e.g. an upward transposition by a certain number of diatonic steps). Furthermore, we measure the reconstruction error of the models when reconstructing transformed musical patterns. Lastly, we test the models in an analogy-making task. We find that it is difficult to learn musical transformations with the RBM and that the GAE is much better suited to this task, since it is able to learn representations of specific transformations that are largely content-invariant. We believe these results show that models such as GAEs may provide the basis for more encompassing music analysis systems, by endowing them with a better understanding of the structures underlying music.

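The core idea behind the GAE is that a mapping code is computed from multiplicative interactions between two inputs, so the code captures the transformation relating the pair rather than the content of either pattern. The sketch below illustrates this idea in PyTorch; it is not the authors' implementation, and the layer sizes, the 88-dimensional piano-roll encoding, and the toy "transposition by circular shift" pairs are illustrative assumptions.

```python
import torch
import torch.nn as nn

class GatedAutoencoder(nn.Module):
    """Relates two input vectors x and y through a multiplicative mapping code."""
    def __init__(self, n_in, n_factors=256, n_maps=128):
        super().__init__()
        # Factor projections for the source pattern x and the target pattern y.
        self.wx = nn.Linear(n_in, n_factors, bias=False)
        self.wy = nn.Linear(n_in, n_factors, bias=False)
        # Mapping layer: encodes the relation (e.g. a transposition) between x and y.
        self.wm = nn.Linear(n_factors, n_maps)

    def mapping(self, x, y):
        # The element-wise product makes the code depend on how x and y relate,
        # rather than on the content of either pattern alone.
        return torch.sigmoid(self.wm(self.wx(x) * self.wy(y)))

    def reconstruct_y(self, x, m):
        # Apply a mapping code m to x to predict y (decoding with tied weights).
        factors = (m @ self.wm.weight) * self.wx(x)
        return factors @ self.wy.weight

    def forward(self, x, y):
        return self.reconstruct_y(x, self.mapping(x, y))

# Toy usage: binary "piano-roll" vectors and a crude transposition by 2 steps
# (illustrative data only, not the constructed pairs used in the paper).
model = GatedAutoencoder(n_in=88)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.rand(32, 88).round()
y = torch.roll(x, shifts=2, dims=1)
for _ in range(200):
    optimizer.zero_grad()
    loss = ((model(x, y) - y) ** 2).mean()
    loss.backward()
    optimizer.step()
```

Under these assumptions, the reconstruction error of the predicted y and the usefulness of the mapping code as input to a classifier correspond to the two evaluation settings described in the abstract; analogy-making amounts to computing a mapping code from one pair and applying it to a new source pattern.
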

Related research

01/06/2020 · Modeling Musical Structure with Artificial Neural Networks
In recent years, artificial neural networks (ANNs) have become a univers...

01/26/2022 · Understanding and Compressing Music with Maximal Transformable Patterns
We present a polynomial-time algorithm that discovers all maximal patter...

06/14/2017 · Learning and Evaluating Musical Features with Deep Autoencoders
In this work we describe and evaluate methods to learn musical embedding...

08/02/2019 · High-Level Control of Drum Track Generation Using Learned Patterns of Rhythmic Interaction
Spurred by the potential of deep learning, computational music generatio...

08/22/2018 · Predicting Musical Sophistication from Music Listening Behaviors: A Preliminary Study
Psychological models are increasingly being used to explain online behav...

12/14/2016 · Imposing higher-level Structure in Polyphonic Music Generation using Convolutional Restricted Boltzmann Machines and Constraints
We introduce a method for imposing higher-level structure on generated, ...

11/12/2019 · Using musical relationships between chord labels in automatic chord extraction tasks
Recent researches on Automatic Chord Extraction (ACE) have focused on th...
