Evaluating language models of tonal harmony

06/22/2018
by David R. W. Sears, et al.

This study borrows and extends probabilistic language models from natural language processing to discover the syntactic properties of tonal harmony. Language models come in many shapes and sizes, but their central purpose is always the same: to predict the next event in a sequence of letters, words, notes, or chords. However, few studies employing such models have evaluated state-of-the-art architectures on a large-scale corpus of Western tonal music, instead preferring relatively small datasets containing chord annotations from contemporary genres like jazz, pop, and rock. Using symbolic representations of prominent instrumental genres from the common-practice period, this study applies a flexible, data-driven encoding scheme to (1) evaluate Finite Context (or n-gram) models and Recurrent Neural Networks (RNNs) in a chord prediction task; (2) compare the predictive accuracy of the best-performing models on chord onsets from each of the selected datasets; and (3) explain the differences between the two model architectures in a regression analysis. We find that Finite Context models using the Prediction by Partial Match (PPM) algorithm outperform RNNs, especially on the piano datasets, with the regression model suggesting that RNNs struggle with particularly rare chord types.
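
To illustrate the finite-context side of the comparison, the sketch below implements a toy n-gram chord predictor with a naive back-off to shorter contexts. It is a simplification for illustration only: the paper's models use the Prediction by Partial Match (PPM) algorithm, whose escape-probability blending is more sophisticated than the simple fallback shown here, and the `NGramChordModel` class and Roman-numeral chord labels are hypothetical.

```python
from collections import defaultdict, Counter

class NGramChordModel:
    """Toy finite-context (n-gram) chord predictor with simple back-off.

    Illustrative sketch only; not the PPM scheme evaluated in the paper.
    """

    def __init__(self, order=3):
        self.order = order                  # maximum context length
        self.counts = defaultdict(Counter)  # context tuple -> next-chord counts

    def fit(self, chord_sequences):
        """Count next-chord continuations for every context up to self.order."""
        for seq in chord_sequences:
            for i in range(len(seq)):
                for k in range(self.order + 1):
                    if i - k < 0:
                        break
                    context = tuple(seq[i - k:i])
                    self.counts[context][seq[i]] += 1

    def predict(self, context):
        """Return the most likely next chord, backing off to shorter contexts."""
        context = tuple(context[-self.order:])
        while True:
            if self.counts[context]:
                return self.counts[context].most_common(1)[0][0]
            if not context:          # empty context with no counts: untrained model
                return None
            context = context[1:]    # back off to a shorter context


# Hypothetical usage with Roman-numeral chord labels:
model = NGramChordModel(order=2)
model.fit([["I", "IV", "V", "I"], ["I", "vi", "IV", "V", "I"]])
print(model.predict(["IV", "V"]))   # -> "I"
```

In the study itself, both model families are evaluated on their ability to predict chord onsets in the selected common-practice corpora, so a full implementation would report cross-entropy or predictive accuracy rather than a single most-likely chord.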
