Downbeat Tracking with Tempo-Invariant Convolutional Neural Networks

02/03/2021
by Bruno Di Giorgi, et al.

The human ability to track musical downbeats is robust to changes in tempo, and it extends to tempi never previously encountered. We propose a deterministic time-warping operation that enables this skill in a convolutional neural network (CNN) by allowing the network to learn rhythmic patterns independently of tempo. Unlike conventional deep learning approaches, which learn rhythmic patterns at the tempi present in the training dataset, the patterns learned in our model are tempo-invariant, leading to better tempo generalisation and more efficient usage of the network capacity. We test the generalisation property on a synthetic dataset created by rendering the Groove MIDI Dataset using FluidSynth, split into a training set containing the original performances and a test set containing tempo-scaled versions rendered with different SoundFonts (test-time augmentation). The proposed model generalises nearly perfectly to unseen tempi (F-measure of 0.89 on both training and test sets), whereas a comparable conventional CNN achieves similar accuracy only for the training set (0.89) and drops to 0.54 on the test set. The generalisation advantage of the proposed model extends to real music, as shown by results on the GTZAN and Ballroom datasets.
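To make the core idea concrete, below is a minimal illustrative sketch, not the authors' implementation, of a deterministic time-warp: an input feature sequence is resampled so that one beat always spans a fixed number of frames, so a downstream CNN sees rhythmic patterns on a tempo-normalised grid. The function name, the 100 fps frame rate, the 16 frames-per-beat target, and the use of linear interpolation are all illustrative assumptions.

# Illustrative sketch of tempo-invariant time-warping (assumed parameters,
# not the paper's actual network operation).
import numpy as np

def tempo_warp(features: np.ndarray, tempo_bpm: float,
               frame_rate: float = 100.0, frames_per_beat: int = 16) -> np.ndarray:
    """Resample a (time, channels) feature matrix so each beat spans `frames_per_beat` frames."""
    n_frames, n_channels = features.shape
    beat_period_frames = frame_rate * 60.0 / tempo_bpm   # frames per beat at the given tempo
    scale = frames_per_beat / beat_period_frames          # warp factor to the normalised grid
    n_out = int(round(n_frames * scale))
    src = np.linspace(0.0, n_frames - 1, n_out)           # sample positions in the original grid
    warped = np.stack(
        [np.interp(src, np.arange(n_frames), features[:, c]) for c in range(n_channels)],
        axis=1,
    )
    return warped

# Example: a 30 s feature sequence at 100 fps with 81 channels, nominal tempo 120 BPM.
x = np.random.rand(3000, 81).astype(np.float32)
x_warped = tempo_warp(x, tempo_bpm=120.0)
print(x_warped.shape)  # (960, 81): 50 frames/beat at 120 BPM, warped to 16 frames/beat

Because the warped grid is the same for every tempo, convolutional filters trained on it can, in principle, be reused at tempi never seen during training, which is the generalisation property the abstract describes.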

