Representation Learning of Music Using Artist, Album, and Track Information

06/27/2019
by Jongpil Lee, et al.

Supervised music representation learning has mainly relied on semantic labels such as music genres. However, annotating music with semantic labels is time-consuming and costly. In this work, we investigate the use of factual metadata such as artist, album, and track information, which is naturally attached to songs, for supervised music representation learning. The results show that each type of metadata captures distinct conceptual characteristics, and using them jointly improves overall performance.
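The abstract does not specify the architecture, but the idea of using artist, album, and track labels jointly can be sketched as a shared audio encoder with one classification head per metadata type, trained on the sum of the three classification losses. The sketch below is an illustrative assumption in PyTorch: the encoder layers, class counts, and input shape are placeholders, not the authors' exact model.

```python
# Minimal sketch (illustrative, not the authors' exact model): a shared audio
# encoder with three classification heads, one per metadata type, trained jointly.
import torch
import torch.nn as nn

class MetadataSupervisedModel(nn.Module):
    def __init__(self, n_artists, n_albums, n_tracks, emb_dim=256):
        super().__init__()
        # Shared encoder over log-mel spectrogram patches (batch, 1, mel, time).
        self.encoder = nn.Sequential(
            nn.Conv2d(1, 32, 3, padding=1), nn.BatchNorm2d(32), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.BatchNorm2d(64), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(64, emb_dim), nn.ReLU(),
        )
        # One linear head per metadata label space.
        self.artist_head = nn.Linear(emb_dim, n_artists)
        self.album_head = nn.Linear(emb_dim, n_albums)
        self.track_head = nn.Linear(emb_dim, n_tracks)

    def forward(self, x):
        z = self.encoder(x)  # z is the learned music representation
        return self.artist_head(z), self.album_head(z), self.track_head(z), z

def joint_loss(outputs, artist_y, album_y, track_y):
    # Joint objective: sum of the three cross-entropy losses.
    artist_logits, album_logits, track_logits, _ = outputs
    ce = nn.functional.cross_entropy
    return ce(artist_logits, artist_y) + ce(album_logits, album_y) + ce(track_logits, track_y)

# Example forward pass with dummy data (class counts are arbitrary assumptions).
model = MetadataSupervisedModel(n_artists=500, n_albums=2000, n_tracks=10000)
x = torch.randn(8, 1, 96, 128)
loss = joint_loss(model(x),
                  torch.randint(0, 500, (8,)),
                  torch.randint(0, 2000, (8,)),
                  torch.randint(0, 10000, (8,)))
```

After training on such a joint objective, the shared embedding `z` would serve as the music representation for downstream tasks, which is the setting the abstract describes.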

