Musical Word Embedding: Bridging the Gap between Listening Contexts and Music

07/23/2020
by Seungheon Doh, et al.

Word embedding, pioneered by Mikolov et al., is a staple technique for word representation in natural language processing (NLP) research that has also found popularity in music information retrieval tasks. Depending on the type of text data used for training, however, the vocabulary size and the degree of musical pertinence can vary significantly. In this work, we (1) train distributed representations of words using combinations of both general text data and music-specific data and (2) evaluate the resulting embeddings in terms of how well they associate listening contexts with musical compositions.
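The pipeline sketched in the abstract can be illustrated in a few lines of code. The example below is only a minimal illustration, not the authors' implementation: it trains a skip-gram Word2Vec model (via gensim) on a mix of general and music-specific text, then ranks candidate music tags by cosine similarity to a listening-context query. The corpus file names, the query string, and the tag list are assumptions made for the sake of the example.

```python
# Minimal sketch (not the paper's released code): train word2vec-style
# embeddings on combined general + music-specific text, then score how
# closely a listening-context query lies to candidate music tags.
# File names, the query, and the tag list below are hypothetical.
import numpy as np
from gensim.models import Word2Vec
from gensim.utils import simple_preprocess

def load_sentences(path):
    """Read one document per line and tokenize into lowercase word lists."""
    with open(path, encoding="utf-8") as f:
        return [simple_preprocess(line) for line in f if line.strip()]

# Combine a general corpus (e.g. encyclopedia-style text) with
# music-specific text (e.g. playlist titles, tags, reviews).
sentences = (load_sentences("general_text.txt")
             + load_sentences("music_text.txt"))

model = Word2Vec(
    sentences,
    vector_size=300,   # embedding dimensionality
    window=5,          # context window size
    min_count=5,       # drop rare words
    sg=1,              # skip-gram, as in Mikolov et al.
    workers=4,
)

def embed(words):
    """Average the word vectors of in-vocabulary tokens."""
    vecs = [model.wv[w] for w in words if w in model.wv]
    return np.mean(vecs, axis=0) if vecs else None

# Associate a listening-context query with music tags by cosine similarity.
query = simple_preprocess("music for studying on a rainy night")
tags = ["jazz", "piano", "lofi", "metal", "dance"]

q_vec = embed(query)
if q_vec is not None:
    scores = {t: float(model.wv.cosine_similarities(q_vec, [model.wv[t]])[0])
              for t in tags if t in model.wv}
    for tag, score in sorted(scores.items(), key=lambda kv: -kv[1]):
        print(f"{tag}\t{score:.3f}")
```

In this sketch the balance between the two corpora is what the paper varies; tags trained only on general text would tend to score listening contexts less reliably than tags trained with music-specific data mixed in.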

