Multilabel Automated Recognition of Emotions Induced Through Music

05/29/2019
by   Fabio Paolizzo, et al.

Advancing the automatic recognition of emotions that music can induce requires accounting for the multiplicity and simultaneity of those emotions. The core of our work is a comparison of different machine learning algorithms performing multilabel and multiclass classification. The study analyzes the implementation of the Geneva Emotional Music Scale 9 in the Emotify music dataset and the distribution of its data. The research goal is to identify the best methods for defining the audio component of a new multimodal dataset for music emotion recognition.
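The multilabel setting the abstract describes can be illustrated with a minimal sketch: one binary classifier per emotion, so a track may carry any subset of the nine GEMS-9 labels at once, unlike multiclass classification, which assigns exactly one label. The features and targets below are synthetic placeholders, not the Emotify dataset, and the choice of logistic regression is an assumption for illustration, not the authors' method.

```python
# Minimal multilabel sketch with a GEMS-9-style target of 9 emotion labels.
# Synthetic stand-in data; not the Emotify dataset or the paper's algorithms.
import numpy as np
from sklearn.multioutput import MultiOutputClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import hamming_loss

rng = np.random.default_rng(0)
n_tracks, n_features, n_emotions = 200, 20, 9  # 9 GEMS categories

X = rng.normal(size=(n_tracks, n_features))   # placeholder audio features
W = rng.normal(size=(n_features, n_emotions))
Y = (X @ W > 0.5).astype(int)                 # 0/1 indicator per emotion

# One binary classifier per label: each track can induce several emotions
# simultaneously, so predictions are 9-dimensional binary vectors.
clf = MultiOutputClassifier(LogisticRegression(max_iter=1000)).fit(X, Y)
pred = clf.predict(X)
print("Hamming loss:", round(hamming_loss(Y, pred), 3))
```

Hamming loss is a common multilabel metric: the fraction of individual label decisions (here, track-emotion pairs) that are wrong, rather than requiring the whole 9-label vector to match.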
