Decoding and Visualising Intended Emotion in an Expressive Piano Performance

03/03/2023
by Shreyan Chowdhury et al.

Expert musicians can mould a musical piece to convey specific emotions that they intend to communicate. In this paper, we place a mid-level feature-based music emotion model in this performer-to-listener communication scenario, and demonstrate real-time music emotion decoding via a small visualisation. We also extend the existing set of mid-level features with analogues of perceptual speed and perceived dynamics.
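The abstract does not specify how the perceptual-speed and perceived-dynamics analogues are computed. A minimal sketch, using plausible stand-ins (frame-wise RMS loudness for perceived dynamics, and spectral-flux onset density for perceptual speed), might look as follows; all function names, frame sizes, and thresholds here are illustrative assumptions, not the paper's method:

```python
import numpy as np

def frame_signal(x, frame_len=2048, hop=512):
    # Slice a mono signal into overlapping frames (rows).
    n_frames = 1 + (len(x) - frame_len) // hop
    idx = np.arange(frame_len)[None, :] + hop * np.arange(n_frames)[:, None]
    return x[idx]

def perceived_dynamics_proxy(x, frame_len=2048, hop=512):
    # Frame-wise RMS energy in dB: a crude stand-in for perceived dynamics.
    frames = frame_signal(x, frame_len, hop)
    rms = np.sqrt(np.mean(frames ** 2, axis=1))
    return 20.0 * np.log10(rms + 1e-10)

def perceptual_speed_proxy(x, sr, frame_len=2048, hop=512):
    # Spectral-flux onset density (detected onsets per second) as a
    # stand-in for perceptual speed.
    frames = frame_signal(x, frame_len, hop)
    mag = np.abs(np.fft.rfft(frames * np.hanning(frame_len), axis=1))
    flux = np.maximum(mag[1:] - mag[:-1], 0.0).sum(axis=1)
    thresh = flux.mean() + flux.std()
    # A frame counts as an onset if it is a local flux maximum above threshold.
    peaks = (flux[1:-1] > flux[:-2]) & (flux[1:-1] > flux[2:]) & (flux[1:-1] > thresh)
    return peaks.sum() / (len(x) / sr)

# Toy input: 3 s of a 440 Hz tone gated on/off at 4 Hz, 44.1 kHz.
sr = 44100
t = np.arange(3 * sr) / sr
env = (np.sin(2 * np.pi * 4 * t) > 0).astype(float)
x = env * np.sin(2 * np.pi * 440 * t)

dyn = perceived_dynamics_proxy(x)       # dB curve over frames
speed = perceptual_speed_proxy(x, sr)   # onsets per second
```

In a real-time setting, the same two proxies would be computed over a sliding window of the incoming audio stream rather than over the whole recording.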

