Neural Network architectures to classify emotions in Indian Classical Music

02/01/2021
by   Uddalok Sarkar, et al.

Music is often considered the language of emotions. It has long been known to elicit emotions in human beings, and categorizing music by the emotions it induces is therefore an intriguing topic of research. Classifying emotions elicited by Indian Classical Music (ICM) is especially challenging because of the inherent ambiguity of ICM: it is implicit in the nature of ICM renditions that a single performance can evoke a variety of emotional responses in the audience. With rapid advances in Deep Learning, Music Emotion Recognition (MER) is becoming increasingly relevant and robust, and can therefore be applied to one of its most challenging test cases: classifying emotions elicited by ICM. In this paper we present a new dataset, JUMusEmoDB, which presently contains 400 audio clips (30 seconds each), of which 200 correspond to the happy emotion and the remaining 200 to the sad emotion. For supervised classification, we used four existing deep Convolutional Neural Network (CNN) architectures (ResNet18, MobileNet v2.0, SqueezeNet v1.0 and VGG16) on the spectrograms of the 2000 sub-clips (every clip was segmented into 5 sub-clips of about 5 seconds each), which contain both time- and frequency-domain information. The initial results are encouraging, and we look forward to setting baseline values for the dataset using these architectures. A CNN-based classification algorithm applied to a rich corpus of Indian Classical Music is unique even from a global perspective and can be replicated for other modalities of music. The dataset is still under development; we plan to include more data covering other emotional categories, and we plan to make the dataset publicly available soon.
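The preprocessing described above (segmenting each clip into equal-length sub-clips and converting each sub-clip to a log-magnitude spectrogram suitable for a CNN) can be sketched as follows. This is a minimal illustration, not the authors' code: the sample rate, FFT size, and hop length are assumptions the abstract does not specify, and a synthetic waveform stands in for a real ICM recording. Segments are taken as 6 seconds here so that a 30-second clip yields the 5 sub-clips mentioned in the abstract.

```python
import numpy as np

SR = 44100        # assumed sample rate (not stated in the abstract)
CLIP_SEC = 30     # clip length per the abstract
SUB_SEC = 6       # 30 s / 5 sub-clips = 6 s each ("about 5 seconds" in the text)

def segment(wave, sr=SR, sub_sec=SUB_SEC):
    """Split a waveform into equal-length sub-clips, dropping any remainder."""
    n = sr * sub_sec
    k = len(wave) // n
    return wave[: k * n].reshape(k, n)

def log_spectrogram(x, n_fft=1024, hop=512):
    """Log-magnitude STFT over Hann-windowed frames; rows = frequency bins,
    columns = time frames, so the array carries both domains of information."""
    win = np.hanning(n_fft)
    frames = [x[i : i + n_fft] * win for i in range(0, len(x) - n_fft + 1, hop)]
    mag = np.abs(np.fft.rfft(np.stack(frames), axis=1)).T  # (freq, time)
    return 20 * np.log10(mag + 1e-10)

# Synthetic 30 s "clip" in place of a real recording from the dataset.
wave = np.random.default_rng(0).standard_normal(SR * CLIP_SEC)
subs = segment(wave)               # 5 sub-clips of 6 s each
spec = log_spectrogram(subs[0])    # one spectrogram per sub-clip
print(subs.shape, spec.shape)      # → (5, 264600) (513, 515)
```

In a full pipeline, each spectrogram would then be resized to the input resolution of the chosen backbone (e.g. ResNet18 or VGG16) and the network's final layer replaced with a two-way happy/sad classifier.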

Related research

- Multilabel Automated Recognition of Emotions Induced Through Music (05/29/2019)
- CNN based music emotion classification (04/19/2017)
- Emotion4MIDI: a Lyrics-based Emotion-Labeled Symbolic Music Dataset (07/27/2023)
- Mood Classification of Bangla Songs Based on Lyrics (07/19/2023)
- Musical Prosody-Driven Emotion Classification: Interpreting Vocalists Portrayal of Emotions Through Machine Learning (06/04/2021)
- Emotion-Guided Music Accompaniment Generation Based on Variational Autoencoder (07/08/2023)
- Affect Recognition in Ads with Application to Computational Advertising (09/06/2017)
