
Direct Classification of Emotional Intensity

by Jacob Ouyang, et al.

In this paper, we present a model that directly predicts an emotion intensity score from video input, rather than deriving it from action units. Using a 3D DNN that incorporates dynamic emotion information, we train a model on videos of different people smiling that outputs an intensity score from 0 to 10. Each video is labeled framewise with a normalized action-unit-based intensity score. Our model then employs an adaptive learning technique to improve performance on new subjects. Compared to other models, ours generalizes better across different people and provides a new framework for directly classifying emotional intensity.
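The framewise labeling step above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes per-frame action-unit intensities on the standard FACS 0–5 scale and normalizes their mean into the paper's 0–10 score range; the function name and the averaging choice are hypothetical.

```python
def normalized_intensity(au_intensities, max_au=5.0):
    """Map a frame's action-unit intensities (FACS 0-5 scale assumed)
    to a normalized 0-10 emotion intensity score.

    au_intensities: list of per-AU intensity values for one frame.
    max_au: assumed maximum of the raw AU scale (5.0 for FACS).
    """
    if not au_intensities:
        return 0.0  # no active AUs: minimum intensity
    mean = sum(au_intensities) / len(au_intensities)
    return round(10.0 * mean / max_au, 2)
```

Applied per frame, this yields the framewise 0–10 targets the network regresses against; any other monotone normalization (e.g., max instead of mean over AUs) would fit the same training setup.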


