Automated Analysis and Prediction of Job Interview Performance

04/14/2015
by Iftekhar Naim, et al.

We present a computational framework for automatically quantifying verbal and nonverbal behaviors in the context of job interviews. The proposed framework is trained by analyzing the videos of 138 interview sessions with 69 internship-seeking undergraduates at the Massachusetts Institute of Technology (MIT). Our automated analysis includes facial expressions (e.g., smiles, head gestures, facial tracking points), language (e.g., word counts, topic modeling), and prosodic information (e.g., pitch, intonation, and pauses) of the interviewees. The ground truth labels are derived by taking a weighted average over the ratings of 9 independent judges. Our framework can automatically predict the ratings for interview traits such as excitement, friendliness, and engagement with correlation coefficients of 0.75 or higher, and can quantify the relative importance of prosody, language, and facial expressions. By analyzing the relative feature weights learned by the regression models, our framework recommends speaking more fluently, using fewer filler words, speaking as "we" (vs. "I"), using more unique words, and smiling more. We also find that the students who were rated highly while answering the first interview question were also rated highly overall (i.e., first impression matters). Finally, our MIT Interview dataset will be made available to other researchers to further validate and expand our findings.
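The pipeline the abstract describes (multimodal behavioral features, a regression model fit to judge ratings, and feature-weight inspection to derive recommendations) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the feature names, synthetic data, and ridge penalty are all assumptions made for the example.

```python
import numpy as np

# Hypothetical multimodal features per interviewee, standing in for the
# paper's prosodic, lexical, and facial measures (names are illustrative).
feature_names = ["fluency", "filler_ratio", "we_vs_I", "unique_words", "smiles"]

rng = np.random.default_rng(0)
n = 138  # number of interview sessions in the MIT dataset
X = rng.normal(size=(n, len(feature_names)))

# Synthetic "ground truth" ratings: in the paper these are a weighted
# average over 9 judges; here we simulate them as a noisy linear signal.
true_w = np.array([0.8, -0.6, 0.4, 0.5, 0.7])
y = X @ true_w + 0.3 * rng.normal(size=n)

# Ridge regression (L2-regularized least squares), closed form:
#   w = (X^T X + lambda * I)^{-1} X^T y
lam = 1.0
w = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]), X.T @ y)

# Predict ratings and measure agreement as a correlation coefficient,
# the evaluation metric quoted in the abstract.
y_hat = X @ w
r = np.corrcoef(y, y_hat)[0, 1]
print(f"prediction-rating correlation: {r:.2f}")

# Relative feature weights suggest which behaviors to change, e.g. a
# negative weight on filler words means "use fewer fillers".
for name, weight in zip(feature_names, w):
    print(f"{name:>12}: {weight:+.2f}")
```

Inspecting the signs and magnitudes of `w` is what turns a predictive model into behavioral recommendations, which is the step the abstract highlights.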


