Pushing on Personality Detection from Verbal Behavior: A Transformer Meets Text Contours of Psycholinguistic Features

04/10/2022
by Elma Kerz, et al.

Research at the intersection of personality psychology, computer science, and linguistics has recently focused increasingly on modeling and predicting personality from language use. We report two major improvements in predicting personality traits from text data: (1) to our knowledge, the most comprehensive set of theory-based psycholinguistic features and (2) hybrid models that integrate a pre-trained Transformer language model (BERT) with Bidirectional Long Short-Term Memory (BLSTM) networks trained on within-text distributions ('text contours') of psycholinguistic features. We experiment with BLSTM models (with and without attention) and with two techniques for applying pre-trained language representations from the Transformer model: 'feature-based' and 'fine-tuning'. We evaluate the performance of our models on two benchmark datasets that target the two dominant theoretical models of personality: the Big Five Essay dataset and the MBTI Kaggle dataset. Our results are encouraging, as our models outperform existing work on the same datasets; specifically, they improve classification accuracy by 2.9%. In addition, we perform ablation experiments to quantify the impact of different categories of psycholinguistic features in the respective personality prediction models.
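The full hybrid architecture cannot be reconstructed from the abstract alone, but the core 'text contour' idea — tracking psycholinguistic feature values across successive stretches of a text and pooling the resulting sequence with attention — can be sketched. The equal-size windowing and the simple dot-product attention below are illustrative assumptions, not the authors' exact formulation:

```python
import numpy as np

def text_contour(feature_values, n_windows=4):
    """Split a per-sentence feature series into n_windows chunks and
    average within each chunk, yielding a fixed-length 'contour' that
    captures how the feature varies across the text."""
    chunks = np.array_split(np.asarray(feature_values, dtype=float), n_windows)
    return np.array([c.mean() for c in chunks])

def attention_pool(contours, w):
    """Pool a (T, d) contour matrix into a single d-vector by weighting
    each timestep with softmax(contours @ w) -- a dot-product attention
    of the kind often placed on top of BLSTM hidden states."""
    scores = contours @ w
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ contours

# Toy example: two hypothetical feature series (e.g., mean word length and
# clause density) measured sentence by sentence over a 12-sentence text.
series = np.random.default_rng(0).normal(size=(2, 12))
contour = np.stack([text_contour(s) for s in series], axis=1)  # shape (4, 2)
doc_vector = attention_pool(contour, w=np.ones(2))             # shape (2,)
```

In the paper's hybrid setup, a document vector of this kind would be combined with the BERT-derived representation (feature-based or fine-tuned) before the final personality classifier; that fusion step is not specified in the abstract and is omitted here.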


