Myers-Briggs Personality Classification and Personality-Specific Language Generation Using Pre-trained Language Models

07/15/2019
by Sedrick Scott Keh, et al.

The Myers-Briggs Type Indicator (MBTI) is a popular personality metric that uses four dichotomies as indicators of personality traits. This paper examines the use of pre-trained language models to predict MBTI personality types from scraped, labeled texts. The proposed model reaches an accuracy of 0.47 for correctly predicting all four dichotomies and 0.86 for correctly predicting at least two. Furthermore, we investigate possible uses of a fine-tuned BERT model for personality-specific language generation, a task essential both for modern psychology and for building intelligent empathetic systems.
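The abstract does not show an implementation, so the following is a minimal sketch of how such a classifier might be fine-tuned with the Hugging Face transformers library. The checkpoint ("bert-base-uncased"), the example post, and the single-step training loop are illustrative assumptions, not the authors' actual setup.

```python
# Sketch (not the authors' code): fine-tuning BERT as a 16-way MBTI
# classifier over scraped, labeled posts. Checkpoint and data are illustrative.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

MBTI_TYPES = [
    "INTJ", "INTP", "ENTJ", "ENTP", "INFJ", "INFP", "ENFJ", "ENFP",
    "ISTJ", "ISFJ", "ESTJ", "ESFJ", "ISTP", "ISFP", "ESTP", "ESFP",
]

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(MBTI_TYPES)
)
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)

# One labeled example: a scraped post paired with its author's MBTI type.
inputs = tokenizer(
    "I spent the weekend quietly mapping out my five-year plan.",
    return_tensors="pt", truncation=True, max_length=128,
)
labels = torch.tensor([MBTI_TYPES.index("INTJ")])

# A single gradient step; a real run would iterate over a DataLoader
# of many labeled posts for several epochs.
model.train()
loss = model(**inputs, labels=labels).loss
loss.backward()
optimizer.step()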
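```

An alternative formulation, closer to the four-dichotomy framing, trains four binary classifiers (one per axis) instead of a single 16-way head. For the generation side, BERT is not a left-to-right language model, so one common workaround is to fill masked slots iteratively with its masked-LM head; the sketch below illustrates that general technique with a hypothetical prompt, not the paper's specific procedure.

```python
# Sketch: coaxing text out of a (fine-tuned) BERT masked-LM by filling
# [MASK] slots left to right, re-running the model after each fill so
# later predictions condition on earlier ones.
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
mlm = BertForMaskedLM.from_pretrained("bert-base-uncased")

text = "I feel [MASK] when I meet [MASK] people."
ids = tokenizer(text, return_tensors="pt").input_ids
mask_id = tokenizer.mask_token_id

mlm.eval()
with torch.no_grad():
    while (ids == mask_id).any():
        pos = (ids == mask_id).nonzero()[0]  # first remaining [MASK]
        logits = mlm(input_ids=ids).logits
        logits[..., mask_id] = float("-inf")  # never predict [MASK] itself
        ids[pos[0], pos[1]] = logits[pos[0], pos[1]].argmax()

print(tokenizer.decode(ids[0], skip_special_tokens=True))
```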

Related research

05/03/2022 · Predicting Issue Types with seBERT · Pre-trained transformer models are the current state-of-the-art for natu...
05/25/2021 · NukeLM: Pre-Trained and Fine-Tuned Language Models for the Nuclear and Energy Domains · Natural language processing (NLP) tasks (text classification, named enti...
10/13/2022 · Sentence Ambiguity, Grammaticality and Complexity Probes · It is unclear whether, how and where large pre-trained language models c...
03/30/2022 · Position-based Prompting for Health Outcome Generation · Probing Pre-trained Language Models (PLMs) using prompts has indirectly ...
02/02/2022 · Toward a traceable, explainable, and fair JD/Resume recommendation system · In the last few decades, companies are interested to adopt an online aut...
04/21/2023 · KitchenScale: Learning to predict ingredient quantities from recipe contexts · Determining proper quantities for ingredients is an essential part of co...
06/21/2023 · Black-Box Prediction of Flaky Test Fix Categories Using Language Models · Flaky tests are problematic because they non-deterministically pass or f...
