Predicting Issue Types with seBERT

05/03/2022
by Alexander Trautsch, et al.

Pre-trained transformer models are the current state of the art for natural language processing. seBERT is such a model: it was developed based on the BERT architecture, but trained from scratch with software engineering data. We fine-tuned this model for the NLBSE challenge for the task of issue type prediction. Our model dominates the fastText baseline for all three issue types in both recall and precision, achieving an overall F1-score of 85.7%, an increase of 4.1% over the baseline.
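The abstract describes fine-tuning a BERT-style model for three-class issue type classification. Below is a minimal sketch of such a fine-tuning setup using the Hugging Face transformers Trainer; the checkpoint name, the toy training examples, and the label set (bug, enhancement, question) are illustrative assumptions, not the authors' exact pipeline or the seBERT weights.

# Minimal fine-tuning sketch for issue type prediction with a BERT-style model.
# "bert-base-uncased" is a stand-in checkpoint; the seBERT weights would be
# loaded the same way if available locally or from a model hub.
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import Dataset

LABELS = ["bug", "enhancement", "question"]  # assumed three issue types

# Hypothetical toy data; the challenge itself uses GitHub issue reports.
train = Dataset.from_dict({
    "text": ["App crashes on startup", "Please add dark mode", "How do I configure X?"],
    "label": [0, 1, 2],
})

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=len(LABELS))

def tokenize(batch):
    # Convert issue texts into token IDs with a fixed maximum length.
    return tokenizer(batch["text"], truncation=True, padding="max_length", max_length=128)

train = train.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="issue-type-model",
                           num_train_epochs=1,
                           per_device_train_batch_size=8),
    train_dataset=train,
)
trainer.train()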


Related research

04/10/2021
MIPT-NSU-UTMN at SemEval-2021 Task 5: Ensembling Learning with Pre-trained Language Models for Toxic Spans Detection
This paper describes our system for SemEval-2021 Task 5 on Toxic Spans D...

07/15/2019
Myers-Briggs Personality Classification and Personality-Specific Language Generation Using Pre-trained Language Models
The Myers-Briggs Type Indicator (MBTI) is a popular personality metric t...

08/15/2022
Continuous Active Learning Using Pretrained Transformers
Pre-trained and fine-tuned transformer models like BERT and T5 have impr...

04/05/2022
Multilinguals at SemEval-2022 Task 11: Transformer Based Architecture for Complex NER
We investigate the task of complex NER for the English language. The tas...

03/31/2022
CatIss: An Intelligent Tool for Categorizing Issues Reports using Transformers
Users use Issue Tracking Systems to keep track and manage issue reports ...

08/15/2023
Finding Stakeholder-Material Information from 10-K Reports using Fine-Tuned BERT and LSTM Models
All public companies are required by federal securities law to disclose ...

01/27/2022
Aspect-Based API Review Classification: How Far Can Pre-Trained Transformer Model Go?
APIs (Application Programming Interfaces) are reusable software librarie...
