BagBERT: BERT-based bagging-stacking for multi-topic classification

11/10/2021
by Loïc Rakotoson, et al.

This paper describes our submission to the COVID-19 literature annotation task at BioCreative VII. We propose an approach that exploits the knowledge contained in globally non-optimal weights, which are usually discarded, to build a rich representation of each label. The approach consists of two stages: (1) bagging over various initializations of the training data, which yields weakly trained weights, and (2) stacking of heterogeneous-vocabulary models based on BERT and RoBERTa embeddings. The aggregation of these weak insights performs better than a classical, globally well-trained model. The purpose is to distill this rich knowledge into a simpler, lighter model. Our system obtains an instance-based F1 of 92.96 and a label-based micro-F1 of 91.35.
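The two-stage recipe can be sketched as follows: several weakly fine-tuned classifiers (one per data initialization and vocabulary) produce per-label probabilities, and a meta-learner is stacked on top of their concatenated outputs. This is an illustrative sketch only, not the authors' implementation: the base checkpoints (bert-base-uncased, roberta-base), the seven-label setting, the helper names, and the logistic-regression meta-learner are assumptions, and a faithful version would bag many weakly trained checkpoints per initialization.

    # Hypothetical sketch of a bagging-stacking ensemble in the spirit of BagBERT.
    # Assumptions (not from the paper): the checkpoints used, the label count,
    # and the logistic-regression meta-learner.
    import numpy as np
    import torch
    from sklearn.linear_model import LogisticRegression
    from sklearn.multioutput import MultiOutputClassifier
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    NUM_LABELS = 7  # assumed: the LitCovid annotation task defines seven topics

    def load_classifier(checkpoint):
        """Load a (possibly weakly fine-tuned) multi-label classifier."""
        tokenizer = AutoTokenizer.from_pretrained(checkpoint)
        model = AutoModelForSequenceClassification.from_pretrained(
            checkpoint,
            num_labels=NUM_LABELS,
            problem_type="multi_label_classification",
        )
        model.eval()
        return tokenizer, model

    @torch.no_grad()
    def predict_probs(tokenizer, model, texts):
        """Per-label probabilities for a batch of article texts."""
        batch = tokenizer(texts, padding=True, truncation=True,
                          max_length=512, return_tensors="pt")
        logits = model(**batch).logits
        return torch.sigmoid(logits).cpu().numpy()

    def stack_features(prob_list):
        """Concatenate the bagged members' probabilities into meta-features."""
        return np.concatenate(prob_list, axis=1)

    # Bag of heterogeneous-vocabulary members; in practice each would be a
    # weakly trained checkpoint saved from a different data initialization.
    members = [load_classifier(name)
               for name in ("bert-base-uncased", "roberta-base")]

    # Meta-learner stacked on the members' outputs
    # (one binary logistic regression per topic label).
    meta = MultiOutputClassifier(LogisticRegression(max_iter=1000))

    def fit_stacker(train_texts, train_labels):
        feats = stack_features(
            [predict_probs(t, m, train_texts) for t, m in members])
        meta.fit(feats, train_labels)

    def predict(texts):
        feats = stack_features(
            [predict_probs(t, m, texts) for t, m in members])
        return meta.predict(feats)  # binary matrix, one column per topic

Stacking on predicted probabilities rather than averaging them is one plausible reading of the abstract's "aggregation of weak insights"; an averaging or distillation step into a single lighter model would replace the meta-learner in the sketch above.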


research · 04/14/2022
Multi-label topic classification for COVID-19 literature with Bioformer
We describe Bioformer team's participation in the multi-label topic clas...

research · 04/19/2022
LitMC-BERT: transformer-based multi-label classification of biomedical literature with an application on COVID-19 literature curation
The rapid growth of biomedical literature poses a significant challenge ...

research · 08/24/2020
syrapropa at SemEval-2020 Task 11: BERT-based Models Design For Propagandistic Technique and Span Detection
This paper describes the BERT-based models proposed for two subtasks in ...

research · 10/08/2021
Speeding up Deep Model Training by Sharing Weights and Then Unsharing
We propose a simple and efficient approach for training the BERT model. ...

research · 08/24/2020
Two Stages Approach for Tweet Engagement Prediction
This paper describes the approach proposed by the D2KLab team for the 20...

research · 07/23/2019
EmotionX-HSU: Adopting Pre-trained BERT for Emotion Classification
This paper describes our approach to the EmotionX-2019, the shared task ...

research · 04/14/2021
Static Embeddings as Efficient Knowledge Bases?
Recent research investigates factual knowledge stored in large pretraine...
