Towards No.1 in CLUE Semantic Matching Challenge: Pre-trained Language Model Erlangshen with Propensity-Corrected Loss

08/05/2022
by Junjie Wang, et al.

This report describes Erlangshen, a pre-trained language model with a propensity-corrected loss, which ranked No. 1 in the CLUE Semantic Matching Challenge. In the pre-training stage, we apply a knowledge-based dynamic masking strategy to Masked Language Modeling (MLM) with whole word masking. Furthermore, motivated by the specific structure of the dataset, we fine-tune the pre-trained Erlangshen with a propensity-corrected loss (PCL). Overall, we achieve an F1 score of 72.54 and an accuracy of 78.90 on the test set. Our code is publicly available at: https://github.com/IDEA-CCNL/Fengshenbang-LM/tree/hf-ds/fengshen/examples/clue_sim.
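The abstract names two techniques without giving details: knowledge-based dynamic whole word masking in pre-training, and a propensity-corrected loss in fine-tuning. The sketches below illustrate one plausible reading of each; the function names, the segmentation source, and the weighting scheme are assumptions for illustration, not the authors' released implementation (see the linked repository for that).

A minimal sketch of dynamic whole word masking: every sub-token of a selected word is masked together, and the selection is re-drawn each time an example is seen, so the masks vary across epochs. The `word_ids` mapping is assumed to come from a word segmenter; the report's knowledge-based strategy would additionally restrict which words are eligible for masking.

```python
import random

def whole_word_mask(tokens, word_ids, mask_token="[MASK]", mask_prob=0.15):
    """Mask all sub-tokens of randomly chosen whole words.

    tokens:   sub-token strings for one example
    word_ids: index of the word each sub-token belongs to
              (e.g. from a Chinese word segmenter)
    """
    words = sorted(set(word_ids))
    # A fresh draw on every call makes the masking dynamic across epochs.
    chosen = {w for w in words if random.random() < mask_prob}
    return [mask_token if wid in chosen else tok
            for tok, wid in zip(tokens, word_ids)]
```

For PCL, a common form of propensity correction is to reweight the cross-entropy by inverse class propensity, so that over-represented labels contribute proportionally less to the loss; whether the report uses exactly this form is an assumption here.

```python
import torch
import torch.nn.functional as F

def propensity_corrected_loss(logits, targets, propensities):
    """Cross-entropy with inverse-propensity class weights (a hypothetical
    reading of PCL): rarer classes receive proportionally larger weight.

    propensities: empirical label frequencies over the training set.
    """
    weights = 1.0 / propensities
    weights = weights * (len(propensities) / weights.sum())  # mean weight 1
    return F.cross_entropy(logits, targets, weight=weights)

# Usage on a skewed 3-way matching task (made-up label frequencies):
props = torch.tensor([0.6, 0.3, 0.1])
logits, targets = torch.randn(8, 3), torch.randint(0, 3, (8,))
loss = propensity_corrected_loss(logits, targets, props)
```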


