ERNIE at SemEval-2020 Task 10: Learning Word Emphasis Selection by Pre-trained Language Model

by Zhengjie Huang, et al.

This paper describes the system designed by the ERNIE Team, which achieved first place in SemEval-2020 Task 10: Emphasis Selection for Written Text in Visual Media. Given a sentence, the task is to identify its most important words as suggestions for automated design. We leverage unsupervised pre-trained language models and fine-tune them on this task. In our investigation, the following models achieved excellent performance: ERNIE 2.0, XLM-RoBERTa, RoBERTa, and ALBERT. To fine-tune our models, we combine a pointwise regression loss with a pairwise ranking loss, which is closer to the final Match_m metric. We also find that additional feature engineering and data augmentation help improve performance. Our best model achieves the highest score of 0.823 and ranks first on all metrics.
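The combined objective described in the abstract can be sketched as below. This is a minimal illustration, not the paper's exact formulation: `alpha` and `margin` are hypothetical hyperparameters, and a standard hinge-style margin ranking loss over word pairs stands in for the paper's Match_m-oriented pairwise term.

```python
def combined_emphasis_loss(scores, targets, alpha=0.5, margin=0.1):
    """Combine a pointwise regression loss (MSE) with a pairwise
    ranking loss over word pairs in one sentence.

    scores  : predicted emphasis scores, one per word
    targets : gold emphasis probabilities, one per word
    alpha   : interpolation weight between the two terms
              (hypothetical; not taken from the paper)
    margin  : ranking margin (hypothetical)
    """
    n = len(scores)

    # Pointwise term: mean squared error per word.
    pointwise = sum((s - t) ** 2 for s, t in zip(scores, targets)) / n

    # Pairwise term: for every pair (i, j) where word i should be
    # emphasized more than word j, penalize the model when score_i
    # does not exceed score_j by at least `margin`.
    pairwise, n_pairs = 0.0, 0
    for i in range(n):
        for j in range(n):
            if targets[i] > targets[j]:
                pairwise += max(0.0, margin - (scores[i] - scores[j]))
                n_pairs += 1
    if n_pairs:
        pairwise /= n_pairs

    return alpha * pointwise + (1 - alpha) * pairwise
```

With a correct ranking that respects the margin and exact score matches, both terms vanish; swapping the predicted order of two words makes the pairwise term (and hence the loss) positive.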




IITK at SemEval-2020 Task 10: Transformers for Emphasis Selection

This paper describes the system proposed for addressing the research pro...

IDS at SemEval-2020 Task 10: Does Pre-trained Language Model Know What to Emphasize?

We propose a novel method that enables us to determine words that deserv...

MIDAS at SemEval-2020 Task 10: Emphasis Selection using Label Distribution Learning and Contextual Embeddings

This paper presents our submission to the SemEval 2020 - Task 10 on emph...

Cisco at AAAI-CAD21 shared task: Predicting Emphasis in Presentation Slides using Contextualised Embeddings

This paper describes our proposed system for the AAAI-CAD21 shared task:...

Prompt-based Pre-trained Model for Personality and Interpersonal Reactivity Prediction

This paper describes the LingJing team's method to the Workshop on Compu...

Towards Generalized Models for Task-oriented Dialogue Modeling on Spoken Conversations

Building robust and general dialogue models for spoken conversations is ...