MIDAS at SemEval-2020 Task 10: Emphasis Selection using Label Distribution Learning and Contextual Embeddings

09/06/2020
by Sarthak Anand et al.

This paper presents our submission to SemEval-2020 Task 10 on emphasis selection in written text. We approach emphasis selection as a sequence labeling task in which we represent the underlying text with various contextual embedding models. We also employ label distribution learning to account for annotator disagreements. We experiment with the choice of model architecture, the trainability of layers, and different contextual embeddings. Our best-performing system is an ensemble of these models, which achieved an overall matching score of 0.783, placing us 15th out of 31 participating teams. Lastly, we analyze the results in terms of part-of-speech tags, sentence lengths, and word ordering.


Related research:
07/21/2020

IITK at SemEval-2020 Task 10: Transformers for Emphasis Selection

This paper describes the system proposed for addressing the research pro...
09/08/2020

ERNIE at SemEval-2020 Task 10: Learning Word Emphasis Selection by Pre-trained Language Model

This paper describes the system designed by ERNIE Team which achieved th...
01/10/2021

Cisco at AAAI-CAD21 shared task: Predicting Emphasis in Presentation Slides using Contextualised Embeddings

This paper describes our proposed system for the AAAI-CAD21 shared task:...
10/19/2019

Keyphrase Extraction from Scholarly Articles as Sequence Labeling using Contextualized Embeddings

In this paper, we formulate keyphrase extraction from scholarly articles...
08/29/2021

Sentence Structure and Word Relationship Modeling for Emphasis Selection

Emphasis Selection is a newly proposed task which focuses on choosing wo...
03/30/2021

Locally-Contextual Nonlinear CRFs for Sequence Labeling

Linear chain conditional random fields (CRFs) combined with contextual w...
09/17/2020

More Embeddings, Better Sequence Labelers?

Recent work proposes a family of contextual embeddings that significantl...
