-
Investigating the Successes and Failures of BERT for Passage Re-Ranking
The bidirectional encoder representations from transformers (BERT) model...
-
LT@Helsinki at SemEval-2020 Task 12: Multilingual or language-specific BERT?
This paper presents the different models submitted by the LT@Helsinki te...
-
Understanding BERT performance in propaganda analysis
In this paper, we describe our system used in the shared task for fine-g...
-
Look Again at the Syntax: Relational Graph Convolutional Network for Gendered Ambiguous Pronoun Resolution
Gender bias has been found in existing coreference resolvers. In order t...
-
Evaluating robustness of language models for chief complaint extraction from patient-generated text
Automated classification of chief complaints from patient-generated text...
-
Fast and Accurate Deep Bidirectional Language Representations for Unsupervised Learning
Even though BERT achieves successful performance improvements in various...
-
ColBERT: Using BERT Sentence Embedding for Humor Detection
Automatic humor detection has interesting use cases in modern technologi...
-
DSC IIT-ISM at SemEval-2020 Task 6: Boosting BERT with Dependencies for Definition Extraction
We explore the performance of Bidirectional Encoder Representations from Transformers (BERT) on definition extraction. We further propose a joint model of BERT and a Text-Level Graph Convolutional Network that incorporates syntactic dependencies into the model. Our proposed model outperforms BERT alone and achieves results comparable to BERT with a fine-tuned language model on DeftEval (Task 6 of SemEval 2020), a shared task of classifying whether or not a sentence contains a definition (Subtask 1).
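The core idea in the abstract above, feeding contextual token embeddings through a graph convolution over dependency arcs, can be sketched as follows. This is a minimal illustration with NumPy, not the authors' implementation: the embeddings, adjacency matrix, and weight shapes are all hypothetical stand-ins (in the paper, token representations would come from BERT and the graph from a dependency parser).

```python
import numpy as np

def gcn_layer(X, A, W):
    """One graph-convolution step, H = ReLU(A_norm @ X @ W),
    where A_norm is the symmetrically normalized adjacency with self-loops.
    X: (n_tokens, dim) token embeddings; A: (n_tokens, n_tokens) 0/1 arcs."""
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ X @ W, 0.0)    # ReLU

# Toy example: 4 tokens with 8-dim embeddings (stand-ins for BERT
# outputs) and edges from a hypothetical dependency parse.
rng = np.random.default_rng(0)
X = rng.normal(size=(4, 8))
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)     # undirected dependency arcs
W = rng.normal(size=(8, 8))

H = gcn_layer(X, A, W)
print(H.shape)  # (4, 8): dependency-aware token representations
```

In a joint model of this kind, the graph-convolved token states would typically be pooled and combined with BERT's sentence representation before the definition/non-definition classifier; the exact fusion used in the paper is not specified in this abstract.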