Explaining and Improving BERT Performance on Lexical Semantic Change Detection

03/12/2021
by Severin Laicher, et al.

Type- and token-based embedding architectures are still competing in lexical semantic change detection. The recent success of type-based models in SemEval-2020 Task 1 has raised the question of why the success of token-based models on a variety of other NLP tasks does not translate to this field. We investigate the influence of a range of variables on clusterings of BERT vectors and show that BERT's low performance is largely due to orthographic information about the target word, which is encoded even in the higher layers of its representations. By reducing the influence of orthography, we considerably improve BERT's performance.
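The abstract mentions clustering BERT token vectors for change detection. As a minimal sketch of the general token-based pipeline (not the paper's exact setup; the function name, K-means, and the Jensen-Shannon comparison are assumptions, and random vectors stand in for per-occurrence BERT embeddings), one can cluster a target word's usage vectors from two time periods and compare how each period distributes over the clusters:

```python
import numpy as np
from sklearn.cluster import KMeans
from scipy.spatial.distance import jensenshannon

def change_score(vecs_t1, vecs_t2, k=2, seed=0):
    """Cluster pooled usage vectors from both periods, then compare
    the two periods' distributions over clusters (higher = more change)."""
    pooled = np.vstack([vecs_t1, vecs_t2])
    labels = KMeans(n_clusters=k, n_init=10, random_state=seed).fit_predict(pooled)
    l1, l2 = labels[:len(vecs_t1)], labels[len(vecs_t1):]
    p = np.bincount(l1, minlength=k) / len(l1)
    q = np.bincount(l2, minlength=k) / len(l2)
    return jensenshannon(p, q)

# Stand-in vectors; in practice each row would be a contextualised BERT
# embedding of one occurrence of the target word.
rng = np.random.default_rng(0)
stable  = rng.normal(0, 0.1, (50, 8))   # one sense, present in both periods
shifted = rng.normal(3, 0.1, (50, 8))   # a different sense in period 2

print(change_score(stable, stable.copy()))  # low score: no change
print(change_score(stable, shifted))        # high score: sense shift
```

A real system would obtain the usage vectors from BERT's hidden states; the paper's point is that these vectors also encode the target word's surface form, so reducing orthographic cues (e.g. normalising the target token) improves the resulting clusterings.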


