MANTIS at TSAR-2022 Shared Task: Improved Unsupervised Lexical Simplification with Pretrained Encoders

12/19/2022
by Xiaofei Li et al.

In this paper we present our contribution to the TSAR-2022 Shared Task on Lexical Simplification at the EMNLP 2022 Workshop on Text Simplification, Accessibility, and Readability. Our approach builds on and extends LSBert, an unsupervised lexical simplification system with pretrained encoders, in the following ways: for the subtask of simplification candidate selection, it utilizes a RoBERTa transformer language model and expands the size of the generated candidate list; for the subsequent substitution ranking, it introduces a new feature weighting scheme and adopts a candidate filtering method based on textual entailment to maximize semantic similarity between the target word and its simplification. Our best-performing system improves LSBert by 5.9
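The abstract describes three stages: masked-LM candidate generation, entailment-based candidate filtering, and weighted feature ranking. The paper's own code is not shown on this page; below is a minimal, self-contained sketch of the filtering and ranking stages. All function names, scores, and weights are illustrative stand-ins (a real system would obtain them from a RoBERTa masked language model, an NLI model, and corpus statistics), not the authors' actual implementation.

```python
# Hypothetical sketch of entailment filtering plus weighted rank
# aggregation; every numeric score below is a made-up toy value.

def filter_by_entailment(candidates, entailment_score, threshold=0.5):
    """Drop candidates whose substitution is not judged semantically
    close enough to the original sentence by a (hypothetical) NLI score."""
    return [c for c in candidates if entailment_score[c] >= threshold]

def rank_candidates(candidates, features, weights):
    """Assign each candidate a per-feature rank (0 = best), sum the
    weighted ranks, and sort ascending, LSBert-style."""
    aggregate = {c: 0.0 for c in candidates}
    for name, scores in features.items():
        ordered = sorted(candidates, key=lambda c: scores[c], reverse=True)
        for rank, cand in enumerate(ordered):
            aggregate[cand] += weights[name] * rank
    return sorted(candidates, key=lambda c: aggregate[c])

# Toy example: candidate substitutions for one complex target word.
candidates = ["easy", "simple", "trivial"]
entailment_score = {"easy": 0.9, "simple": 0.8, "trivial": 0.2}  # toy NLI scores
kept = filter_by_entailment(candidates, entailment_score)        # drops "trivial"

features = {
    "lm_prob": {"easy": 0.6, "simple": 0.3},    # masked-LM probability (toy)
    "frequency": {"easy": 0.4, "simple": 0.7},  # corpus frequency (toy)
}
weights = {"lm_prob": 2.0, "frequency": 1.0}    # hypothetical feature weights
ranked = rank_candidates(kept, features, weights)
print(ranked)  # ['easy', 'simple']
```

Rank aggregation (rather than summing raw scores) keeps differently scaled features comparable; the weighting scheme then controls how much each feature's ordering influences the final ranking.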

Related research:

- 06/25/2020 · LSBert: A Simple Framework for Lexical Simplification. Lexical simplification (LS) aims to replace complex words in a given sen...
- 02/10/2021 · Language Models for Lexical Inference in Context. Lexical inference in context (LIiC) is the task of recognizing textual e...
- 03/10/2023 · Logic Against Bias: Textual Entailment Mitigates Stereotypical Sentence Reasoning. Due to their similarity-based learning objectives, pretrained sentence e...
- 07/14/2019 · A Simple BERT-Based Approach for Lexical Simplification. Lexical simplification (LS) aims to replace complex words in a given sen...
- 11/01/2021 · Unsupervised Discovery of Unaccusative and Unergative Verbs. We present an unsupervised method to detect English unergative and unacc...
- 07/06/2023 · LEA: Improving Sentence Similarity Robustness to Typos Using Lexical Attention Bias. Textual noise, such as typos or abbreviations, is a well-known issue tha...
- 10/02/2020 · Which *BERT? A Survey Organizing Contextualized Encoders. Pretrained contextualized text encoders are now a staple of the NLP comm...
