A Simple BERT-Based Approach for Lexical Simplification

07/14/2019
by Jipeng Qiang, et al.

Lexical simplification (LS) aims to replace complex words in a given sentence with simpler alternatives of equivalent meaning. Recent unsupervised LS approaches rely only on the complex word itself, regardless of the given sentence, to generate candidate substitutions, which inevitably produces a large number of spurious candidates. We present a simple BERT-based LS approach that exploits the pre-trained unsupervised deep bidirectional representations of BERT. We feed the given sentence, with the complex word masked, into BERT's masked language model to generate candidate substitutions. Because the whole sentence is taken into account, the generated alternatives are more likely to preserve the cohesion and coherence of the sentence. Experimental results show that our approach achieves substantial improvements on standard LS benchmarks.
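The core idea described above can be sketched in a few lines: mask the complex word in the sentence and ask BERT's masked language model for in-context replacements. This is a minimal illustration, not the authors' implementation; the model checkpoint, example sentence, and candidate count are assumptions.

```python
# Sketch of BERT-based candidate generation for lexical simplification:
# the complex word is masked, and the masked language model proposes
# substitutions conditioned on the whole sentence.
from transformers import pipeline

# bert-base-uncased is an assumed checkpoint for illustration.
fill_mask = pipeline("fill-mask", model="bert-base-uncased")

sentence = "The cat perched on the narrow ledge."
complex_word = "perched"

# Replace the complex word with the model's mask token.
masked = sentence.replace(complex_word, fill_mask.tokenizer.mask_token)

# Candidates ranked by the model's probability given the sentence context.
candidates = [pred["token_str"].strip() for pred in fill_mask(masked, top_k=10)]
print(candidates)
```

Because each candidate is scored in the context of the full sentence, context-inappropriate substitutes (e.g. rare senses of the complex word) tend to rank low, which is the advantage over approaches that generate candidates from the complex word alone.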


Related research:

- 06/25/2020 — LSBert: A Simple Framework for Lexical Simplification
- 10/14/2020 — Chinese Lexical Simplification
- 12/30/2020 — Enhancing Pre-trained Language Model with Lexical Simplification
- 12/15/2021 — Tracing Text Provenance via Context-Aware Lexical Substitution
- 05/14/2023 — ParaLS: Lexical Substitution via Pretrained Paraphraser
- 01/15/2022 — Automatic Lexical Simplification for Turkish
- 12/19/2022 — MANTIS at TSAR-2022 Shared Task: Improved Unsupervised Lexical Simplification with Pretrained Encoders
