Self-Supervised Knowledge Assimilation for Expert-Layman Text Style Transfer

10/06/2021
by   Wenda Xu, et al.

Expert-layman text style transfer technologies have the potential to improve communication between members of scientific communities and the general public. High-quality information produced by experts is often filled with difficult jargon that laypeople struggle to understand. This is a particularly notable issue in the medical domain, where laymen are often confused by medical text online. At present, two bottlenecks interfere with the goal of building high-quality medical expert-layman style transfer systems: a dearth of pretrained medical-domain language models spanning both expert and layman terminologies, and a lack of parallel corpora for training the transfer task itself. To mitigate the first issue, we propose a novel language model (LM) pretraining task, Knowledge Base Assimilation, which synthesizes pretraining data from the edges of a graph of expert- and layman-style medical terminology terms into an LM during self-supervised learning. To mitigate the second issue, we build a large-scale parallel corpus in the medical expert-layman domain using a margin-based criterion. Our experiments show that transformer-based models pretrained with knowledge base assimilation alongside other well-established pretraining tasks, then fine-tuned on our new parallel corpus, achieve considerable improvement on expert-layman transfer benchmarks, raising our human evaluation metric, the Overall Success Rate (OSR), by an average relative 106%.
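The margin-based criterion mentioned above for mining parallel expert-layman pairs can be sketched as follows. This is a minimal illustration with toy random embeddings, not the paper's implementation; the `margin_score` helper and the pool names are assumptions, following the ratio-margin formulation commonly used for bitext mining, where a candidate pair's cosine similarity is normalized by the average similarity of each sentence to its k nearest neighbours in the opposite pool:

```python
import numpy as np

def cosine(a, b):
    # Cosine similarity between two embedding vectors.
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def margin_score(x, y, xs, ys, k=4):
    # Ratio-margin score: cosine(x, y) divided by the mean cosine of
    # x and y to their k nearest neighbours in the opposite pools.
    nn_x = sorted((cosine(x, y2) for y2 in ys), reverse=True)[:k]
    nn_y = sorted((cosine(y, x2) for x2 in xs), reverse=True)[:k]
    denom = sum(nn_x) / (2 * k) + sum(nn_y) / (2 * k)
    return cosine(x, y) / denom

# Toy embedding pools standing in for expert and layman sentences.
rng = np.random.default_rng(0)
expert = rng.normal(size=(6, 8))
layman = rng.normal(size=(6, 8))

# Mine the highest-scoring layman candidate for each expert sentence;
# in practice pairs below a score threshold would be discarded.
pairs = []
for i, x in enumerate(expert):
    scores = [margin_score(x, y, expert, layman) for y in layman]
    j = int(np.argmax(scores))
    pairs.append((i, j, scores[j]))
```

The normalization penalizes "hub" sentences that are close to everything, so a pair scores highly only when its similarity stands out against each side's local neighbourhood.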


