
Self-Supervised Knowledge Assimilation for Expert-Layman Text Style Transfer

10/06/2021
by Wenda Xu, et al. (The Regents of the University of California)

Expert-layman text style transfer technologies have the potential to improve communication between members of scientific communities and the general public. High-quality information produced by experts is often filled with difficult jargon that laypeople struggle to understand. This is a particularly notable issue in the medical domain, where laymen are often confused by medical text online. At present, two bottlenecks interfere with the goal of building high-quality medical expert-layman style transfer systems: a dearth of pretrained medical-domain language models spanning both expert and layman terminologies, and a lack of parallel corpora for training the transfer task itself. To mitigate the first issue, we propose a novel language model (LM) pretraining task, Knowledge Base Assimilation, to synthesize pretraining data from the edges of a graph of expert- and layman-style medical terminology terms into an LM during self-supervised learning. To mitigate the second issue, we build a large-scale parallel corpus in the medical expert-layman domain using a margin-based criterion. Our experiments show that transformer-based models pretrained on knowledge base assimilation and other well-established pretraining tasks, then fine-tuned on our new parallel corpus, lead to considerable improvement against expert-layman transfer benchmarks, gaining an average relative improvement of 106% on our human evaluation metric, the Overall Success Rate (OSR).
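The margin-based criterion is only named in the abstract, not specified. As a rough illustration, the following is a minimal sketch (not the authors' code) of ratio-margin scoring in the style of Artetxe and Schwenk (2019) for mining candidate expert-layman sentence pairs from two non-parallel corpora. It assumes the expert and layman sentences have already been embedded with a shared sentence encoder; the neighbourhood size k and the score threshold are illustrative placeholders, not values from the paper.

```python
import numpy as np

def cosine_matrix(a, b):
    """Pairwise cosine similarity between rows of a (n x d) and b (m x d)."""
    a = a / np.linalg.norm(a, axis=1, keepdims=True)
    b = b / np.linalg.norm(b, axis=1, keepdims=True)
    return a @ b.T

def margin_scores(expert_emb, layman_emb, k=4):
    """Ratio-margin score: cos(x, y) divided by the average similarity of
    each sentence to its k nearest neighbours on the other side."""
    sim = cosine_matrix(expert_emb, layman_emb)        # n x m
    # mean similarity of each expert sentence to its k nearest layman sentences
    knn_x = np.sort(sim, axis=1)[:, -k:].mean(axis=1)  # shape (n,)
    # mean similarity of each layman sentence to its k nearest expert sentences
    knn_y = np.sort(sim, axis=0)[-k:, :].mean(axis=0)  # shape (m,)
    denom = (knn_x[:, None] + knn_y[None, :]) / 2.0
    return sim / denom

def mine_pairs(expert_emb, layman_emb, threshold=1.05, k=4):
    """Keep mutual best matches whose margin score clears the threshold."""
    scores = margin_scores(expert_emb, layman_emb, k=k)
    best_y = scores.argmax(axis=1)  # best layman index for each expert sentence
    best_x = scores.argmax(axis=0)  # best expert index for each layman sentence
    pairs = []
    for i, j in enumerate(best_y):
        if best_x[j] == i and scores[i, j] >= threshold:
            pairs.append((i, j, float(scores[i, j])))
    return pairs
```

Under this kind of criterion, mutual nearest neighbours whose margin score exceeds the threshold would form the candidate expert-layman sentence pairs of a pseudo-parallel corpus.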


Related research:

05/18/2022 · Exploiting Social Media Content for Self-Supervised Style Transfer
Recent research on style transfer takes inspiration from unsupervised ne...

04/18/2022 · Non-Parallel Text Style Transfer with Self-Parallel Supervision
The performance of existing text style transfer models is severely limit...

05/02/2020 · Expertise Style Transfer: A New Task Towards Better Communication between Experts and Laymen
The curse of knowledge can impede communication between experts and laym...

12/06/2021 · VAE based Text Style Transfer with Pivot Words Enhancement Learning
Text Style Transfer (TST) aims to alter the underlying style of the sour...

10/08/2019 · Prose for a Painting
Painting captions are often dry and simplistic which motivates us to des...

05/13/2023 · How to Train Your CheXDragon: Training Chest X-Ray Models for Transfer to Novel Tasks and Healthcare Systems
Self-supervised learning (SSL) enables label efficient training for mach...

11/29/2022 · MoDA: Map style transfer for self-supervised Domain Adaptation of embodied agents
We propose a domain adaptation method, MoDA, which adapts a pretrained e...