Clinical Text Generation through Leveraging Medical Concept and Relations

10/02/2019
by Wangjin Lee, et al.

Using a neural sequence generation model, this study aims to develop a method for writing patient clinical texts given a brief medical history. As a proof of concept, we demonstrate that medical concept embeddings can be effectively used in clinical text generation. Our model is based on the Sequence-to-Sequence architecture and trained on a large set of de-identified clinical text data. Quantitative results show that our concept embedding method decreased the perplexity of the baseline architecture. We also discuss results from a human evaluation performed by medical doctors.
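The abstract describes two ideas that can be illustrated concretely: feeding a concept embedding alongside the word embedding as encoder input, and measuring language-model quality with perplexity. The sketch below is a minimal illustration, not the authors' implementation; the tiny embedding tables and the UMLS-style concept identifiers are hypothetical placeholders, and real systems would use learned, high-dimensional embeddings inside a Seq2Seq encoder.

```python
import math

# Hypothetical tiny embedding tables (real models learn these end-to-end).
word_emb = {"fever": [0.2, 0.1], "cough": [0.4, 0.3]}
concept_emb = {"C0015967": [0.9, 0.5], "C0010200": [0.7, 0.2]}  # illustrative CUI-like keys

def encode(token, concept_id):
    """Concatenate the word vector with its medical concept vector,
    forming one enriched input representation for the encoder."""
    return word_emb[token] + concept_emb[concept_id]

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-likelihood per token;
    lower values mean the model assigns higher probability to the text."""
    nll = -sum(math.log(p) for p in token_probs) / len(token_probs)
    return math.exp(nll)

vec = encode("fever", "C0015967")        # 4-dim concatenated input vector
ppl = perplexity([0.5, 0.25, 0.125])     # perplexity of a 3-token sequence
```

A model that improves the per-token probabilities it assigns to held-out clinical text lowers this perplexity, which is the quantitative comparison the abstract reports against the baseline.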

Related research

- c-TextGen: Conditional Text Generation for Harmonious Human-Machine Interaction (09/08/2019)
- A Graph-to-Sequence Model for AMR-to-Text Generation (05/07/2018)
- Unsupervised Text Generation from Structured Data (04/20/2019)
- Automated Spelling Correction for Clinical Text Mining in Russian (04/10/2020)
- Generating Continuous Representations of Medical Texts (05/15/2018)
- SurfCon: Synonym Discovery on Privacy-Aware Clinical Data (06/21/2019)
- Is In-hospital Meta-information Useful for Abstractive Discharge Summary Generation? (03/10/2023)