UW-BHI at MEDIQA 2019: An Analysis of Representation Methods for Medical Natural Language Inference

07/09/2019
by William R. Kearns, et al.

Recent advances in distributed language modeling have led to large performance gains on a variety of natural language processing (NLP) tasks. However, it is not well understood how these methods may be augmented by knowledge-based approaches. This paper compares the performance and internal representations of an Enhanced Sequential Inference Model (ESIM) across three experimental conditions that differ only in the representation method used for input: Bidirectional Encoder Representations from Transformers (BERT), Embeddings of Semantic Predications (ESP), or Cui2Vec. The methods were evaluated on the Medical Natural Language Inference (MedNLI) subtask of the MEDIQA 2019 shared task. This task relied heavily on semantic understanding and thus served as a suitable evaluation set for comparing these representation methods.
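To make the experimental setup concrete, the sketch below shows one way an ESIM-style model can be fed pre-computed token representations (e.g., BERT, ESP, or Cui2Vec vectors), so that only the representation layer changes between conditions. This is not the authors' implementation; the class name, hidden size, label count, and tensor shapes are illustrative assumptions.

```python
# Minimal ESIM-style NLI sketch (assumed code, not from the paper). The input
# representation layer is external: premise/hypothesis arrive as pre-computed
# token vectors from BERT, ESP, or Cui2Vec, so swapping methods leaves the
# inference model unchanged.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ESIMSketch(nn.Module):
    def __init__(self, embed_dim: int, hidden_dim: int = 300, num_classes: int = 3):
        super().__init__()
        # Input encoding: BiLSTM over the pre-computed token representations.
        self.input_encoder = nn.LSTM(embed_dim, hidden_dim,
                                     bidirectional=True, batch_first=True)
        # Inference composition: BiLSTM over the enhanced local-inference features.
        self.composition = nn.LSTM(8 * hidden_dim, hidden_dim,
                                   bidirectional=True, batch_first=True)
        self.classifier = nn.Sequential(
            nn.Linear(8 * hidden_dim, hidden_dim), nn.ReLU(),
            nn.Linear(hidden_dim, num_classes),
        )

    def forward(self, premise: torch.Tensor, hypothesis: torch.Tensor) -> torch.Tensor:
        # premise/hypothesis: (batch, seq_len, embed_dim) token vectors.
        a, _ = self.input_encoder(premise)      # (batch, len_a, 2*hidden)
        b, _ = self.input_encoder(hypothesis)   # (batch, len_b, 2*hidden)

        # Soft alignment (attention) between premise and hypothesis tokens.
        attn = torch.bmm(a, b.transpose(1, 2))                        # (batch, len_a, len_b)
        a_tilde = torch.bmm(F.softmax(attn, dim=2), b)                 # premise aligned to hypothesis
        b_tilde = torch.bmm(F.softmax(attn, dim=1).transpose(1, 2), a) # hypothesis aligned to premise

        # Enhancement: concatenation, difference, and element-wise product features.
        m_a = torch.cat([a, a_tilde, a - a_tilde, a * a_tilde], dim=-1)
        m_b = torch.cat([b, b_tilde, b - b_tilde, b * b_tilde], dim=-1)

        v_a, _ = self.composition(m_a)
        v_b, _ = self.composition(m_b)

        # Pooling: average and max over time for both sentences, then classify.
        v = torch.cat([v_a.mean(1), v_a.max(1).values,
                       v_b.mean(1), v_b.max(1).values], dim=-1)
        return self.classifier(v)


# Example usage with 768-dimensional token vectors (a typical BERT-base size,
# assumed here): predict entailment / neutral / contradiction logits.
model = ESIMSketch(embed_dim=768)
premise = torch.randn(2, 20, 768)      # batch of 2 premises, 20 tokens each
hypothesis = torch.randn(2, 12, 768)   # batch of 2 hypotheses, 12 tokens each
logits = model(premise, hypothesis)    # shape (2, 3)
```

Because the representation layer sits outside the model, comparing BERT, ESP, and Cui2Vec reduces to changing how the input tensors are produced, which is the kind of controlled comparison the abstract describes.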


