DoubleTransfer at MEDIQA 2019: Multi-Source Transfer Learning for Natural Language Understanding in the Medical Domain

06/11/2019
by Yichong Xu, et al.

This paper describes the system we submitted to the MEDIQA 2019 shared task. We use a multi-source transfer learning approach to transfer knowledge from MT-DNN and SciBERT to natural language understanding tasks in the medical domain. During fine-tuning, we apply multi-task learning across NLI, RQE, and QA tasks in both the general and medical domains to improve performance. The proposed methods prove effective for natural language understanding in the medical domain, and we rank first on the QA task.
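The abstract describes multi-task fine-tuning of a shared pretrained encoder with task-specific heads for NLI, RQE, and QA. Below is a minimal, hedged sketch of that training pattern; the encoder, heads, and task-sampling scheme are toy stand-ins for illustration, not the authors' actual code or the MT-DNN/SciBERT APIs.

```python
import random

random.seed(0)  # reproducible task sampling for this demo

TASKS = ["NLI", "RQE", "QA"]  # the three MEDIQA task types

def shared_encode(text, dim=8):
    # Stand-in for a pretrained shared encoder such as MT-DNN or
    # SciBERT: maps input text to a fixed-size representation.
    reps = [float(len(tok)) for tok in text.split()[:dim]]
    return reps + [0.0] * (dim - len(reps))

def task_head(task, rep):
    # One lightweight task-specific head per task; here just a
    # dummy scoring function for illustration.
    return sum(rep) / (TASKS.index(task) + 1)

def multitask_step(batches):
    # One multi-task training step: sample a task, then run that
    # task's batch through the shared encoder and its head. Over
    # many steps, every task contributes to the shared parameters.
    task = random.choice(TASKS)
    scores = [task_head(task, shared_encode(x)) for x in batches[task]]
    return task, scores

batches = {
    "NLI": ["premise and hypothesis pair"],
    "RQE": ["question one question two"],
    "QA":  ["question with candidate answer"],
}
task, scores = multitask_step(batches)
```

In the actual system, each step would compute a task-specific loss and backpropagate through both the head and the shared encoder, so knowledge transfers across tasks via the shared parameters.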


Related research

11/05/2020  Language Model is All You Need: Natural Language Understanding as Question Answering
03/11/2019  Practical Semantic Parsing for Spoken Language Understanding
04/20/2018  GLUE: A Multi-Task Benchmark and Analysis Platform for Natural Language Understanding
07/12/2023  Prototypical Contrastive Transfer Learning for Multimodal Language Understanding
10/13/2021  Winning the ICCV'2021 VALUE Challenge: Task-aware Ensemble and Transfer Learning with Visual Concepts
04/23/2018  Dropping Networks for Transfer Learning
05/23/2022  Vector-Quantized Input-Contextualized Soft Prompts for Natural Language Understanding
