Task Transfer and Domain Adaptation for Zero-Shot Question Answering

06/14/2022
by Xiang Pan, et al.

Pretrained language models have shown success in various areas of natural language processing, including reading comprehension tasks. However, when applying machine learning methods to new domains, labeled data may not always be available. To address this, we use supervised pretraining on source-domain data to reduce sample complexity on domain-specific downstream tasks. We evaluate zero-shot performance on domain-specific reading comprehension tasks by combining task transfer with domain adaptation, fine-tuning a pretrained model with no labeled data from the target task. Our approach outperforms Domain-Adaptive Pretraining on downstream domain-specific reading comprehension tasks in 3 out of 4 domains.
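
The abstract describes a two-stage recipe: adapt a pretrained encoder to the target domain using unlabeled text, transfer the QA task from a labeled source-domain dataset, and then evaluate zero-shot on the target-domain task. Below is a minimal sketch of how these stages compose, assuming Hugging Face Transformers; the base checkpoint (roberta-base), the source QA dataset (SQuAD), the local checkpoint path, and the ordering of the stages are illustrative assumptions, not the authors' exact setup.

```python
# Minimal sketch of combining domain adaptation with task transfer for
# zero-shot QA. Assumptions (not from the paper): roberta-base as the base
# checkpoint, SQuAD as the labeled source-domain QA dataset, and this
# particular ordering of the two stages.
from transformers import (
    AutoModelForMaskedLM,
    AutoModelForQuestionAnswering,
    AutoTokenizer,
    pipeline,
)

BASE = "roberta-base"
ADAPTED_DIR = "roberta-domain-adapted"  # hypothetical local path

tokenizer = AutoTokenizer.from_pretrained(BASE)

# Stage 1 (domain adaptation): continue masked-language-model pretraining on
# unlabeled target-domain text. The training loop itself is omitted; a
# standard MLM objective would go here.
mlm_model = AutoModelForMaskedLM.from_pretrained(BASE)
# ... run MLM training on the raw target-domain corpus ...
mlm_model.save_pretrained(ADAPTED_DIR)
tokenizer.save_pretrained(ADAPTED_DIR)

# Stage 2 (task transfer): put an extractive-QA head on the domain-adapted
# encoder and fine-tune it on labeled *source-domain* QA data (e.g. SQuAD).
# No labeled data from the target task is used at any point.
qa_model = AutoModelForQuestionAnswering.from_pretrained(ADAPTED_DIR)
# ... fine-tune qa_model on the source-domain QA dataset ...

# Stage 3: zero-shot evaluation on the domain-specific QA task.
qa = pipeline("question-answering", model=qa_model, tokenizer=tokenizer)
print(qa(question="What does DAPT stand for?",
         context="DAPT is short for Domain-Adaptive Pretraining."))
```

The paper's baseline, Domain-Adaptive Pretraining, corresponds to Stage 1 alone; the sketch above only illustrates how the task-transfer stage is layered on top of it.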

Related research

11/25/2019 · Unsupervised Domain Adaptation of Language Models for Reading Comprehension
This study tackles unsupervised domain adaptation of reading comprehensi...

05/15/2022 · Not to Overfit or Underfit? A Study of Domain Generalization in Question Answering
Machine learning models are prone to overfitting their source (training)...

06/30/2021 · Zero-Shot Estimation of Base Models' Weights in Ensemble of Machine Reading Comprehension Systems for Robust Generalization
One of the main challenges of the machine reading comprehension (MRC) mo...

03/30/2022 · Clozer: Adaptable Data Augmentation for Cloze-style Reading Comprehension
Task-adaptive pre-training (TAPT) alleviates the lack of labelled data a...

03/30/2023 · Whether and When does Endoscopy Domain Pretraining Make Sense?
Automated endoscopy video analysis is a challenging task in medical comp...

05/24/2023 · UniChart: A Universal Vision-language Pretrained Model for Chart Comprehension and Reasoning
Charts are very popular for analyzing data, visualizing key insights and...

11/01/2019 · Forget Me Not: Reducing Catastrophic Forgetting for Domain Adaptation in Reading Comprehension
The creation of large-scale open domain reading comprehension data sets ...
