End-to-End Synthetic Data Generation for Domain Adaptation of Question Answering Systems

10/12/2020
by Siamak Shakeri, et al.

We propose an end-to-end approach for synthetic QA data generation. Our model comprises a single transformer-based encoder-decoder network that is trained end-to-end to generate both answers and questions. In a nutshell, we feed a passage to the encoder and ask the decoder to generate a question and an answer token-by-token. The likelihood produced in the generation process is used as a filtering score, which avoids the need for a separate filtering model. Our generator is trained by fine-tuning a pretrained LM using maximum likelihood estimation. The experimental results indicate significant improvements in the domain adaptation of QA models, outperforming current state-of-the-art methods.
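The abstract describes the pipeline at a high level: a single encoder-decoder LM, fine-tuned with maximum likelihood, reads a passage and decodes a question followed by an answer, and the generation likelihood doubles as the filtering score. The sketch below illustrates that inference-and-filtering loop with the Hugging Face transformers API; the checkpoint name, the output format, and the use of beam-search sequence scores as the likelihood filter are illustrative assumptions, not details taken from the paper or its released code.

```python
# Minimal sketch of end-to-end QA-pair generation with likelihood-based filtering.
# Assumptions (not from the paper's code): CHECKPOINT is a placeholder for a seq2seq
# LM already fine-tuned with MLE to map a passage to a "question ... answer ..."
# string, and beam-search sequence scores stand in for the generation likelihood
# that the paper uses as the filtering score.
import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

CHECKPOINT = "your-org/qa-pair-generator"  # hypothetical fine-tuned checkpoint
tokenizer = AutoTokenizer.from_pretrained(CHECKPOINT)
model = AutoModelForSeq2SeqLM.from_pretrained(CHECKPOINT).eval()


@torch.no_grad()
def generate_qa_pairs(passage: str, num_candidates: int = 4, min_score: float = -1.0):
    """Decode candidate question+answer strings for a passage and keep those
    whose beam-search log-likelihood score exceeds a threshold."""
    inputs = tokenizer(passage, return_tensors="pt", truncation=True, max_length=512)
    outputs = model.generate(
        **inputs,
        num_beams=num_candidates,
        num_return_sequences=num_candidates,
        max_new_tokens=64,
        return_dict_in_generate=True,
        output_scores=True,
    )
    kept = []
    # sequences_scores holds the (length-penalized) sum of token log-probs per
    # returned beam; it serves here as the filter score, so no separate filtering
    # model is needed.
    for seq, score in zip(outputs.sequences, outputs.sequences_scores):
        text = tokenizer.decode(seq, skip_special_tokens=True)
        if score.item() >= min_score:
            kept.append((score.item(), text))
    return sorted(kept, reverse=True)


if __name__ == "__main__":
    passage = (
        "The Amazon rainforest spans nine countries and is estimated to hold "
        "about 390 billion individual trees."
    )
    for score, qa in generate_qa_pairs(passage):
        print(f"{score:.3f}\t{qa}")
```

The abstract only states that the generation likelihood is used as a filtering score; the specific threshold, decoding strategy, and score normalization above are placeholders chosen for the sketch.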

Related research

End-to-End QA on COVID-19: Domain Adaptation with Synthetic Training (12/02/2020)
End-to-end question answering (QA) requires both information retrieval (...

Contrastive Domain Adaptation for Question Answering using Limited Text Corpora (08/31/2021)
Question generation has recently shown impressive results in customizing...

Retrieval as Attention: End-to-end Learning of Retrieval and Reading within a Single Transformer (12/05/2022)
Systems for knowledge-intensive tasks such as open-domain question answe...

Encoder Adaptation of Dense Passage Retrieval for Open-Domain Question Answering (10/04/2021)
One key feature of dense passage retrievers (DPR) is the use of separate...

Parameter-Efficient Abstractive Question Answering over Tables or Text (04/07/2022)
A long-term ambition of information seeking QA systems is to reason over...

Synthetic QA Corpora Generation with Roundtrip Consistency (06/12/2019)
We introduce a novel method of generating synthetic question answering c...

Few-Shot Learning of an Interleaved Text Summarization Model by Pretraining with Synthetic Data (03/08/2021)
Interleaved texts, where posts belonging to different threads occur in a...
