Pre-training Transformer Models with Sentence-Level Objectives for Answer Sentence Selection

05/20/2022
by   Luca Di Liello, et al.

An important task in designing QA systems is answer sentence selection (AS2): selecting the sentence that contains (or constitutes) the answer to a question from a set of retrieved relevant documents. In this paper, we propose three novel sentence-level transformer pre-training objectives that incorporate paragraph-level semantics within and across documents, to improve the performance of transformers for AS2 and to reduce the need for large labeled datasets. Our experiments on three public AS2 datasets and one industrial AS2 dataset demonstrate the empirical superiority of our pre-trained transformers over baseline models such as RoBERTa and ELECTRA for AS2.
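As a rough illustration of the AS2 task the abstract describes, the sketch below ranks candidate sentences for a question and returns the top-scoring one. The word-overlap scorer is a hypothetical stand-in for the paper's actual method, which fine-tunes a pre-trained transformer to score (question, sentence) pairs; only the ranking structure of the task is meant to be faithful here.

```python
import re

def overlap_score(question: str, sentence: str) -> float:
    """Stand-in relevance score (Jaccard word overlap).

    In the paper's setting this would be a fine-tuned transformer
    (e.g. RoBERTa or ELECTRA) scoring the (question, sentence) pair.
    """
    q = set(re.findall(r"\w+", question.lower()))
    s = set(re.findall(r"\w+", sentence.lower()))
    return len(q & s) / len(q | s) if q | s else 0.0

def select_answer(question: str, candidates: list[str]) -> str:
    """AS2: return the candidate sentence with the highest relevance score."""
    return max(candidates, key=lambda s: overlap_score(question, s))

question = "When was the Eiffel Tower built?"
candidates = [
    "The Eiffel Tower is located in Paris.",
    "The Eiffel Tower was built between 1887 and 1889.",
    "Paris is the capital of France.",
]
print(select_answer(question, candidates))
```

Swapping the stand-in scorer for a trained cross-encoder leaves the selection logic unchanged, which is why better pre-training objectives translate directly into better AS2 accuracy.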


Related research:

- 05/24/2023, Context-Aware Transformer Pre-Training for Answer Sentence Selection: Answer Sentence Selection (AS2) is a core component for building an accu...
- 05/02/2022, Paragraph-based Transformer Pre-training for Multi-Sentence Inference: Inference tasks such as answer sentence selection (AS2) or fact verifica...
- 10/16/2020, Unsupervised Extractive Summarization by Pre-training Hierarchical Transformers: Unsupervised extractive document summarization aims to select important ...
- 06/01/2020, Context-based Transformer Models for Answer Sentence Selection: An important task for the design of Question Answering systems is the se...
- 06/09/2023, FPDM: Domain-Specific Fast Pre-training Technique using Document-Level Metadata: Pre-training Transformers has shown promising results on open-domain and...
- 05/30/2019, A Compare-Aggregate Model with Latent Clustering for Answer Selection: In this paper, we propose a novel method for a sentence-level answer-sel...
- 03/17/2022, DP-KB: Data Programming with Knowledge Bases Improves Transformer Fine Tuning for Answer Sentence Selection: While transformers demonstrate impressive performance on many knowledge ...
