Transformers to Learn Hierarchical Contexts in Multiparty Dialogue for Span-based Question Answering

04/07/2020
by   Changmao Li, et al.

We introduce a novel approach to transformers that learns hierarchical representations in multiparty dialogue. First, three language modeling tasks are used to pre-train the transformers: token- and utterance-level language modeling and utterance order prediction, so that both token and utterance embeddings are learned for a better understanding of dialogue contexts. Then, multi-task learning between utterance prediction and token span prediction is applied to fine-tune the model for span-based question answering (QA). Our approach is evaluated on the FriendsQA dataset and shows improvements of 3.8% and 1.4% over the two state-of-the-art transformer models, BERT and RoBERTa, respectively.
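The sketch below is a minimal illustration (not the authors' released code) of the multi-task fine-tuning stage described above: a pre-trained transformer encodes the dialogue, and two heads are trained jointly, one scoring which utterance contains the answer and one predicting the answer's token span. The choice of BERT as the encoder, the mean-pooled utterance embeddings, the `utterance_mask` encoding, and the equal loss weighting are all assumptions for illustration.

```python
# Minimal sketch of multi-task fine-tuning for span-based QA on dialogue.
# Layer sizes, pooling, and loss weighting are illustrative assumptions.
import torch
import torch.nn as nn
from transformers import AutoModel


class DialogueSpanQA(nn.Module):
    def __init__(self, encoder_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        hidden = self.encoder.config.hidden_size
        # Head 1: utterance prediction -- scores each utterance as the one
        # containing the answer (utterance embeddings are pooled from tokens).
        self.utterance_head = nn.Linear(hidden, 1)
        # Head 2: token span prediction -- start/end logits per token,
        # as in standard span-based QA.
        self.span_head = nn.Linear(hidden, 2)

    def forward(self, input_ids, attention_mask, utterance_mask):
        # utterance_mask: (batch, n_utterances, seq_len) 0/1 mask mapping
        # each token to the utterance it belongs to (illustrative encoding).
        token_repr = self.encoder(
            input_ids=input_ids, attention_mask=attention_mask
        ).last_hidden_state                                        # (B, T, H)

        # Mean-pool token embeddings within each utterance.
        counts = utterance_mask.sum(dim=-1, keepdim=True).clamp(min=1)
        utt_repr = utterance_mask.float() @ token_repr / counts    # (B, U, H)

        utterance_logits = self.utterance_head(utt_repr).squeeze(-1)   # (B, U)
        start_logits, end_logits = self.span_head(token_repr).split(1, dim=-1)
        return utterance_logits, start_logits.squeeze(-1), end_logits.squeeze(-1)


def multitask_loss(outputs, utterance_gold, start_gold, end_gold):
    """Joint loss over the two fine-tuning tasks (equal weighting assumed)."""
    utterance_logits, start_logits, end_logits = outputs
    ce = nn.CrossEntropyLoss()
    return (ce(utterance_logits, utterance_gold)
            + ce(start_logits, start_gold)
            + ce(end_logits, end_gold))
```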

