
Improving Conditioning in Context-Aware Sequence to Sequence Models

by Xinyi Wang et al.
Carnegie Mellon University

Neural sequence-to-sequence models are well established for applications that can be cast as mapping a single input sequence into a single output sequence. In this work, we focus on cases where generation is conditioned on both a short query and a long context, such as abstractive question answering or document-level translation. We modify the standard sequence-to-sequence approach to make better use of both the query and the context by expanding the conditioning mechanism to intertwine query and context attention. We also introduce a simple and efficient data augmentation method for the proposed model. Experiments on three different tasks show that both changes lead to consistent improvements.
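The abstract does not spell out the conditioning mechanism, but the idea of intertwining query and context attention can be illustrated with a minimal sketch: the decoder state first attends over the short query, and the resulting query summary then steers attention over the long context. All function and variable names below are illustrative assumptions, not the authors' implementation, and the combination step (simple addition) is a placeholder for whatever learned mixing the model actually uses.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax.
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def attend(probe, keys):
    """Dot-product attention: score each key against the probe vector,
    normalize, and return the weighted sum plus the weights."""
    scores = keys @ probe            # (n,)
    weights = softmax(scores)        # (n,) non-negative, sums to 1
    return weights @ keys, weights   # summary vector, attention weights

def intertwined_step(dec_state, query_states, context_states):
    """One decoder step conditioned on both query and context:
    1) attend over the short query,
    2) mix the query summary into the decoder state (placeholder: addition),
    3) use the mixed vector to attend over the long context."""
    q_summary, _ = attend(dec_state, query_states)
    mixed = dec_state + q_summary
    c_summary, c_weights = attend(mixed, context_states)
    return np.concatenate([dec_state, q_summary, c_summary]), c_weights

rng = np.random.default_rng(0)
d = 8
dec = rng.normal(size=d)                 # current decoder state
query = rng.normal(size=(3, d))          # short query sequence (3 tokens)
context = rng.normal(size=(50, d))       # long context sequence (50 tokens)
features, ctx_weights = intertwined_step(dec, query, context)
```

In this toy version the concatenated features would feed the decoder's output layer; the point is only that context attention is conditioned on the query summary rather than computed independently.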


Related papers:

Sequence to Multi-Sequence Learning via Conditional Chain Mapping for Mixture Signals

Neural sequence-to-sequence models are well established for applications...

Copy this Sentence

Attention is an operation that selects some largest element from some se...

Sequence to Sequence Learning for Query Expansion

Using sequence to sequence algorithms for query expansion has not been e...

Seq2Seq-Vis: A Visual Debugging Tool for Sequence-to-Sequence Models

Neural Sequence-to-Sequence models have proven to be accurate and robust...

Imperial College London Submission to VATEX Video Captioning Task

This paper describes the Imperial College London team's submission to th...

Plan, Attend, Generate: Planning for Sequence-to-Sequence Models

We investigate the integration of a planning mechanism into sequence-to-...

Using Local Knowledge Graph Construction to Scale Seq2Seq Models to Multi-Document Inputs

Query-based open-domain NLP tasks require information synthesis from lon...