CopyNext: Explicit Span Copying and Alignment in Sequence-to-Sequence Models

10/28/2020
by Abhinav Singh, et al.

Copy mechanisms are employed in sequence-to-sequence (seq2seq) models to reproduce words from the input in the output. These mechanisms, operating at the lexical type level, fail to provide an explicit alignment recording where each token was copied from. Further, they require contiguous token sequences from the input (spans) to be copied token by token. We present a model with an explicit token-level copy operation and extend it to copying entire spans. Our model provides hard alignments between spans in the input and output, allowing for nontraditional applications of seq2seq, like information extraction. We demonstrate the approach on Nested Named Entity Recognition, achieving near state-of-the-art accuracy with an order of magnitude increase in decoding speed.
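The core idea can be sketched in a few lines. Below is a minimal, illustrative decoding loop in which each step is either a vocabulary generation, an explicit copy of a source position, or a "copy next" action that copies the token immediately after the last copied position; chaining copy-next actions copies a whole span while recording a hard alignment for every copied token. The action names and data layout here are assumptions for illustration, not the paper's actual API or model.

```python
def decode(source, actions):
    """Replay a sequence of decoding actions into output tokens plus
    hard alignments back to the source.

    Action formats (assumed for illustration):
      ("gen", tok)   -- generate `tok` from the vocabulary (no alignment)
      ("copy", i)    -- copy source[i], recording the alignment i
      ("copynext",)  -- copy the token right after the last copied position
    """
    output, alignments = [], []
    last = None  # source index of the most recently copied token
    for act in actions:
        if act[0] == "gen":
            output.append(act[1])
            alignments.append(None)
        elif act[0] == "copy":
            last = act[1]
            output.append(source[last])
            alignments.append(last)
        elif act[0] == "copynext":
            last += 1  # span copying: advance one position in the source
            output.append(source[last])
            alignments.append(last)
    return output, alignments


# Toy example: extract two entity spans with hard alignments.
src = ["Barack", "Obama", "visited", "Paris"]
acts = [("copy", 0), ("copynext",), ("gen", "[PER]"),
        ("copy", 3), ("gen", "[LOC]")]
out, align = decode(src, acts)
# out   -> ["Barack", "Obama", "[PER]", "Paris", "[LOC]"]
# align -> [0, 1, None, 3, None]
```

Because each copied token carries its source index, the output directly encodes which input span each extracted entity came from, which is what makes seq2seq usable for tasks like nested NER that need span-level grounding.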


Related research

- 08/18/2021: De-identification of Unstructured Clinical Texts from Sequence to Sequence Perspective
- 06/08/2020: Copy that! Editing Sequences by Copying Spans
- 07/16/2017: End-to-End Information Extraction without Token-Level Supervision
- 01/14/2022: Sequence-to-Sequence Models for Extracting Information from Registration and Legal Documents
- 03/18/2020: TTTTTackling WinoGrande Schemas
- 02/05/2023: Unleashing the True Potential of Sequence-to-Sequence Models for Sequence Tagging and Structure Parsing
- 04/23/2017: Differentiable Scheduled Sampling for Credit Assignment
