Consistent Multiple Sequence Decoding

04/02/2020
by Bicheng Xu, et al.

Sequence decoding is one of the core components of most visual-lingual models. However, typical neural decoders, when faced with decoding multiple, possibly correlated, sequences of tokens, resort to simple independent decoding schemes. In this paper, we introduce a consistent multiple sequence decoding architecture which, while relatively simple, is general and allows for consistent and simultaneous decoding of an arbitrary number of sequences. Our formulation utilizes a consistency fusion mechanism, implemented using message passing in a Graph Neural Network (GNN), to aggregate context from related decoders. This context is then utilized as a secondary input, in addition to the previously generated output, to make a prediction at a given step of decoding. Self-attention, in the GNN, is used to modulate the fusion mechanism locally at each node and each step in the decoding process. We show the efficacy of our consistent multiple sequence decoder on the task of dense relational image captioning, illustrating state-of-the-art performance (+5.2%) on the task. More importantly, we show that the decoded sentences for the same regions are more consistent (improvement of 9.5%), while maintaining diversity across images and regions.
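The abstract describes the mechanism only at a high level; the details are in the full paper. As a rough illustration of the idea (not the authors' implementation), the sketch below shows one step of such a decoder in PyTorch: the hidden states of N per-region decoders are treated as nodes of a fully connected graph, one round of self-attention-weighted message passing produces a fused context per node, and that context is fed into each decoder alongside the previously generated token. The module names, dimensions, single message-passing round, and choice of a GRU cell are all illustrative assumptions.

```python
# Minimal sketch of one step of consistent multiple sequence decoding.
# Assumptions (not from the paper): one GRU decoder cell shared across the
# N sequences, a fully connected graph over decoder states, and a single
# round of scaled dot-product self-attention as the "consistency fusion".
import torch
import torch.nn as nn
import torch.nn.functional as F


class ConsistentMultiDecoderStep(nn.Module):
    def __init__(self, vocab_size, embed_dim=256, hidden_dim=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        # Decoder input = previous token embedding + fused context.
        self.cell = nn.GRUCell(embed_dim + hidden_dim, hidden_dim)
        # Projections for self-attention over neighboring decoder states.
        self.query = nn.Linear(hidden_dim, hidden_dim)
        self.key = nn.Linear(hidden_dim, hidden_dim)
        self.value = nn.Linear(hidden_dim, hidden_dim)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def fuse(self, h):
        # h: (N, hidden_dim) -- current hidden state of every decoder,
        # treated as the nodes of a fully connected graph.
        q, k, v = self.query(h), self.key(h), self.value(h)
        scores = q @ k.t() / (h.size(-1) ** 0.5)  # (N, N) pairwise scores
        attn = F.softmax(scores, dim=-1)          # per-node modulation
        return attn @ v                           # aggregated context

    def forward(self, prev_tokens, h):
        # prev_tokens: (N,) previously generated token id per sequence.
        context = self.fuse(h)                    # one message-passing round
        x = torch.cat([self.embed(prev_tokens), context], dim=-1)
        h_next = self.cell(x, h)                  # advance every decoder
        logits = self.out(h_next)                 # next-token scores
        return logits, h_next


if __name__ == "__main__":
    step = ConsistentMultiDecoderStep(vocab_size=1000)
    h = torch.zeros(3, 512)                    # 3 related sequences
    tokens = torch.zeros(3, dtype=torch.long)  # hypothetical <BOS> ids
    for _ in range(5):                         # 5 greedy decoding steps
        logits, h = step(tokens, h)
        tokens = logits.argmax(dim=-1)
```

Applied repeatedly like this, all N sequences advance in lockstep, so each decoder can influence the others at every step rather than only after independent decoding has finished.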


Related research

10/28/2018 · Middle-Out Decoding
Despite being virtually ubiquitous, sequence-to-sequence models are chal...

09/19/2019 · Adaptively Aligned Image Captioning via Adaptive Attention Time
Recent neural models for image captioning usually employs an encoder-dec...

11/13/2022 · A Scalable Graph Neural Network Decoder for Short Block Codes
In this work, we propose a novel decoding algorithm for short block code...

07/29/2022 · Graph Neural Networks for Channel Decoding
In this work, we propose a fully differentiable graph neural network (GN...

04/04/2020 · Graph Sequential Network for Reasoning over Sequences
Recently Graph Neural Network (GNN) has been applied successfully to var...

11/07/2018 · Blockwise Parallel Decoding for Deep Autoregressive Models
Deep autoregressive sequence-to-sequence models have demonstrated impres...

03/22/2020 · A Better Variant of Self-Critical Sequence Training
In this work, we present a simple yet better variant of Self-Critical Se...
