Order Matters: Sequence to sequence for sets

11/19/2015
by Oriol Vinyals, et al.

Sequences have become first-class citizens in supervised learning thanks to the resurgence of recurrent neural networks. Many complex tasks that require mapping from or to a sequence of observations can now be formulated with the sequence-to-sequence (seq2seq) framework, which employs the chain rule to efficiently represent the joint probability of sequences. In many cases, however, variable-sized inputs and/or outputs might not be naturally expressed as sequences. For instance, it is not clear how to input a set of numbers into a model where the task is to sort them; similarly, we do not know how to organize outputs when they correspond to random variables and the task is to model their unknown joint probability. In this paper, we first show using various examples that the order in which we organize input and/or output data matters significantly when learning an underlying model. We then discuss an extension of the seq2seq framework that goes beyond sequences and handles input sets in a principled way. In addition, we propose a loss which, by searching over possible orders during training, deals with the lack of structure of output sets. We show empirical evidence of our claims regarding ordering, and on the modifications to the seq2seq framework on benchmark language modeling and parsing tasks, as well as two artificial tasks -- sorting numbers and estimating the joint probability of unknown graphical models.
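The two ideas at the heart of the abstract -- the chain-rule factorization of a sequence's joint probability, and a loss that searches over output orders -- can be sketched in a few lines. This is a minimal toy, not the paper's implementation; `toy_logprob` is a hypothetical stand-in for a trained model's conditional log-probabilities (its values are illustrative and not normalized):

```python
import itertools
import math

def sequence_nll(logprob_fn, target):
    """Chain rule: -log p(y) = -sum_t log p(y_t | y_<t)."""
    return -sum(logprob_fn(target[:t], target[t]) for t in range(len(target)))

def set_nll(logprob_fn, target_set):
    """Order-search loss (sketch): score every permutation of the
    output set and keep the one the model likes best, so the model
    is never penalized for the arbitrary order we wrote the set in."""
    return min(sequence_nll(logprob_fn, list(perm))
               for perm in itertools.permutations(target_set))

# Hypothetical "model": prefers tokens that extend an ascending prefix.
def toy_logprob(prefix, token):
    ascending = all(p < token for p in prefix)
    return math.log(0.8 if ascending else 0.1)

ordered = sequence_nll(toy_logprob, [1, 2, 3])   # model-preferred order
searched = set_nll(toy_logprob, {3, 1, 2})       # order found by search
```

Here `searched` equals `ordered`: the search recovers the ascending order the toy model prefers, whereas feeding the set in a fixed arbitrary order (e.g. `[3, 1, 2]`) would incur a much larger loss. Brute-force enumeration of permutations is only viable for tiny sets; it stands in for the sampling/search strategies a practical training loop would use.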


