BERT4SO: Neural Sentence Ordering by Fine-tuning BERT

by Yutao Zhu et al.
Université de Montréal
Montreal Institute for Learning Algorithms (Mila)
NetEase, Inc.

Sentence ordering aims to arrange the sentences of a given text in the correct order. Recent work frames it as a ranking problem and applies deep neural networks to it. In this work, we propose a new method, named BERT4SO, that fine-tunes BERT for sentence ordering. We concatenate all sentences and compute their representations using multiple special tokens and carefully designed segment (interval) embeddings. Tokens across different sentences can attend to each other, which greatly enhances their interactions. We also propose a margin-based listwise ranking loss, based on ListMLE, to facilitate optimization. Experimental results on five benchmark datasets demonstrate the effectiveness of the proposed method.
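The listwise loss mentioned above can be illustrated with a small sketch. ListMLE maximizes the Plackett-Luce likelihood of the gold ordering: at each step, the gold sentence competes against all not-yet-placed sentences via a log-sum-exp over their scores. The sketch below is a pure-Python, hedged reading of the "margin-based" variant in which a margin is added to every competitor's score, so the gold sentence must win each step by at least that margin; the exact way the margin enters in BERT4SO may differ, and the function name and signature are our own.

```python
import math

def margin_listmle_loss(scores, gold_order, margin=0.1):
    """Margin-based ListMLE (Plackett-Luce) loss, illustrative sketch.

    scores:     predicted score for each sentence (list of floats)
    gold_order: sentence indices in the correct order
    margin:     added to each competitor's score at every step, so the
                gold sentence must beat them by at least `margin`
                (our assumption of how the margin is applied).
    """
    loss = 0.0
    remaining = list(gold_order)
    for gold in gold_order:
        # Log-sum-exp over the not-yet-placed sentences, with a margin
        # added to every competitor of the current gold sentence.
        terms = [scores[j] + (margin if j != gold else 0.0)
                 for j in remaining]
        m = max(terms)  # stabilize the log-sum-exp
        lse = m + math.log(sum(math.exp(t - m) for t in terms))
        loss += lse - scores[gold]
        remaining.remove(gold)
    return loss / len(gold_order)
```

In practice the scores would come from the per-sentence special-token representations produced by the fine-tuned BERT encoder; the loss itself is differentiable in the scores, so the same computation can be written with tensor operations for training.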




Topological Sort for Sentence Ordering

Sentence ordering is the task of arranging the sentences of a given text...

Sentence Embeddings using Supervised Contrastive Learning

Sentence embeddings encode sentences in fixed dense vectors and have pla...

Deep Attentive Ranking Networks for Learning to Order Sentences

We present an attention-based ranking framework for learning to order se...

Neural Sentence Ordering Based on Constraint Graphs

Sentence ordering aims at arranging a list of sentences in the correct o...

Graph-based Neural Sentence Ordering

Sentence ordering is to restore the original paragraph from a set of sen...

Explain to me like I am five – Sentence Simplification Using Transformers

Sentence simplification aims at making the structure of text easier to r...

The heads hypothesis: A unifying statistical approach towards understanding multi-headed attention in BERT

Multi-headed attention heads are a mainstay in transformer-based models....
