Using BERT Encoding and Sentence-Level Language Model for Sentence Ordering

08/24/2021
by Melika Golestani, et al.

Discovering the logical sequence of events is one of the cornerstones of Natural Language Understanding. One way to learn the sequence of events is to study the order of sentences in a coherent text, and sentence ordering can be applied in tasks such as retrieval-based Question Answering, document summarization, storytelling, text generation, and dialogue systems. Furthermore, by learning how to order a set of shuffled sentences, we can learn to model text coherence. Previous research has relied on RNN, LSTM, and BiLSTM architectures to learn text language models, but these networks perform poorly because they lack attention mechanisms. We propose an algorithm for sentence ordering in a corpus of short stories. Our method uses a language model based on Universal Transformers (UT) that captures dependencies between sentences through its attention mechanism, and it improves the previous state-of-the-art Perfect Match Ratio (PMR) score on the ROCStories dataset, a corpus of nearly 100K human-written five-sentence stories.

The proposed model has three components: a Sentence Encoder, a Language Model, and Sentence Arrangement with Brute Force Search. The first component generates sentence embeddings with an SBERT-WK pre-trained model fine-tuned on the ROCStories data. A Universal Transformer network then builds a sentence-level language model. At decoding time, the network generates a candidate embedding for the sentence that should follow the current one, and cosine similarity serves as the scoring function between this candidate embedding and the embeddings of the remaining sentences in the shuffled set. Finally, a brute-force search selects the ordering that maximizes the sum of similarities between consecutive sentence pairs.
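The encoding step can be sketched in a few lines. The snippet below is a minimal illustration that uses the off-the-shelf sentence-transformers library as a stand-in for the fine-tuned SBERT-WK model; the checkpoint name and the example story are assumptions for illustration, not the authors' setup:

    # Minimal sketch of the Sentence Encoder component. The paper fine-tunes
    # SBERT-WK on ROCStories; the off-the-shelf checkpoint below is only an
    # illustrative stand-in, not the authors' model.
    import numpy as np
    from sentence_transformers import SentenceTransformer

    encoder = SentenceTransformer("all-MiniLM-L6-v2")  # assumed stand-in checkpoint

    shuffled = [
        "She dried off and got dressed.",
        "Anna decided to take a shower.",
        "She turned on the water and stepped in.",
        "Afterwards she felt refreshed.",
        "She washed her hair twice.",
    ]
    embeddings = np.asarray(encoder.encode(shuffled))  # shape: (5, d)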

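Because each ROCStories text is five sentences long, the Sentence Arrangement component only has to score 5! = 120 permutations, so exhaustive search is cheap. Here is a minimal sketch, assuming a hypothetical predict_next callable that stands in for the trained Universal Transformer language model:

    import itertools
    import numpy as np

    def cosine(a, b):
        """Cosine similarity between two embedding vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def order_sentences(embeddings, predict_next):
        """Return the permutation of the shuffled sentences that maximizes
        the summed cosine similarity between the language model's predicted
        next-sentence embedding and the sentence actually placed next.

        embeddings   : (n, d) array of sentence embeddings
        predict_next : hypothetical callable mapping one embedding to the
                       language model's candidate embedding for the sentence
                       that should follow it
        """
        n = len(embeddings)
        best_perm, best_score = None, float("-inf")
        for perm in itertools.permutations(range(n)):  # 5! = 120 candidates
            score = sum(
                cosine(predict_next(embeddings[perm[i]]), embeddings[perm[i + 1]])
                for i in range(n - 1)
            )
            if score > best_score:
                best_perm, best_score = perm, score
        return best_perm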
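For reference, the Perfect Match Ratio reported above counts a story as correct only when every sentence lands in its gold position. A small sketch of the metric (the function name is assumed):

    def perfect_match_ratio(predicted_orders, gold_orders):
        """Perfect Match Ratio: the fraction of stories whose predicted
        sentence order matches the gold order in every position."""
        matches = sum(
            tuple(p) == tuple(g) for p, g in zip(predicted_orders, gold_orders)
        )
        return matches / len(gold_orders)

    # e.g. perfect_match_ratio([(0, 1, 2)], [(0, 1, 2)]) == 1.0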

