Discourse-Based Objectives for Fast Unsupervised Sentence Representation Learning

04/23/2017
by Yacine Jernite, et al.

This work presents a novel objective function for the unsupervised training of neural network sentence encoders. It exploits signals from paragraph-level discourse coherence to train these models to understand text. Our objective is purely discriminative, allowing us to train models many times faster than was possible under prior methods, and it yields models which perform well in extrinsic evaluations.
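The discriminative objectives are classification tasks derived from raw paragraph structure, such as deciding whether two adjacent sentences appear in their original order. As a minimal sketch, assuming a generic sentence encoder that produces fixed-size embeddings (the OrderClassifier name, the MLP architecture, and the hyperparameters below are illustrative, not the paper's exact setup), a binary sentence-order objective might look like this in PyTorch:

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class OrderClassifier(nn.Module):
        # Hypothetical discriminator for a binary sentence-order task:
        # given embeddings of two adjacent sentences, predict whether
        # they are shown in their original paragraph order.
        def __init__(self, dim):
            super().__init__()
            self.mlp = nn.Sequential(
                nn.Linear(2 * dim, dim),
                nn.ReLU(),
                nn.Linear(dim, 1),
            )

        def forward(self, a, b):
            # Concatenate the pair and score it; squeeze yields one logit per pair.
            return self.mlp(torch.cat([a, b], dim=-1)).squeeze(-1)

    # Toy usage: random vectors stand in for the sentence encoder's output.
    dim, batch = 128, 8
    clf = OrderClassifier(dim)
    a, b = torch.randn(batch, dim), torch.randn(batch, dim)
    labels = torch.randint(0, 2, (batch,)).float()  # 1 = original order, 0 = swapped
    loss = F.binary_cross_entropy_with_logits(clf(a, b), labels)
    loss.backward()

Because the loss is a simple classification over sentence pairs drawn from unlabeled paragraphs, training avoids the expensive word-by-word decoding required by generative reconstruction objectives, which is where the claimed speedup comes from.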


Related research

05/20/2020  Pretraining with Contrastive Sentence Objectives Improves Discourse Performance of Language Models
Recent models for unsupervised representation learning of text have empl...

03/28/2019  Mining Discourse Markers for Unsupervised Sentence Representation Learning
Current state of the art systems in NLP heavily rely on manually annotat...

08/31/2019  Evaluation Benchmarks and Learning Criteria for Discourse-Aware Sentence Representations
Prior work on pretrained sentence embeddings and benchmarks focus on the...

10/12/2017  DisSent: Sentence Representation Learning from Explicit Discourse Relations
Sentence vectors represent an appealing approach to meaning: learn an em...

05/17/2022  Efficient Unsupervised Sentence Compression by Fine-tuning Transformers with Reinforcement Learning
Sentence compression reduces the length of text by removing non-essentia...

10/13/2020  Corruption Is Not All Bad: Incorporating Discourse Structure into Pre-training via Corruption for Essay Scoring
Existing approaches for automated essay scoring and document representat...

05/25/2023  Extracting Text Representations for Terms and Phrases in Technical Domains
Extracting dense representations for terms and phrases is a task of grea...
