Universal Sentence Encoder

by Daniel Cer, et al.

We present models for encoding sentences into embedding vectors that specifically target transfer learning to other NLP tasks. The models are efficient and result in accurate performance on diverse transfer tasks. Two variants of the encoding models allow for trade-offs between accuracy and compute resources. For both variants, we investigate and report the relationship between model complexity, resource consumption, the availability of transfer task training data, and task performance. Comparisons are made with baselines that use word level transfer learning via pretrained word embeddings as well as baselines that do not use any transfer learning. We find that transfer learning using sentence embeddings tends to outperform word level transfer. With transfer learning via sentence embeddings, we observe surprisingly good performance with minimal amounts of supervised training data for a transfer task. We obtain encouraging results on Word Embedding Association Tests (WEAT) targeted at detecting model bias. Our pre-trained sentence encoding models are made freely available for download and on TF Hub.
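As a minimal sketch of how such fixed-length sentence embeddings are typically consumed downstream (e.g. for semantic similarity), the toy vectors below stand in for actual encoder output; in practice the embeddings come from the released TF Hub model, loaded via `tensorflow_hub`:

```python
import numpy as np

# Toy stand-ins for encoder output. In practice these vectors come from
# the TF Hub model, e.g.:
#   embed = hub.load("https://tfhub.dev/google/universal-sentence-encoder/4")
#   emb_a, emb_b = embed(["sentence one", "sentence two"]).numpy()
# which maps each input sentence to a fixed-length (512-dimensional) vector.
emb_a = np.array([0.1, 0.3, -0.2, 0.7])
emb_b = np.array([0.2, 0.25, -0.1, 0.6])

def cosine_similarity(u, v):
    """Standard cosine similarity between two embedding vectors."""
    return float(np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v)))

# Scores near 1.0 indicate semantically similar sentences.
sim = cosine_similarity(emb_a, emb_b)
print(sim)
```

Cosine similarity on these embeddings is the scoring used for semantic textual similarity style transfer tasks; classification transfer instead feeds the embedding into a small task-specific classifier.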



Related Papers

Supervised Learning of Universal Sentence Representations from Natural Language Inference Data

Actionable Phrase Detection using NLP

MITRE at SemEval-2016 Task 6: Transfer Learning for Stance Detection

Comparative Evaluation of Pretrained Transfer Learning Models on Automatic Short Answer Grading

Learning Robust, Transferable Sentence Representations for Text Classification

Towards Compute-Optimal Transfer Learning

SNU_IDS at SemEval-2018 Task 12: Sentence Encoder with Contextualized Vectors for Argument Reasoning Comprehension

Code Repositories


A supervised universal sentence encoder trained through multitask learning.

