A Survey on Contextual Embeddings

03/16/2020
by Qi Liu, et al.

Contextual embeddings, such as ELMo and BERT, move beyond global word representations like Word2Vec and achieve ground-breaking performance on a wide range of natural language processing tasks. Contextual embeddings assign each word a representation based on its context, thereby capturing uses of words across varied contexts and encoding knowledge that transfers across languages. In this survey, we review existing contextual embedding models, cross-lingual polyglot pre-training, the application of contextual embeddings in downstream tasks, model compression, and model analyses.
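To make the distinction concrete, the short sketch below (an illustration assuming the Hugging Face transformers library and a pretrained bert-base-uncased checkpoint; it is not code from the survey itself) extracts BERT's vectors for the word "bank" in two different sentences. A static embedding such as Word2Vec would assign the same vector to both occurrences, whereas the contextual vectors differ.

import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative setup: load a pretrained BERT encoder and its tokenizer.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")
model.eval()

sentences = [
    "She sat on the river bank.",
    "He deposited the check at the bank.",
]

bank_id = tokenizer.convert_tokens_to_ids("bank")
vectors = []
with torch.no_grad():
    for text in sentences:
        inputs = tokenizer(text, return_tensors="pt")
        hidden = model(**inputs).last_hidden_state[0]        # (seq_len, 768)
        position = inputs.input_ids[0].tolist().index(bank_id)
        vectors.append(hidden[position])                     # vector for "bank"

# A static embedding would yield cosine similarity 1.0 for the two occurrences;
# the contextual vectors differ because the surrounding words differ.
similarity = torch.cosine_similarity(vectors[0], vectors[1], dim=0).item()
print(f"cosine similarity between the two 'bank' vectors: {similarity:.2f}")

Running the sketch prints a similarity below 1.0, which is the sense in which contextual models capture distinct uses of the same word across contexts.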

Related research

A Survey Of Cross-lingual Word Embedding Models (06/15/2017)
Cross-lingual representations of words enable us to reason about word me...

MICE: Mining Idioms with Contextual Embeddings (08/13/2020)
Idiomatic expressions can be problematic for natural language processing...

BERT as a Teacher: Contextual Embeddings for Sequence-Level Reward (03/05/2020)
Measuring the quality of a generated sequence against a set of reference...

Sentence Compression as Deletion with Contextual Embeddings (06/05/2020)
Sentence compression is the task of creating a shorter version of an inp...

Will it Unblend? (09/18/2020)
Natural language processing systems often struggle with out-of-vocabular...

Context is Key: Grammatical Error Detection with Contextual Word Representations (06/15/2019)
Grammatical error detection (GED) in non-native writing requires systems...
