Attentive Mimicking: Better Word Embeddings by Attending to Informative Contexts

04/02/2019
by Timo Schick, et al.

Learning high-quality embeddings for rare words is a hard problem because of sparse context information. Mimicking (Pinter et al., 2017) has been proposed as a solution: given embeddings learned by a standard algorithm, a model is first trained to reproduce embeddings of frequent words from their surface form and then used to compute embeddings for rare words. In this paper, we introduce attentive mimicking: the mimicking model is given access not only to a word's surface form, but also to all available contexts and learns to attend to the most informative and reliable contexts for computing an embedding. In an evaluation on four tasks, we show that attentive mimicking outperforms previous work for both rare and medium-frequency words. Thus, compared to previous work, attentive mimicking improves embeddings for a much larger part of the vocabulary, including the medium-frequency range.
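To make the mechanism concrete, below is a minimal sketch of the idea the abstract describes: a surface-form embedding is combined with an attention-weighted average of context embeddings, and the combined vector is trained to mimic the embedding a standard algorithm learned for a frequent word. The linear attention scorer, the sigmoid gate, and all names (AttentiveMimicker, form_emb, ctx_embs) are illustrative simplifications, not the paper's exact architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class AttentiveMimicker(nn.Module):
    def __init__(self, dim: int):
        super().__init__()
        # Scores how informative/reliable each context is (simplified stand-in
        # for the paper's attention mechanism).
        self.score = nn.Linear(dim, 1)
        # Gate balancing surface-form evidence against context evidence.
        self.gate = nn.Linear(2 * dim, 1)

    def forward(self, form_emb: torch.Tensor, ctx_embs: torch.Tensor) -> torch.Tensor:
        # form_emb: (dim,)        surface-form embedding, e.g. from character n-grams
        # ctx_embs: (n_ctx, dim)  one vector per available context, e.g. the mean
        #                         of pretrained embeddings of the surrounding words
        attn = F.softmax(self.score(ctx_embs).squeeze(-1), dim=0)  # (n_ctx,)
        ctx = attn @ ctx_embs                                      # attention-weighted average
        alpha = torch.sigmoid(self.gate(torch.cat([form_emb, ctx])))
        return alpha * form_emb + (1 - alpha) * ctx

# Training mimics the embeddings of frequent words: the predicted vector is
# pushed toward the word's gold embedding from the standard algorithm.
if __name__ == "__main__":
    dim, n_ctx = 300, 5
    model = AttentiveMimicker(dim)
    form = torch.randn(dim)
    contexts = torch.randn(n_ctx, dim)
    gold = torch.randn(dim)  # e.g. a word2vec embedding of a frequent word
    loss = F.mse_loss(model(form, contexts), gold)
    loss.backward()

At test time the same network computes vectors for rare words from whatever contexts are available, which is what lets attention favor the most informative and reliable ones.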



Related research

10/16/2019
BERTRAM: Improved Word Embeddings Have Big Impact on Contextualized Model Performance
Pretraining deep contextualized representations using an unsupervised la...

04/06/2016
An Ensemble Method to Produce High-Quality Word Embeddings
A currently successful approach to computational semantics is to represe...

06/01/2017
Learning to Compute Word Embeddings On the Fly
Words in natural language follow a Zipfian distribution whereby some wor...

09/07/2021
Rare Words Degenerate All Words
Despite advances in neural network language models, the representation de...

04/14/2019
Rare Words: A Major Problem for Contextualized Embeddings And How to Fix it by Attentive Mimicking
Pretraining deep neural network architectures with a language modeling o...

08/28/2018
Card-660: Cambridge Rare Word Dataset - a Reliable Benchmark for Infrequent Word Representation Models
Rare word representation has recently enjoyed a surge of interest, owing...

10/20/2018
pair2vec: Compositional Word-Pair Embeddings for Cross-Sentence Inference
Reasoning about implied relationships (e.g. paraphrastic, common sense, ...
