Context-Attentive Embeddings for Improved Sentence Representations

04/21/2018
by Douwe Kiela et al.

While one of the first steps in many NLP systems is selecting which embeddings to use, we argue that such a step is better left for neural networks to figure out by themselves. To that end, we introduce a novel, straightforward yet highly effective method for combining multiple types of word embeddings in a single model, leading to state-of-the-art performance within the same model class on a variety of tasks. We subsequently show how the technique can be used to shed new light on the usage of word embeddings in NLP systems.
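The core mechanism is straightforward to sketch: each embedding type is projected into a shared space, a learned scorer assigns each projection a per-token attention weight, and the projections are combined as a weighted sum. Below is a minimal PyTorch sketch of this kind of attentive meta-embedding; the module and variable names (AttentiveMetaEmbedding, proj_dim, scorer) are illustrative assumptions, not the authors' implementation.

```python
import torch
import torch.nn as nn

class AttentiveMetaEmbedding(nn.Module):
    """Hypothetical sketch: attention-weighted mix of several embedding types."""

    def __init__(self, embed_dims, proj_dim):
        super().__init__()
        # Project each embedding type into a shared space so they can be mixed.
        self.projections = nn.ModuleList(nn.Linear(d, proj_dim) for d in embed_dims)
        # One scalar score per projected token embedding; softmax over the
        # embedding types turns these scores into attention weights.
        self.scorer = nn.Linear(proj_dim, 1)

    def forward(self, embeddings):
        # embeddings: list of tensors, each of shape (batch, seq_len, embed_dims[i])
        projected = torch.stack(
            [proj(e) for proj, e in zip(self.projections, embeddings)], dim=2
        )  # (batch, seq_len, n_types, proj_dim)
        scores = self.scorer(projected).squeeze(-1)  # (batch, seq_len, n_types)
        weights = torch.softmax(scores, dim=-1)      # attend over embedding types
        # Weighted sum over embedding types: one combined vector per token.
        return (weights.unsqueeze(-1) * projected).sum(dim=2)

# Hypothetical usage: mix a 300-d and a 200-d embedding into 256-d vectors.
mixer = AttentiveMetaEmbedding(embed_dims=[300, 200], proj_dim=256)
e1 = torch.randn(4, 10, 300)  # stand-in for e.g. GloVe lookups
e2 = torch.randn(4, 10, 200)  # stand-in for another pretrained embedding
combined = mixer([e1, e2])
print(combined.shape)  # torch.Size([4, 10, 256])
```

Because the weights are computed per token, such a model can lean on one embedding type for some words and another for the rest, which is what makes the learned attention weights a useful lens on how different embeddings get used.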

Related research

10/06/2016 · Neural-based Noise Filtering from Word Embeddings
Word embeddings have been demonstrated to benefit NLP tasks impressively...

04/16/2021 · Word2rate: training and evaluating multiple word embeddings as statistical transitions
Using pretrained word embeddings has been shown to be a very effective w...

05/27/2022 · SemEval-2022 Task 1: CODWOE – Comparing Dictionaries and Word Embeddings
Word embeddings have advanced the state of the art in NLP across numerou...

05/23/2019 · Misspelling Oblivious Word Embeddings
In this paper we present a method to learn word embeddings that are resi...

09/03/2019 · Interpretable Word Embeddings via Informative Priors
Word embeddings have demonstrated strong performance on NLP tasks. Howev...

11/29/2016 · Geometry of Compositionality
This paper proposes a simple test for compositionality (i.e., literal us...

05/29/2019 · ATTACK2VEC: Leveraging Temporal Word Embeddings to Understand the Evolution of Cyberattacks
Despite the fact that cyberattacks are constantly growing in complexity,...
