Putting words in context: LSTM language models and lexical ambiguity

06/12/2019
by Laura Aina, et al.

In neural network models of language, words are commonly represented using context-invariant representations (word embeddings), which are then put in context in the hidden layers. Since words are often ambiguous, representing the contextually relevant information is not trivial. We investigate how an LSTM language model deals with lexical ambiguity in English, designing a method to probe its hidden representations for lexical and contextual information about words. We find that both types of information are represented to a large extent, but also that there is room for improvement in how contextual information is represented.
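The probing approach described above can be illustrated with a minimal sketch: train a simple diagnostic classifier to predict a lexical label from a model's hidden states, and read off how well that information is encoded. The setup below is an illustrative assumption, not the authors' exact method; it uses synthetic stand-ins for LSTM hidden states and a logistic-regression probe trained by gradient descent.

```python
# Minimal probing-classifier sketch (illustrative assumptions throughout):
# hidden states are simulated rather than taken from a real LSTM, and the
# "lexical information" to recover is a binary word label.
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for LSTM hidden states: 200 vectors of dimension 16, where the
# lexical label shifts the mean of the first 4 dimensions (a toy signal).
n, d = 200, 16
labels = rng.integers(0, 2, size=n)
hidden = rng.normal(size=(n, d))
hidden[:, :4] += 1.5 * labels[:, None]  # inject label-correlated signal

# Logistic-regression probe trained by plain gradient descent.
w = np.zeros(d)
b = 0.0
lr = 0.1
for _ in range(500):
    p = 1.0 / (1.0 + np.exp(-(hidden @ w + b)))  # predicted probabilities
    w -= lr * (hidden.T @ (p - labels)) / n      # gradient step on weights
    b -= lr * np.mean(p - labels)                # gradient step on bias

# Probe accuracy well above the 0.5 chance level indicates the label is
# linearly decodable from the hidden states.
accuracy = np.mean(((hidden @ w + b) > 0) == labels)
print(f"probe accuracy: {accuracy:.2f}")
```

In the paper's setting, the probe would instead be trained on real hidden states from the language model, with targets such as word identity (lexical information) or contextually disambiguated meaning (contextual information); comparing probe accuracies then shows to what extent each type of information is represented.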


Related research:

research · 10/05/2020
Speakers Fill Lexical Semantic Gaps with Context
Lexical ambiguity is widespread in language, allowing for the reuse of e...

research · 10/13/2016
Compressing Neural Language Models by Sparse Word Representations
Neural networks are among the state-of-the-art techniques for language m...

research · 11/03/2022
Contextual information integration for stance detection via cross-attention
Stance detection deals with the identification of an author's stance tow...

research · 03/11/2022
When classifying grammatical role, BERT doesn't care about word order... except when it matters
Because meaning can often be inferred from lexical semantics alone, word...

research · 03/06/2020
NYTWIT: A Dataset of Novel Words in the New York Times
We present the New York Times Word Innovation Types dataset, or NYTWIT, ...

research · 03/30/2021
Representing ELMo embeddings as two-dimensional text online
We describe a new addition to the WebVectors toolkit which is used to se...

research · 03/10/2022
Contextualized Sensorimotor Norms: multi-dimensional measures of sensorimotor strength for ambiguous English words, in context
Most large language models are trained on linguistic input alone, yet hu...
