Ontology-Aware Token Embeddings for Prepositional Phrase Attachment

05/08/2017
by Pradeep Dasigi, et al.

Type-level word embeddings use the same set of parameters to represent all instances of a word regardless of its context, ignoring the inherent lexical ambiguity in language. Instead, we embed semantic concepts (or synsets) as defined in WordNet and represent a word token in a particular context by estimating a distribution over relevant semantic concepts. We use the new, context-sensitive embeddings in a model for predicting prepositional phrase (PP) attachments and jointly learn the concept embeddings and model parameters. We show that using context-sensitive embeddings improves the accuracy of the PP attachment model by 5.4% absolute points, which amounts to a 34.4% relative reduction in errors.
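To make the idea concrete, the sketch below shows one way such a context-sensitive token embedding can be computed: a word's candidate concepts are taken from its WordNet synsets (plus their direct hypernyms), and the token is represented as an attention-weighted average of learned concept embeddings, with the attention conditioned on a context vector. This is a minimal illustration using NLTK and PyTorch, not the authors' implementation; the class name OntoTokenEmbedder, the linear context projection, and the particular choice of concept set are assumptions made for the example.

# Minimal sketch (assumed design, not the paper's code): a token embedding computed as a
# context-dependent mixture over WordNet concept (synset) embeddings.
import torch
import torch.nn as nn
import torch.nn.functional as F
from nltk.corpus import wordnet as wn   # requires nltk.download('wordnet') once


class OntoTokenEmbedder(nn.Module):
    """Embeds a word token as an attention-weighted sum of its WordNet concept embeddings."""

    def __init__(self, concept_vocab, dim=100):
        super().__init__()
        self.concept_ids = {name: i for i, name in enumerate(concept_vocab)}
        self.concept_emb = nn.Embedding(len(concept_vocab), dim)  # learned jointly with the task model
        self.ctx_proj = nn.Linear(dim, dim)                       # projects the context vector for scoring

    def candidate_concepts(self, word):
        """Synsets of the word plus their direct hypernyms (one illustrative choice of concept set)."""
        synsets = wn.synsets(word)
        concepts = set(synsets)
        for s in synsets:
            concepts.update(s.hypernyms())
        return [c.name() for c in concepts if c.name() in self.concept_ids]

    def forward(self, word, context_vec):
        names = self.candidate_concepts(word)
        if not names:  # out-of-ontology word: fall back to a zero vector in this sketch
            return torch.zeros(self.concept_emb.embedding_dim)
        idx = torch.tensor([self.concept_ids[n] for n in names])
        emb = self.concept_emb(idx)                        # (num_concepts, dim)
        scores = emb @ self.ctx_proj(context_vec)          # (num_concepts,)
        attn = F.softmax(scores, dim=0)                    # distribution over relevant concepts
        return attn @ emb                                  # expected concept embedding = token embedding


if __name__ == "__main__":
    vocab = [s.name() for s in wn.all_synsets()]           # all WordNet synsets as the concept vocabulary
    embedder = OntoTokenEmbedder(vocab, dim=50)
    ctx = torch.randn(50)                                  # stand-in for a real context encoding
    print(embedder("bank", ctx).shape)                     # torch.Size([50])

In the paper's setting, the context vector and the attention parameters would be trained end to end with the PP attachment model, so the distribution over concepts is shaped by the downstream task rather than fixed in advance.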

Related research

Supervised Phrase-boundary Embeddings (02/15/2020)
We propose a new word embedding model, called SPhrase, that incorporates...

Learning to Embed Words in Context for Syntactic Tasks (06/09/2017)
We present models for embedding words in the context of surrounding word...

Discovering Differences in the Representation of People using Contextualized Semantic Axes (10/21/2022)
A common paradigm for identifying semantic differences across social and...

Unsupervised Disambiguation of Syncretism in Inflected Lexicons (06/10/2018)
Lexical ambiguity makes it difficult to compute various useful statistic...

UiO-UvA at SemEval-2020 Task 1: Contextualised Embeddings for Lexical Semantic Change Detection (04/30/2020)
We apply contextualised word embeddings to lexical semantic change detec...

Vec2Gloss: definition modeling leveraging contextualized vectors with Wordnet gloss (05/29/2023)
Contextualized embeddings are proven to be powerful tools in multiple NL...

Geometry of Compositionality (11/29/2016)
This paper proposes a simple test for compositionality (i.e., literal us...
