Supervised Phrase-boundary Embeddings

02/15/2020
by Manni Singh, et al.

We propose a new word embedding model, called SPhrase, that incorporates supervised phrase information. Our method modifies traditional word embeddings by ensuring that all target words in a phrase have exactly the same context. We demonstrate that including this information within a context window produces superior embeddings for both intrinsic evaluation tasks and downstream extrinsic tasks.
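The core idea — every target word inside an annotated phrase shares exactly the same context window, drawn from around the phrase rather than around the individual word — can be illustrated with a minimal sketch. This is a hypothetical reading of the abstract, not the authors' code; the function name, span format, and the choice to exclude phrase-internal tokens from the context are all assumptions.

```python
def sphrase_pairs(tokens, phrases, window=2):
    """Yield (target, context_word) skip-gram-style training pairs.

    tokens:  list of words, e.g. ["the", "new", "york", "times", "reported"]
    phrases: list of (start, end) index spans (end exclusive) marking
             annotated phrases; tokens not covered by any span are
             treated as singleton phrases.
    """
    # Cover unannotated tokens with singleton spans, keeping sentence order.
    covered = set()
    spans = []
    for s, e in sorted(phrases):
        spans.append((s, e))
        covered.update(range(s, e))
    for i in range(len(tokens)):
        if i not in covered:
            spans.append((i, i + 1))
    spans.sort()

    for s, e in spans:
        # The context is drawn from tokens flanking the whole phrase span,
        # so every word in the phrase sees an identical context.
        left = tokens[max(0, s - window):s]
        right = tokens[e:e + window]
        context = left + right
        for i in range(s, e):          # each target word in the phrase ...
            for c in context:          # ... is paired with the shared context
                yield tokens[i], c

# "new york times" annotated as a single phrase:
pairs = list(sphrase_pairs(
    ["the", "new", "york", "times", "reported", "it"],
    phrases=[(1, 4)],
    window=1,
))
```

With this annotation, "new", "york", and "times" each receive exactly the pairs ("the", "reported") as context, whereas a plain skip-gram window would give "york" the phrase-internal neighbours "new" and "times" instead.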

