Exploring phrase-compositionality in skip-gram models

07/21/2016
by Xiaochang Peng, et al.

In this paper, we introduce a variation of the skip-gram model that jointly learns distributed word vector representations and a way of composing them into phrase embeddings. In particular, we propose a learning procedure that incorporates a phrase-compositionality function capturing how phrase vectors should be composed from their component word vectors. Our experiments show that the proposed joint models improve performance on word and phrase similarity tasks as well as on syntactic tasks such as dependency parsing.
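To make the idea concrete, below is a minimal sketch of what such a joint model could look like, assuming skip-gram with negative sampling and a simple additive (mean) composition function. The composition function, vocabulary size, dimensions, and learning rate here are illustrative assumptions, not the paper's exact formulation.

import numpy as np

# Illustrative sizes; the paper's actual settings are not given in the abstract.
rng = np.random.default_rng(0)
VOCAB, DIM = 1000, 50
W_in = rng.normal(scale=0.1, size=(VOCAB, DIM))   # word (input) vectors
W_out = rng.normal(scale=0.1, size=(VOCAB, DIM))  # context (output) vectors

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def compose(word_ids):
    # Assumed composition function: the phrase vector is the mean of its
    # component word vectors. The paper learns its own composition; this
    # additive choice is only a common baseline.
    return W_in[word_ids].mean(axis=0)

def sgns_step(phrase_ids, context_id, neg_ids, lr=0.025):
    # One skip-gram negative-sampling update for a (phrase, context) pair.
    # Because the phrase vector is composed from word vectors, the gradient
    # flows back to every component word, training words and phrases jointly.
    v = compose(phrase_ids)
    grad_v = np.zeros(DIM)
    for cid, label in [(context_id, 1.0)] + [(nid, 0.0) for nid in neg_ids]:
        u = W_out[cid]
        g = sigmoid(v @ u) - label       # gradient of the SGNS loss term
        grad_v += g * u
        W_out[cid] -= lr * g * v
    # Mean composition gives each component word an equal share of the gradient.
    W_in[phrase_ids] -= lr * grad_v / len(phrase_ids)

# Example: update for a two-word phrase predicting one context word
# against three sampled negatives (all ids here are arbitrary).
sgns_step(phrase_ids=[11, 42], context_id=7, neg_ids=[3, 99, 512])

Note that with mean composition the update reduces to the standard skip-gram update when the "phrase" is a single word, which is one reason additive composition is a natural baseline for joint word-and-phrase training.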

Related research

06/18/2015
"The Sum of Its Parts": Joint Learning of Word and Phrase Representations with Autoencoders
Recently, there has been a lot of effort to represent words in continuou...

10/21/2022
Describing Sets of Images with Textual-PCA
We seek to semantically describe a set of images, capturing both the att...

02/15/2020
Supervised Phrase-boundary Embeddings
We propose a new word embedding model, called SPhrase, that incorporates...

10/16/2013
Distributed Representations of Words and Phrases and their Compositionality
The recently introduced continuous Skip-gram model is an efficient metho...

09/13/2021
Phrase-BERT: Improved Phrase Embeddings from BERT with an Application to Corpus Exploration
Phrase representations derived from BERT often do not exhibit complex ph...

03/28/2018
Handling Verb Phrase Anaphora with Dependent Types and Events
This paper studies how dependent typed events can be used to treat verb ...

10/08/2020
Assessing Phrasal Representation and Composition in Transformers
Deep transformer models have pushed performance on NLP tasks to new limi...
