Syntree2Vec - An algorithm to augment syntactic hierarchy into word embeddings

08/14/2018
by Shubham Bhardwaj, et al.

Word embeddings aim to map the sense of words into a lower-dimensional vector space so that we can reason over them. Training embeddings on domain-specific data helps express concepts more relevant to their use case, but it comes at a cost in accuracy when data is scarce. Our effort is to minimise this loss by infusing syntactic knowledge into the embeddings. We propose a graph-based embedding algorithm inspired by node2vec. Experimental results show that our algorithm improves syntactic strength and gives robust performance on meagre data.
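To give a flavour of the approach, the following is a minimal, hypothetical sketch of a node2vec-style pipeline over a syntax-derived graph: edges come from dependency-parse relations, and biased second-order random walks produce node sequences that could then be fed to any skip-gram trainer. The graph, parameter values, and function names are illustrative assumptions, not the paper's actual implementation.

```python
import random
from collections import defaultdict

def build_graph(edges):
    """Adjacency list from (head, dependent) dependency-parse edges.

    The syntax tree is treated as undirected, so a walk can move both
    up and down the syntactic hierarchy.
    """
    graph = defaultdict(list)
    for head, dep in edges:
        graph[head].append(dep)
        graph[dep].append(head)
    return graph

def biased_walk(graph, start, length, p=1.0, q=0.5, rng=random):
    """node2vec-style second-order walk.

    p is the return parameter (penalises going back to the previous
    node), q is the in-out parameter (q < 1 encourages exploring away
    from the previous node's neighbourhood).
    """
    walk = [start]
    while len(walk) < length:
        cur = walk[-1]
        neighbors = graph[cur]
        if not neighbors:
            break
        if len(walk) == 1:
            walk.append(rng.choice(neighbors))
            continue
        prev = walk[-2]
        weights = []
        for nbr in neighbors:
            if nbr == prev:
                weights.append(1.0 / p)   # step back to the previous node
            elif nbr in graph[prev]:
                weights.append(1.0)       # stay near the previous node
            else:
                weights.append(1.0 / q)   # move outward in the tree
        walk.append(rng.choices(neighbors, weights=weights, k=1)[0])
    return walk

# Toy dependency edges for "the cat chased the mouse" (illustrative only).
edges = [("chased", "cat"), ("chased", "mouse"),
         ("cat", "the"), ("mouse", "the")]
graph = build_graph(edges)
walks = [biased_walk(graph, node, length=5)
         for node in graph for _ in range(10)]
```

The resulting `walks` play the role that linear context windows play in word2vec: co-occurrence within a walk now reflects syntactic proximity rather than surface adjacency, which is one plausible way to "infuse" syntactic hierarchy into the learned vectors.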


