Synonym Detection Using Syntactic Dependency And Neural Embeddings

09/30/2022
by Dongqiang Yang, et al.

Recent advances in vector space models have significantly improved NLP applications such as neural machine translation and natural language generation. Although word co-occurrences in context are widely used in both counting-based and prediction-based distributional models, the role of syntactic dependencies in deriving distributional semantics has not yet been thoroughly investigated. By comparing various vector space models on synonym detection in the TOEFL test, we systematically study how salient syntactic dependencies are in accounting for distributional similarity. We separate syntactic dependencies into groups according to their grammatical roles and then use context counting to construct the corresponding raw and SVD-compressed co-occurrence matrices. Moreover, using the same training hyperparameters and corpora, we evaluate typical neural embeddings in the same setting. We further examine the effectiveness of injecting human-compiled semantic knowledge into neural embeddings for computing distributional similarity. Our results show that syntactically conditioned contexts interpret lexical semantics better than unconditioned ones, and that retrofitting neural embeddings with semantic knowledge significantly improves synonym detection.
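The counting-based pipeline described above (build a word-by-context co-occurrence matrix, compress it with SVD, then pick the candidate with the highest cosine similarity as in the TOEFL test) can be sketched as follows. The toy vocabulary, counts, and test item are illustrative assumptions, not the paper's actual data or corpora:

```python
import numpy as np

# Toy vocabulary and word-by-context co-occurrence counts
# (rows are target words; the counts are invented for illustration).
words = ["car", "automobile", "banana", "road", "fruit"]
counts = np.array([
    [10, 2, 0, 8, 0],
    [9, 1, 0, 7, 0],
    [0, 0, 9, 0, 8],
    [7, 6, 0, 1, 0],
    [0, 0, 8, 0, 7],
], dtype=float)

# SVD compression: keep the top-k latent dimensions (as in LSA-style models).
U, S, Vt = np.linalg.svd(counts, full_matrices=False)
k = 2
vectors = U[:, :k] * S[:k]  # reduced word vectors

def cosine(a, b):
    """Cosine similarity between two vectors."""
    return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

def pick_synonym(target, candidates):
    """TOEFL-style item: choose the candidate closest to the target."""
    t = vectors[words.index(target)]
    return max(candidates, key=lambda c: cosine(t, vectors[words.index(c)]))

print(pick_synonym("car", ["automobile", "banana", "fruit"]))  # automobile
```

In the paper's syntactically conditioned variants, the context columns would be restricted to words standing in particular dependency relations to the target (e.g. objects of the same verb) rather than all co-occurring words in a window.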


