Lexical semantics enhanced neural word embeddings

10/03/2022
by Dongqiang Yang, et al.

Recent breakthroughs in natural language processing have benefited dramatically from neural language models, through which distributional semantics can leverage neural data representations to facilitate downstream applications. Because neural embeddings are trained by predicting context from word co-occurrences to yield dense vectors, they are inevitably prone to capturing semantic association rather than semantic similarity. To improve how vector space models derive semantic similarity, we post-process neural word embeddings with deep metric learning, injecting lexical-semantic relations, including syn/antonymy and hypo/hypernymy, into the distributional space. We introduce hierarchy-fitting, a novel semantic specialization approach that models the nuances of semantic similarity inherently encoded in IS-A hierarchies. Hierarchy-fitting attains state-of-the-art results on the common- and rare-word benchmark datasets for deriving semantic similarity from neural word embeddings. It also incorporates an asymmetric distance function to explicitly specialize the directionality of hypernymy, significantly improving vanilla embeddings on multiple evaluation tasks for detecting hypernymy and its directionality without degrading semantic similarity judgement. These results demonstrate the efficacy of hierarchy-fitting in specializing neural embeddings with semantic relations in late fusion, and suggest its applicability to aggregating heterogeneous data and various knowledge resources for learning multimodal semantic spaces.
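The specialization idea described above (post-processing pretrained vectors so that synonyms move closer and antonyms move apart in a metric space, plus an asymmetric score for hypernymy directionality) can be sketched as follows. This is an illustrative toy with random stand-in vectors and hand-picked update rules and hyperparameters, not the paper's hierarchy-fitting implementation; the vocabulary, learning rate, margin, and norm-based hypernymy heuristic are all assumptions made for the sketch.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "pretrained" embeddings: random stand-ins for real neural vectors.
vocab = ["good", "great", "bad", "animal", "dog"]
E = {w: rng.normal(size=8) for w in vocab}

def cos(u, v):
    """Cosine similarity between two vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def specialize(emb, syn_pairs, ant_pairs, lr=0.1, margin=0.4, steps=100):
    """Metric-learning-style post-processing (a sketch, not hierarchy-fitting):
    pull synonym vectors together; push antonym vectors apart in cosine space."""
    emb = {w: v.copy() for w, v in emb.items()}
    for _ in range(steps):
        for a, b in syn_pairs:                  # attract: step toward each other
            emb[a] += lr * (emb[b] - emb[a])
            emb[b] += lr * (emb[a] - emb[b])
        for a, b in ant_pairs:                  # repel only while too similar
            if cos(emb[a], emb[b]) > margin:
                emb[a] -= lr * emb[b] / np.linalg.norm(emb[b])
                emb[b] -= lr * emb[a] / np.linalg.norm(emb[a])
    return emb

def hypernymy_score(u, v):
    """Asymmetric score for 'u IS-A v': a common norm-based heuristic (assumed
    here for illustration) expects the hypernym to carry the larger norm."""
    return cos(u, v) + (np.linalg.norm(v) - np.linalg.norm(u))

E2 = specialize(E, syn_pairs=[("good", "great")], ant_pairs=[("good", "bad")])
print(cos(E["good"], E["great"]), "->", cos(E2["good"], E2["great"]))
```

Because `hypernymy_score(u, v)` and `hypernymy_score(v, u)` differ by twice the norm gap, the function can express which word in a pair is the more general term, which is the role the paper's asymmetric distance function plays; the attraction/repulsion loop is the late-fusion step that injects relation knowledge into an already-trained distributional space.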


Related research

Enhancing Word Embeddings with Knowledge Extracted from Lexical Resources (05/20/2020)
In this work, we present an effective method for semantic specialization...

Synonym Detection Using Syntactic Dependency And Neural Embeddings (09/30/2022)
Recent advances on the Vector Space Model have significantly improved so...

Affordance Extraction and Inference based on Semantic Role Labeling (09/03/2018)
Common-sense reasoning is becoming increasingly important for the advanc...

Evaluation of taxonomic and neural embedding methods for calculating semantic similarity (09/30/2022)
Modelling semantic similarity plays a fundamental role in lexical semant...

Mapping distributional to model-theoretic semantic spaces: a baseline (07/11/2016)
Word embeddings have been shown to be useful across state-of-the-art sys...

EDS-MEMBED: Multi-sense embeddings based on enhanced distributional semantic structures via a graph walk over word senses (02/27/2021)
Several language applications often require word semantics as a core par...

A comprehensive comparative evaluation and analysis of Distributional Semantic Models (05/20/2021)
Distributional semantics has deeply changed in the last decades. First, ...
