Concept Embedding for Information Retrieval

02/01/2020
by Karam Abdulahhad, et al.

Concepts are used to solve the term-mismatch problem, but doing so requires an effective similarity measure between concepts. Word embedding offers a promising solution. In this study, we present three approaches to building concept vectors from word vectors, and we use a vector-based measure to estimate inter-concept similarity. Our experiments show promising results. Furthermore, words and concepts become directly comparable, which could be used to improve the conceptual indexing process.
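The abstract does not detail the three construction approaches, so the sketch below is only a rough illustration of the general idea: derive a concept vector from the word vectors of the terms in the concept's label (here, by simple averaging, which is an assumption rather than the authors' method) and compare concepts with a vector-based measure such as cosine similarity. The names `concept_vector` and `concept_similarity` and the toy embeddings are hypothetical.

```python
import numpy as np

def concept_vector(label_words, word_vectors):
    """Build a concept vector by averaging the available word vectors
    of the concept's label words (one plausible construction, assumed here)."""
    vecs = [word_vectors[w] for w in label_words if w in word_vectors]
    if not vecs:
        raise ValueError("no label word has an embedding")
    return np.mean(vecs, axis=0)

def concept_similarity(c1, c2):
    """Vector-based inter-concept similarity: cosine of the two concept vectors."""
    return float(np.dot(c1, c2) / (np.linalg.norm(c1) * np.linalg.norm(c2)))

# Toy word embeddings (random here; in practice, pretrained word vectors).
rng = np.random.default_rng(0)
word_vectors = {w: rng.normal(size=50)
                for w in ["heart", "attack", "myocardial", "infarction"]}

v1 = concept_vector(["heart", "attack"], word_vectors)
v2 = concept_vector(["myocardial", "infarction"], word_vectors)
print(concept_similarity(v1, v2))
```

Because concept vectors live in the same space as word vectors, the same cosine measure can also compare a word directly against a concept, which is what makes words and concepts comparable for conceptual indexing.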


Related research

02/12/2017 · Vector Embedding of Wikipedia Concepts and Entities
Using deep learning for different machine learning tasks such as image c...

01/11/2018 · Enhancing Translation Language Models with Word Embedding for Information Retrieval
In this paper, we explore the usage of Word Embedding semantic resources...

12/22/2017 · Novel Ranking-Based Lexical Similarity Measure for Word Embedding
Distributional semantics models derive word space from linguistic items ...

01/31/2019 · Learning Taxonomies of Concepts and not Words using Contextualized Word Representations: A Position Paper
Taxonomies are semantic hierarchies of concepts. One limitation of curre...

06/21/2019 · Learning as the Unsupervised Alignment of Conceptual Systems
Concept induction requires the extraction and naming of concepts from no...

12/17/2021 · Expedition: A System for the Unsupervised Learning of a Hierarchy of Concepts
We present a system for bottom-up cumulative learning of myriad concepts...

05/15/2023 · A Crosslingual Investigation of Conceptualization in 1335 Languages
Languages differ in how they divide up the world into concepts and words...
